Transparency is a Neoliberal Remedy for Technology Problems

Bianca Wylie
7 min read · Jun 14, 2024


or: A Deputation on the New Toronto Police Service (TPS) Service Procedure 17–14 Artificial Intelligence Technology

On May 31st, I did my first deputation to the Toronto Police Services Board. I was notified of the meeting by Ushnish Sengupta (thank you again Ushnish), as he and many others have been tracking and intervening on the Toronto Police Service's approach to the use of technology for years now. Thomas Linder did a great deputation, which is available in this video of the meeting, and the question session that follows the deputations, led by Councillor Shelley Carroll, is well worth a listen. Our deputations start at about 2 hours 47 minutes into the meeting, the video of which is here.

Before sharing a bit of commentary on my deputation and its transcript: what is the Toronto Police Services Board, exactly? From the City's website:

“Established in 1834, the Toronto Police Service (TPS) is the police force servicing Toronto and includes approximately 8,000 full-time and part-time uniform and civilian members. TPS members include frontline police officers, criminal investigators, community response officers, parking enforcement officers, communications operators, court officers, civilian specialists, and support staff.

The Toronto Police Services Board's mandate is legislated by the Community Safety and Policing Act (CSPA) and includes general management and setting of policing policy. Generally, the board's role in shaping the structure of policing is very broad, limited by legislation only in the realm of day-to-day operations.

Board responsibilities

The board is responsible for:

  • the administration of the police service
  • ensuring that adequate and effective policing is provided in Toronto
  • establishing policies, strategic plans, and annual reports as prescribed under the Community Safety and Policing Act
  • employing members of the police service
  • adopting a diversity plan for members of the service
  • conducting an annual review of the chief of police’s performance
  • monitoring the chief of police’s handling of discipline
  • ensuring police facilities comply with prescribed standards
  • giving directions to the chief of police in accordance with the Community Safety and Policing Act and regulations made under the act.

Board size and composition: the board consists of 7 members.”

The transcript of my deputation is below (lightly edited for readability and um reduction). In the days that followed giving that deputation, I had two conversations that circled around just how little transparency can ever functionally accomplish as a regulatory or policy approach, in the technology realm and beyond. Which is funny to me, because transparency was part of what I requested more of in my deputation. Narrowing in so tightly on transparency is a very particular way to completely ignore everything that one then has to contend with from that starting point: from history to power dynamics to an ongoing lack of accountability with procedures that are already open and transparent.

Transparency is a neoliberal remedy for technology problems because it not only allows a context-less way of thinking about power, it encourages it. When given too much weight as a stand-alone measure, transparency asserts a naive assessment of the way existing tables are set in terms of public time, money, capacity, and so on. Neoliberalism's cultural tendency is to accelerate (and ignore) the impacts of ever more narrow and siloed ways of thinking and doing. Transparency without ten other bigger and harder things to accomplish (access to justice, anyone?) is a fairly hollow method.

I share these thoughts because this is a good example of how no single part of the approach to more public power and control is worth nothing, but also of how vital it is not to put too much stock in any one way of resolving problems. Transparency as a regulatory approach in technology is being asked to hold far too much, given its shortfalls.

I think transparency is an important thing to always press for, but it is also such a marginal part of the larger puzzle that it always has to be put into its appropriately sized box (read: small). Transparency on process and procedure is likely more fertile ground than product or service transparency, but this too runs into cultural challenges and a lack of comfort in operating as openly as any public entity should. See also: open government.

Below are my remarks, which include pointing out the mass theft of public data upon which much of the technology in question has been built, and the small irony of that in the context of a police board meeting.

Deputation — May 31, 2024

Thank you for the opportunity to join you.

I just have a few thoughts and I think a couple of requests. And I say this as someone who's been looking at the regulation of software and artificial intelligence, and at governance.

Just to try to zoom us out for a minute, I want to say that it's great that the Toronto Police Service has been on this topic for a couple of years already. But one thing to make clear on all the policy and procedure is that internationally and nationally there's still a lot of confusion on how to approach regulation.

One of the things that I think is very germane to this gathering is that a lot of the basis of artificial intelligence and machine learning is fundamentally built on, for lack of a better term here, mass theft of people's data. A lot of use of data that there was never appropriate consent for. I know that's a bit, you know, way up, but I don't think it should be lost on us. It's sort of incoherent with a lot of the other things that are part of thinking about police operations. So let's just pin that to the side.

I also want to share that because that's still a feature of what people are trying to understand. If we need to do access to justice or think about regulation, what do you do if the foundation of the model or the product or the tool is itself open to challenge? I just want us all to be aware that we're not standing on stable ground in terms of the basis of the construction of these technologies, and that's an internationally and nationally recognized problem. So I just want to leave that there, because we have to think about what we're using, and how to be accountable for it.

It's nice to hear this thinking about how to approach technology more generically, because I think artificial intelligence is going to be a moving target. It's an attractive marketing term right now, because there's a lot of money related to it. As soon as it's too hot, and it's getting pretty hot, it's going to be called something else. And so you really have to be careful from a definitional perspective. And I really like the thinking here: you're thinking about software, about automation, and about technology a bit more broadly. Because with the work you've done here, you might want to be able to pull it back a bit, and into the future, for things that might not be called artificial intelligence. So that's helpful.

Which brings me to close on the specifics of the procedure that we see before us today, in terms of supporting access to justice and people's access to their rights.

It's really well known that people need to know a tool is being used in order to have any way to say: hey, I think my rights might not have been respected. And so I just want to really double down on the idea that transparency is so critical here, and that even this mention of something covert is so important, because people are trying to figure out right now, in the regulatory and legislative spaces, how to regulate. We're really dealing in contexts where people don't know that a tool is being used on them. In that case there is no path to access to justice; it can't be walked if someone doesn't know [the use of] a tool is in motion. So this really gets down to the exemptions that are referenced here, which I would assume follow on from the policy.

In terms of “Product is covert in nature”: so you're going to speak to the type of product. Well, that might not be adequate for someone to be able to understand and study whether there's a rights case to be made in terms of how that product or service was developed. So just keep that in mind. I'm sure you've thought about it before.

The reason I'm saying this to you again now is that, speaking to people just in the last month, this is really the way that people in legislative circles are trying to think about how we make sure people have a path to access to justice. Transparency keeps coming up again and again; it is imperfect but absolutely fundamental. And so I would really encourage more specificity here if it's possible.

And the same with the second exemption — these suggestions are both referring to clause or section 24, on the fourth page of the report — that public disclosure of the product or details of the product, as per MFIPPA, would stop full disclosure.

We really have to think about this: instead of going to the low bar here and being merely compliant, I think we have to think about going beyond it, being so proactive on disclosure, because this is really the bleeding edge of how people are trying to think about regulation.

I understand there's been a lot of thought on this already. I applaud it and I'm grateful for the work being done publicly. But more documentation and more proactive disclosure, over and above minimum requirements, is really going to help everyone out here. Thank you.

Screenshot from: https://www.youtube.com/live/wO-d9OBD8wM
