Trust Not In Tech
Never. But the Handle of Trust is Good to Grab to Talk about Truths.
The “trust” and tech conversations — much like “ethical,” “good,” etc. — can get inane quickly. But talking about trust in this moment of democratic confusion is both helpful and necessary. Like many things in the tech space, trust is a highly useful handle to grab: it lets us get much tighter on what we mean when we say “trust in tech,” pry open adjacent conversations, and talk about tactics for building trust, why we need them, and more.
Next week, a very thoughtfully organized set of conversations (thanks, Ana and Jess :) ) will make space for these and other discussions at a free event called AI in the City: Building Civic Engagement and Public Trust. If you can join us on Feb 10, please do. As that conversation draws nearer, I’ve been preparing, and here are a few of the thoughts it has prompted.
- There should never be trust in the state. In this moment, there is a persistent discussion, in our spaces as well, about rights and digital rights. And there is also, particularly with AI, a lot of conversation about accountability. But rather than trying to make tech accountable, add more rights, or reshuffle the social contract, it makes sense to think about the work it takes to hold the state accountable for delivering on the rights we already hold. This can be called access to justice. A lot of this work is ongoing, but the point here is to acknowledge and accept that this work is never “done”. There is no law, policy, or set of changes to advocate for that, once won, will have established earned trust in the state. There is significant work, therefore, in building the persistent capacity to hold the state accountable. Always. To make that practice constant, and to direct energy to the growth of relationships and tools to enable it.
- The state, to be just, must offer up its operations for accountability. I will always identify myself as an open government advocate because it is not a hopeless cause to believe in a better state and in better public institutions. And I must be careful not to minimize or erase the thousands of public servants who uphold and commit to serving the public dutifully and ethically. The point here, and this ties directly to technology but to so much else too, is that the only way we can constantly improve public institutions is by knowing what they are doing and how. This is why the centralization of power in governments, emergency orders, and so many other ongoing trends are anti-public. They evade historical checks on power, and do so in the name of the public, when they are generally anything but pro-public. So here, the suggestion is that for any subject, whether software or solitary confinement, there is protocol written down. There is process. There are administrators. There are unionized employees. There are ombudspeople. These things are not entirely buried yet, and the more we can agitate for the details of each small piece of the machinery, the more we can draw it out into public view for conversation, examination, comparison to laws, potential legal action, union action, journalism, and more. Let’s consider how the movement for trust in technology, and its call for transparency, applies to the broader systems and processes of government. When we do this, we can move attention fluidly among a much broader range of subjects than software.
- Trust in software is about stripping context, which is part of why it’s a shell game. The problem with trying to vest trust *in* software, like any other tool, is that you can never say how it is going to be used. You can never know how it is going to be modified. This is why the site for the development of trust has to be the people and processes that use the tools, not the tools themselves. There is a significant and growing industry of software and AI auditing. I know some very thoughtful people involved in this work, and without looking deeply into it, I can guess that the best auditors will be looking well beyond the tool and deeply into the organization that is using it. This is another example of using the word and its intent to shift into adjacent spaces, processes, and protocols. There is also a flip side to this context piece, which may be a pointer for the open government work.
- Where can software/AI be deployed in a pro-social way, to create efficiencies for public(s) seeking accountability? I’m a long-time fan of, and volunteer for, a project called The Computational Democracy Project, which supports the use of pol.is in democratic process. I was introduced to this crew a long time ago by Patrick Connolly (thanks always, patcon :) ). One of the things I’ve known and lived with for a while is that there is significant under-investment in public engagement processes and in capacity building for self-governance. This makes sense, because investing in these things shifts power. Pol.is is an interesting case of the use of AI/software because it seeks to help people engage in discourse at a larger scale than can be pulled off in a physical setting. As someone who was trained in, and remains committed to, the practice of face-to-face dialogue and conversation, both for public engagement with the state and for the growth of self-governance capacity, I’m also deeply aware of the limitations and constraints it can impose. These are mostly good limitations and constraints, because one-to-one communication is rife with misunderstandings, and communication at scale is tricky as hell and fraught. BUT if we begin to explore the ways we can use software differently, for pro-social purposes, we might find some interesting outcomes, and these are as important to our future work as the defensive work of mitigating and abolishing harmful applied mathematics. Take a read here of cases of pol.is in practice. Beyond pol.is, I think often about how to create the right units of efficient engagement. How can we use automation and software to pull, spider, flood, engage, etc. the parts of our public systems that have been built around administrative burden? How do we apply tools to reduce secrecy? How do we create more ease for engagement at the right points?
Tabs Toronto is another great small example: an easy way to get notified when issues you care about go to city council. There is space here to review the civic tech tools of 1.whatever (which often bit off more than they could chew) and to think about civic tech 1.1.whatever as much more intentionally focused on accountability. This also relates to the open government work of doing systems auditing, trying to uncover how the status quo operates.
- Trusting each other is the hardest part. We all know both how hard trust is to come by, and how beautiful it is when it functions. Trust creates speed in appropriate places. Lack of trust creates friction, which is where we are right now with the state; it’s where we’ve always been, and this should be understood as a constant condition, not a problem to be solved. I don’t trust the state because I see how it has both created current conditions and pretended they can’t be solved. These current conditions, built as our world is on colonialism and slavery, have accelerated the climate crisis, and now we stare at a seemingly overwhelming global governance problem. It’s not overwhelming when we trust each other to work adjacently. It is overwhelming when we think we can or should all agree. This is a broader set of questions to think about — how to function together but separately, in collective work but not in identical work. To trust that many concurrent ways are necessary to be in better stead next month, next year, and until we’re dead. Seeking universal methods and tactics and systems feels to me like a dead end. I’m glad people hold anti-capitalist views. I’m also glad people are working on transition, on the from-here-to-there parts. Trusting each other will grow the more we get to interact with each other and understand our differences. This is part of abolition: self-governance and the freedom to set and reset agendas. To work on this all persistently, and to shift the goals we have for ourselves collectively in our norms, the place that is stronger and more powerful than laws and policies. Our culture around trust can be built on truth, but truths are many, and we need more arenas to get into them together. This is getting abstract, so enough for this morning; my fingers are cold. Hope to see you next week ❤