The Trouble with ArriveCAN lives in older unresolved politics of public tech
I’m still doing background reading to understand and validate the origin story of ArriveCAN. The point of this writing, and of the broader effort to draw attention to the app, is to compel the government to shift its use from mandatory to voluntary. Please see this thread for actions you can take.
Today’s post is about three things:
1. ArriveCAN and the administrative state
2. The role of the Public Health Agency of Canada vs. the role of the Canada Border Services Agency
3. The role of the federal privacy commissioner in this and other public technology decisions
ArriveCAN and the Administrative State
You may have heard the saying that we’re “sleepwalking into a surveillance society”. The launch of ArriveCAN helps explain the role of the state in this equation. If you were to ask many Government of Canada employees about ArriveCAN, they would likely tell you that they are trying to save you, and all of us, time at the airport. That this is not only well within their mandate, but that it is what people want them to do. This is an efficiency argument. Some people agree with it. It is often used to introduce technology in the name of reducing burden. There are times when the argument makes a lot of sense.
What we also know, in 2022, is that a number of equity-related issues are part and parcel of digitization. What is happening with ArriveCAN is a government accelerating digital transformation while pretending not to know about those it is harming or putting at heightened risk through this acceleration. But they do know. And here we turn to the metrics used in this case to explain how they get away with it.
There are numbers used to defend the mandatory use of the app: percentages of people with smartphones, percentages with access to the internet, minutes and seconds saved, the volume of people crossing borders, the number of people contacted under the Quarantine Act. These are the numbers of the rational administrative state. They are put into communications and get repeated in the press. They form the narrative, and they all sound rather mundane and defensible.
What do we not have numbers to measure? Emotional experiences. Fear. Uncertainty. Lack of trust. Lack of oversight. Lack of governance. Lack of information about the app. Future purpose. Impact on rights, and on people who struggle at the border. All of these stories are either delayed or ignored in the administrative narrative frame.
This is a complex topic, but it raises a question we have not grappled with adequately at a societal level. The extension of the administrative state through modernization and digital transformation is made to seem mundane, but its current and future impacts are anything but.
There is political power in embracing inefficiency as an ideology to resist what is being done under a mundane rationale. Resistance in this case is as simple as requiring the app to be voluntary, and making sure there are alternate channels and modes of service. Building these up over time is an investment in our future choices: how our public services work, who provides them, and the shape and nature of that work.
Public Health Agency of Canada vs. Canada Border Services Agency
The combination of these two institutions in this situation is problematic for a few different reasons, some intuitive, some less so. There is a somewhat unstated, and to most of us unknown, culture within government of deference to Health Canada and the Public Health Agency of Canada. This generally makes sense. It does not make sense when it comes to defining technology choices. So the less intuitive problem with this app is that while it is formally owned by the Canada Border Services Agency, the mandatory element of its use was a requirement driven by the Public Health Agency of Canada in support of the Quarantine Act.
Does the Canada Border Services Agency have an ethical and public service duty to challenge the mandatory use of this technology? I would argue yes, but I am also well aware of the culture and history of the organization. Under the cover of a public health rationale, the agency is now amassing a high volume of users of the app without having to go through the usual trust-building and voluntary adoption process. In this way the Public Health Agency of Canada is acting as a handmaiden for the Canada Border Services Agency, with future consequences the latter may or may not realize. And while the requirement for a mandatory app is coming from one part of government, the implementation is happening in another: the IT realm of the Canada Border Services Agency.
Here we get into another unintuitive part: how the culture of the public service, combined with technology projects, enabled the decision to make ArriveCAN mandatory. The best designers within the state defend against its worst. That is not happening here. Within the public service there is pressure for public technology to be judged by the same metrics as consumer technology. A shiny, flashy, “look at us, we can be in the app store top rankings too” kind of thinking. It’s hard to believe, I know, but this is genuinely a significant element of how these technology incidents occur.
There is a deep cultural and structural issue related to perverse incentives in the public service and technology. There was a celebratory tone among some of the senior public servants involved in the app about how “successful” it is, seemingly oblivious to the fact that people don’t have any choice but to use it. In practice, this means public servants celebrating the app being high on the download charts in the app store. That metric is being used to celebrate the “success” of a mandatory app during a public health crisis.
There are clearly problematic incentive structures in play within the government in terms of what success looks like for public technology. It is not straightforward to try to address this from the outside. A possible consequence of moving this app from mandatory to voluntary use, as seen through the lens of those in charge of it, is a reduction in its use. To them this is a negative outcome. Senior public servants are not currently incentivized to support the correct technology stewardship approach.
The government is undermining itself again on the public tech front with ArriveCAN. To deal with this, and future versions of it, we’ll need more focus on public service administration. These problems live and die in public service operations. They will not be solved in legislation. And they will definitely not be solved by adding more public sector computer science capacity.
The Role of the Privacy Commissioner
This is but a short note to flag something troubling, but not entirely surprising. In 2020, the federal privacy commissioner, along with other provincial and territorial commissioners, issued a joint statement about pandemic tech. They were focused primarily on COVID Alert, but their caution extends quite generally and ties back into the administrative state rationale. They said that technology should be voluntary, not mandatory, in order to build public trust.
However, in a recent order paper, the privacy commissioner apparently had no concerns with the app (see below). Never mind that two columns over sits a clear issue to be concerned about: “ArriveCAN has no estimated end date”. There is nuance here, because what matters is the mandatory use rather than voluntary use, but all the same: the failure to pull out the mandatory-use component and address it is concerning.
There are plenty of questions here. Is this another case of legal confusion, where mandatory data collection, permissible via the Quarantine Act, has been conflated with the mandatory use of technology? It appears so, but who, on behalf of the public, is supposed to be taking this issue up with the government if the privacy commissioner does not seem to think it is their problem? Who should have told the Public Health Agency of Canada, “no, we’re not making this mandatory”? Who, or what process, should have created that friction?
What we can see here is that the systems set up to provide public oversight of data and information (i.e., privacy commissioners) can’t deal with techno-determinism. They sit past, in a linear sense, the decision point where we need to intervene much more actively to stop or shape the implementation of technology. This is a policy and political stance; it is not a right-or-wrong kind of conversation. This is why we need to consider and understand if and how we may want to participate in a politics informed by inefficiency as refusal, and how best to advance those arguments, rather than getting mired in administrative rationale.