Digital products touch millions of people and change the way we live — just like policies do
Recently the state of California made a small tweak to a decision-making system. The change was seemingly innocuous, but in fact hugely consequential: rather than using census tract data to prioritise vaccine resource distribution, the state used zip codes as a proxy for need and demographic distribution. This tiny decision has affected millions of marginalised people who are already bearing an inequitable share of the pandemic's pain. The state knew the decision would have this effect, but chose zip codes as an input because it was operationally easier, meaning the vaccine rollout would be faster.
The ACLU's retrospective analysis of the way tiny choices can radiate harm got me thinking about how technology functions as a proxy for policy.
Technology as a proxy for policy
With the building of technology comes the development of new policies: policies are set in the way UX is designed, in the way models are trained and reinforced, and in the way the impacts of all these things are evaluated. And then, of course, policies are born out of how those evaluations might affect future development.
Procuring, building, and using a piece of technology is more than a technical exercise. It is a process of encoding political choices, building systems that can deliver on those decisions, and then managing the integration of that system into the brick-and-mortar work of the state. Going back to that ACLU piece linked above:
"When algorithmic or other technological systems are part of public policy, the public has a right to know what information is being used and why, how they work, and what the consequences will be for every person touched by the system." Jacob Snow, ACLU
Law isn't code, so turning law into code requires translation. And in this translation there is power. But it is translation without representation. The technical and design teams that do the encoding may be working to build something that respects the spirit of the law, but more important than that intent is the effect: in practice, they make decisions that matter.
Essentially, the complexity and uncertainty of digital system deployment mean that downstream, the 'choices' made about these systems are policy.
I think states should take more control over the explicit translation of policy into technology; they should outsource to technology only what is necessary to deliver on that policy. We need to recognise that technology is both strategy and policy. So, once the digital ball is rolling, we need to adapt governance practices to accommodate and support policy-making that is appropriately representative.
The question I've been noodling on is: how would we even do this? If we think of technology development as a part of the policy-making process, even after a policy becomes law, how would we govern it?