If you work in software or design in 2016, you also work in politics. The inability of Facebook’s user interface, until recently, to distinguish between real and fake news is the most blatant example. But there are subtler examples all around us, from connected devices that threaten our privacy to ads for high-paying jobs that are shown to men far more often than to women.
Digital services wield power. They can’t be designed simply for ease of use–the goal at most companies and organizations. Digital services must be understandable, accountable, and trusted. It is now a commercial as well as a moral imperative.
Power and politics are not easy topics for many designers to chew on, but they’re foundational to my career. I worked for the U.K.’s Government Digital Service for five years, part of the team that delivered Gov.uk. I set up the labs team at Consumer Focus, the U.K.’s statutory consumer rights organization, building tools to empower consumers. In 2007, I cofounded the Rewired State series of hackdays that aimed to get developers and designers interested in making government better. I’ve also worked at various commercial startups including moo.com and ScraperWiki.
The last piece of work I did in government was on a conceptual framework for the idea of government as a platform. “Government as a platform” is the idea of treating government like a software stack to make it possible to build well-designed services for people. The work involved sketching some ideas out in code, not to solve the problems up front, but to identify where some of the hard design problems were going to be. Things like: What might be required to enable an end-to-end commercial service for buying a house? Or what would it take for local authorities to be able to quickly spin up a new service for providing parking permits?
With this kind of thinking, you rapidly get into questions of power: What should the structure of government be? Should there be a minister responsible for online payment? Secretary of state for open standards? What does it do to people’s understanding of their government?
Which cuts to the heart of the problem in software design today: How do we build stuff that people can understand and trust, and is accountable when things go wrong? How do we design for recourse?
Journalist Anthony Sampson published a book on the subject of power in the U.K. roughly every 10 years, starting in the ’60s. Each book included a diagram with bubbles denoting different spheres of power; the bigger the bubble, the more power it held. In 1962, the aristocracy was still significant enough for a mention. In the early ’80s, the civil service loomed large, and trade unions and nationalized industries also featured.
In the 2004 edition, the center of government (the U.K.’s prime minister) was shown as much more powerful, as was traditional media.
I like these diagrams because they demonstrate a couple of things clearly. First, they reduce things down to an abstract landscape of power. There is only power–be it commercial or political, accountable or unaccountable. Second, they show that power is mutable: It changes over time, and it can be changed, and that can happen quite rapidly, too (see the diminishing power of unions). But fundamentally, they show that politics is about the distribution of power in society.
In the second decade of the 21st century, digital services–code and design–are changing how power is distributed. Some examples:
- Society’s ability to regulate industries effectively is limited by its ability to access and understand code. This became clear during the VW emissions scandal, in which the company installed software on some cars to cheat emissions tests.
- Research from Carnegie Mellon University found that men were far more likely to see Google ads for high-paying executive positions than women.
- The decisions developers make about how they model ads on job-publishing platforms affect people’s ability to find work.
- Debates about workers’ rights are increasingly debates about code. The employee tracking and conditions that are coded into platforms like Deliveroo and Uber are changing what it means to be employed.
- Facebook’s dilemmas around news algorithms are well documented. The decisions it faces about censorship and quasi-regulation are the kind that, historically, only nation-states have had to deal with (though Facebook lacks the avenues for recourse that nation-states, at least in the Western tradition, have in place).
- Connected devices are redefining what privacy–a fundamental human right–means. And we are, in turn, being asked to trust opaque machine learning systems with that data.
- Uber is winning the battle for the future of public transport through a combination of legal brute force and amazing service design. But, long term, could it replace democratically accountable civic transport networks?
So, if politics is about this distribution of power in society, software is now politics. The decisions of designers and developers are a political force of their own. And we are asking users to trust us with more data, to allow code to make more decisions for them all the time.
You could see all of this in a negative light. Run for the hills and disconnect! Reality or nothing!
I have another idea: It’s time to stop designing digital services to just be easy to use and start designing them to be understandable, accountable, trusted, and easy to use.
“It just works” is not good enough anymore if you want your users to trust you with more data and make more decisions on their behalf; if you want users to start trusting their data in your machine learning system; and if you want users to trust your device in their house. This is the fundamental design decision facing people building digital services right now.
So, we need to figure out how to wire these attributes–understandable, accountable, trusted–into the services and institutions of the digital age. It’s time for some new design patterns. (Incidentally, we’ve been here before: The manufacturing and industrial revolutions led to new institutions and practices that took the work of entrepreneurs and put it to work for society.)
These are the four areas I think deserve our attention:
1. Accountability at the point of use
Make accountability and transparency part of the design of services. Imagine if Uber made it clear, directly on the email receipt, exactly how much a driver earned and whether it met a living wage. Or if Amazon made it possible to see a product’s supply chain and environmental impact when you bought it. Or if Deliveroo showed government food safety inspection data next to your delivery order.
Google has started to show some work in this direction: Google News now explicitly marks fact-checked articles, and Facebook is doing the same. This will only get more important as superficially good design abstracts away how things actually work. Maybe you should be able to ask Alexa, “Did the people who assembled you have the right to paid parental leave?”
2. Expose the rules
If we want to change policy, we have to understand how code works. One obvious way is to examine the source code directly. The U.K. government increasingly opens its code: I was part of the team that drafted the Digital by Default Service Standard a few years ago, which includes a requirement to open up the source code of new projects. Other governments, including the United States under President Obama, have made similar commitments.
Of course, not everyone can read code, and there are many circumstances in which organizations will not want to release their source code (intelligence systems, for instance). Maybe we can look at other parts of the software development toolchain to help expose the rules.
Perhaps if services published their tests, it would help people understand how those services work. After all, Gherkin syntax is designed to be readable by non-coders: a test written in Gherkin could spell out, in plain language, the rules that decide who gets a free prescription in the U.K.
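As a sketch of what that might look like (these are simplified scenarios written for illustration, not the NHS’s actual published tests), two of the U.K.’s free-prescription rules could be expressed in Gherkin like this:

```gherkin
Feature: Free prescriptions
  Not everyone pays the standard prescription charge.
  These scenarios describe two of the groups who qualify
  for free prescriptions in England.

  Scenario: Patient aged 60 or over
    Given a patient who is 60 years of age or older
    When they are issued a prescription
    Then the prescription is free of charge

  Scenario: Patient under 16
    Given a patient who is under 16 years of age
    When they are issued a prescription
    Then the prescription is free of charge
```

A non-coder can read this and understand the rule being enforced, without ever seeing the code behind the service.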
You can’t “view source” on Siri or Google Now. But as software agents of one sort or another (bots, digital assistants, news feed algorithms) start to make more decisions for us, publishing software tests may be useful for making bots more transparent.
Tests are only one way of exposing how code works. We could also see the emergence of software deposits for private code–deposits that consumer rights organizations or government inspectors would have the right to audit. Some of this already happens in the gambling industry. For bots that use machine learning, publishing training sets–what the bots use to learn–will also become important.
3. Reimagine permissions
We are now pretty used to apps asking for permission to use our cameras or access our location. But we are not yet used to the idea of different services exchanging data about us, beyond maybe an email address. Why would we want that?
Transparency is part of it. But it also puts users in control of their data. Giving users a history of who has accessed their data is the best protection organizations have against fraud, and the best protection users have against misuse.
In a government context, that would mean explaining to users exactly what their data is being used for, in a way they can understand.
That is not how government works at the moment. But we are going to have to figure out design patterns for exchanging new types of data in ways people understand.
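A minimal sketch of the access-history idea, in Python. The service names and record shape here are invented for illustration; the point is only that every read of a user’s data is logged with a human-readable purpose, and the log can be shown back to the user:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    """One read of a user's data: who, what, and why."""
    accessor: str   # hypothetical service identifier, e.g. "dvla.parking-permits"
    attribute: str  # which piece of data was read, e.g. "home_address"
    purpose: str    # plain-language reason, shown to the user verbatim
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class AccessHistory:
    """Append-only log of every service that touched a user's data."""

    def __init__(self) -> None:
        self._events: list[AccessEvent] = []

    def record(self, accessor: str, attribute: str, purpose: str) -> None:
        # Services call this on every read; nothing is ever deleted.
        self._events.append(AccessEvent(accessor, attribute, purpose))

    def for_user(self) -> list[str]:
        # What the user would see on a "who has used my data?" page.
        return [f"{e.accessor} read {e.attribute}: {e.purpose}"
                for e in self._events]
```

Used like this, `history.record("dvla.parking-permits", "home_address", "checking you live in the permit zone")` leaves a trail the user can inspect later, which is exactly the kind of recourse the current permission dialogs lack.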
This is a hard problem, and a moving target: more data and more devices arrive all the time, and we are already struggling with just a few of each.
There is a parallel here to how the tools we needed to organize the web had to change over time. Back in the late ’90s, a hierarchical list of categories was a perfectly acceptable way to organize the web for people, but as it grew we needed better and better search, then personalized search, then machine learning-assisted search. When it comes to permission systems, we are still at the equivalent of the Yahoo home page.
4. Digital tools for digital consumer rights
For users to really trust stuff in the digital world, they need trusted third parties to do some of the hard work for them. And this means giving elbow room to some new digital watchdogs.
We are all familiar with the idea that new technology results in new regulatory institutions. In the United States, it took Ralph Nader’s Unsafe at Any Speed to change the law around car safety. In the U.K., the Consumers’ Association was set up to test the products of the manufacturing revolution.
What will the watchdogs of the digital age look like? Can the tools that we use to develop software become the tools of consumer watchdogs?
Environmental campaign groups might start automatically checking open government data for breaches of regulations. An app on your phone might verify the food safety rating of the restaurant you just walked into. We could see third parties actively verifying data sets and checking facts (as Facebook has done).
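A watchdog of this kind could be very little code. Here is a hedged sketch: it scans a published open-data feed of hygiene inspection results and flags establishments below a threshold. The record shape is invented for this article; a real checker would consume whatever schema the regulator actually publishes:

```python
def flag_breaches(inspections: list[dict], minimum_rating: int = 3) -> list[str]:
    """Return the names of establishments rated below the acceptable bar.

    Each inspection record is assumed (for illustration) to carry a
    "name" and an integer "rating"; higher ratings are better.
    """
    return [
        record["name"]
        for record in inspections
        if record["rating"] < minimum_rating
    ]


# Invented sample data standing in for a regulator's open-data feed.
sample = [
    {"name": "Cafe A", "rating": 5},
    {"name": "Takeaway B", "rating": 1},
    {"name": "Restaurant C", "rating": 2},
]
# flag_breaches(sample) -> ["Takeaway B", "Restaurant C"]
```

Point a loop like this at a regulator’s published data on a schedule and you have the skeleton of an automated digital watchdog.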
We need to ask ourselves: In a world of Amazon, Facebook, and Uber, do we need a global consumer rights organization? Who is going to explain all this to users?
If you work in the digital industry, you occupy one of those bubbles of power. Like it or not, you work in politics. Questions of accountability, understanding, and trust are only going to get louder, and harder to solve, as we ask users for more data and ask them to trust code to make decisions for them. The organizations that understand this, and start thinking now about how to make services that are accountable, understandable, and trusted, will have the advantage.
This article was adapted, with permission from the author, from a presentation at the O’Reilly Open Source Convention.