
Transparent Design Is A Matter Of National Security

This week, a group of experts from MIT exhorted President Trump to make our digital infrastructure stronger–and design featured prominently.


A group of MIT researchers wants President Trump to know just how immensely fragile and hackable the country’s infrastructure really is. In a new report published on Tuesday, they call on Trump to take “immediate action” on threats to everything from internet infrastructure to the electrical grid. One notable concern amongst many raised in the 50-page report? That the software that controls infrastructure is too complex and opaque for humans to understand–essentially, a dangerous design flaw.


For the past year, MIT’s Computer Science and Artificial Intelligence Laboratory and its Center for International Studies have held workshops with experts from across a range of infrastructure-critical fields, such as gas, finance, energy, and communications. They were trying to get a sense of the biggest risks in each type of infrastructure. One issue that appears again and again is the fact that software and systems are so complicated that we can’t tell when they’ve failed.

In finance, one participant talked about designing algorithms that human users could actually understand. We must “learn how to interrogate algorithms,” the report contends. “Absent the ability to do so, algorithms would become increasingly autonomous and beyond human control.” In another workshop with experts on communications and network security–essentially, internet infrastructure–participants said that systems needed to be less opaque, that “no failure should be silent to the operator.” MIT sums up the problem succinctly with a volley of research questions that the government needs to consider:

Can a system be designed so that its failure would be immediately transparent to its operator? Can the state of the system’s algorithms be made understandable to humans? Would it be cost-effective to impose audit requirements on that kind of system? (E.g., if a driverless car ran off a bridge, could its control algorithm be made to explain why it did that?) If so, why don’t we mandate that kind of auditability in critical sectors?
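To make those questions a little more concrete, here is a minimal sketch, in Python, of what a decision system that never fails silently and can explain itself might look like. Everything in it (the AuditLog class, the decide_braking function, the braking figures) is a hypothetical illustration for this article, not code from the MIT report or from any real control system.

```python
# A toy illustration of "no failure should be silent to the operator" and
# auditable decisions. All names and numbers here are hypothetical.
import json
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("audit")


class AuditLog:
    """Append-only record of every decision, so an operator can later ask 'why?'."""

    def __init__(self):
        self.entries = []

    def record(self, inputs, decision, reason):
        entry = {"time": time.time(), "inputs": inputs,
                 "decision": decision, "reason": reason}
        self.entries.append(entry)
        log.info("decision: %s", json.dumps(entry))


audit = AuditLog()


def decide_braking(obstacle_distance_m, speed_mps):
    """Toy control decision that always records an explanation and surfaces failures."""
    inputs = {"obstacle_distance_m": obstacle_distance_m, "speed_mps": speed_mps}
    try:
        # Assume roughly 7 m/s^2 of braking deceleration for this sketch.
        stopping_distance = (speed_mps ** 2) / (2 * 7.0)
        if obstacle_distance_m < stopping_distance:
            decision, reason = "BRAKE", "obstacle closer than stopping distance"
        else:
            decision, reason = "CRUISE", "obstacle beyond stopping distance"
    except Exception as exc:
        # The failure is reported to the operator and logged, never swallowed.
        decision, reason = "FAILSAFE_STOP", f"controller error: {exc}"
        log.error("controller failure: %s", exc)
    audit.record(inputs, decision, reason)
    return decision


print(decide_braking(obstacle_distance_m=20.0, speed_mps=25.0))
```

The point of the sketch is the audit trail: because every decision carries its inputs and a stated reason, the driverless-car question in the report (“could its control algorithm be made to explain why it did that?”) at least has somewhere to look for an answer.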

One thing that’s not clear is whose responsibility transparent design should be. Should the private companies that control infrastructure simply hold themselves to a loose standard? Should the government create laws that regulate it? Many experts have argued that even basic “algorithmic literacy,” or education about what algorithms are, how they work, and how they impact society, is an absolute necessity today. Until we all understand the implications of these systems, the issue of regulating them won’t draw enough attention from the public.  

We’ve heard unsettling anecdotes along these lines before: that some algorithms are so complicated that even their creators don’t understand why they act the way they do. It’s an eerie idea, but one we’ve heard discussed mostly in terms of consumer products like apps and social networks. MIT’s report drives home the notion that the problem isn’t just applicable to the products we use on a voluntary basis. Opaque design, especially when it comes to algorithms, threatens the macro-level systems on which nearly everyone in the country depends.

The biggest–and maybe most painful–challenge the report calls out? Getting anyone to care. “The question the nation faces is therefore this,” the report states. “Are we condemned to remain in this unstable and insecure condition, in which the best we can do is to repeat urgent but futile warnings from high places and, at the operational level, merely to refine our tactics in a losing game of Whac-A-Mole?” 

It’s a question that many experts, from climate science to national security, probably find themselves asking these days. Thoughts? Solutions? Screeds? Send us an email at codtips@fastcompany.com.


About the author

Kelsey Campbell-Dollaghan is Co.Design's deputy editor.
