One photo on display as part of James Bridle’s new solo exhibition at Berlin’s Nome Gallery, which opened on Friday, will undoubtedly inspire some empathy with the self-driving car. Set against Mount Parnassus in Greece, the car stands frozen in the center of a parking lot, circled by two rings of salt. The dotted line on the outside has invited it in. But the solid line on the inside prohibits it from going out.
Thanks to the logic of its machine-learning mind, the car is trapped—restricted by these two conflicting rules of the road and lacking the autonomy to override them.
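The trap turns on two ordinary road-marking conventions: a dashed line may be crossed, a solid line may not. As a toy illustration only (this is not the car's actual perception code), the bind can be sketched in a few lines:

```python
# Toy illustration of the trap's logic (not Bridle's code): under
# standard road-marking conventions, a dashed line may be crossed
# and a solid line may not.
def may_cross(marking: str) -> bool:
    """Return True if a rule-following car may cross this marking."""
    return marking == "dashed"

# The salt circle: a dashed outer ring and a solid inner ring.
outer_ring, inner_ring = "dashed", "solid"

can_enter = may_cross(outer_ring)  # the outer ring invites the car in
can_leave = may_cross(inner_ring)  # the inner ring forbids it to leave

print(can_enter, can_leave)  # True False
```

A car that obeys both rules without the judgment to break either one enters freely and can never leave.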
“[The photo] uses the bits of our environment that we share with the car,” Bridle says. “There’s a common language that originates with the human,” so a person can see what the machine sees.
Indeed, looking at the photo you can’t help but feel bad for a car that’s entrapped by its own learned behavior. Practically speaking, Bridle says, it is a classic example of a machine-learning technique that puts the machine in adversarial situations so it will learn how to overcome them in the future. Bridle has been using artificial intelligence in his work for over a decade—he’s known for his futuristic, technologically progressive works, as well as coining the term “New Aesthetic”—but he says machine learning has opened up even more terrain for his artistic work.
Using open-source machine intelligence software like Google’s TensorFlow and the self-driving car software Comma AI, Bridle is training his car to become autonomous and has opened up the code he has developed to the public on GitHub. While Bridle says he will never actually ride in a self-driving car that uses software written by him (“that would be a suicide wish”), he’s interested in what he is learning while training it. In the process, he has also developed an Android app for logging movement, speed, and steering angle for neural network training.
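The kind of training such logs enable is often called behavior cloning: sensor readings recorded while a human drives become the inputs, and the human's steering angle becomes the target a model learns to imitate. Here is a minimal sketch of that setup in plain NumPy, with invented features and a single linear layer standing in for a real network (this is not Bridle's published pipeline):

```python
import numpy as np

# Hypothetical behavior-cloning sketch (not Bridle's code). Each row
# pairs features logged while a human drives (here two invented ones,
# e.g. speed and lateral lane offset) with the steering angle the
# human chose at that moment.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                      # logged features
true_w = np.array([0.3, -1.2])                     # pretend human policy
y = X @ true_w + rng.normal(scale=0.05, size=200)  # logged steering angles

# One linear layer trained by gradient descent on mean-squared error.
w = np.zeros(2)
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= 0.1 * grad

print(np.round(w, 2))  # learned weights land close to the human policy
```

A real stack swaps the two hand-picked features for camera frames and the linear layer for a convolutional network, but the imitation objective is the same.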
The real purpose of the Autonomous Trap 001 piece, however, is more conceptual than practical. The salt ring is not meant to aid in his pursuit to develop a self-driving car. Rather, he wants to explore what happens when potentially dangerous or disruptive technology is introduced to society. “The two responses I explore here are learning to do it yourself, or working actively to resist it,” he says.
In other words, in our dystopian, machine-led future, humans may soon need to learn how to assert themselves over their vehicles. If you don’t learn to code, you should at least learn how to booby-trap it.