The “recommended for you” boxes that appear on sites like Amazon and Netflix can be so accurate it’s eerie. Sometimes they manage to present exactly the thing you were looking for–or the thing you didn’t know you were looking for until you saw it.
Inspired by the pervasive and sometimes creepy effect of the algorithms that power these recommendation ads, Scott Kelly and Ben Polkinghorne–a pair of advertising executives who moonlight as artists–created a series of signs that bring these suggestions into the real world. “Perhaps you’re worried that as your life moves online and Alexa moves into your living room, your decisions are essentially being made for you,” the duo writes about the project. “Perhaps you’re worried you live in a bubble. Perhaps you’ve never thought about it.”
The project, called Signs of the Times, consists of large signs at four popular locations in the pair’s home country of New Zealand, each recommending similar locations to visit. One, set against the volcano Mount Taranaki, suggests Mount Cook, the obelisk atop One Tree Hill, and the massive rock Uluru as similar locales. Another stands by the ocean in Back Beach, New Plymouth, recommending Matapouri Bay, the Mamanuca Islands, and Erawan Falls.
The signs aren’t meant to help tourists decide where to go next. According to Kelly and Polkinghorne, the project is meant to provoke–to encourage visitors to think about how they make decisions in the age of Amazon. Did you choose to buy this speaker because an ad presented it to you at the right moment? Or because that’s the speaker you really wanted?
Kelly and Polkinghorne say that they decided which places to include on each sign by making assumptions–particularly the sign on a playground in Westown, New Plymouth. “If somebody was going to be visiting a playground, perhaps they’re a child,” Kelly says. “What other things would they be interested in?” The pair decided to suggest McDonald’s, which Kelly loved as a kid.
But their decision to base the recommendations on assumptions has an interesting resonance with the algorithms you’ll find online. At least at first, those algorithms encode the assumptions of the people who wrote them. Then they rely on user data to test whether those assumptions hold, and are updated when they don’t. But, fundamentally, those initial assumptions remain deeply encoded, influencing what you see and even what you buy.
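That assume-then-correct loop can be sketched in a few lines of code. This is a toy illustration only, with invented items and numbers (it echoes the playground example, but no real platform works this way): hand-written "prior" scores stand in for the author's assumptions, and observed click-through rates nudge those scores toward what users actually do.

```python
# Toy recommender: seeded with a human's assumptions, then corrected
# by observed user behavior. All names and numbers are invented.

# Hand-written assumption: playground visitors might like these.
prior_scores = {"McDonald's": 1.0, "Toy Store": 0.4, "Zoo": 0.6}

def update_scores(scores, clicks, learning_rate=0.5):
    """Blend each item's assumed score with its observed click rate."""
    updated = dict(scores)
    for item, (click_count, impressions) in clicks.items():
        observed_rate = click_count / impressions
        updated[item] = ((1 - learning_rate) * updated.get(item, 0.0)
                         + learning_rate * observed_rate)
    return updated

# Observed data contradicts the assumption: the zoo gets far more clicks.
observed_clicks = {"McDonald's": (5, 100), "Zoo": (60, 100)}
posterior = update_scores(prior_scores, observed_clicks)

top_recommendation = max(posterior, key=posterior.get)
```

Note that "Toy Store" never appears in the click data, so its score is never corrected: items the original assumptions undervalued may simply stay invisible, which is the bubble the artists are pointing at.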
The duo hopes that their signs, absurd as they might appear in the real world, will expose the pitfalls of letting an algorithm dictate what you watch, what you see, and even what news you read.