He has an oval face, a Roman nose, eyes that read honest (no, let's make that confident), and a relaxed appearance. Sounds like perfect late-night hookup material on Tinder. But in reality? He's your drone's next target.
This is the expectations-bending experience that is fiftyEight, a final project by RCA graduate Joanne Harik. It's an imposing machine positioned as an interactive dating game. You select attributes that you find attractive (never mind that the machined metal dials and piano-black console look like something out of a Cold War bunker), and the machine offers you nine matches to your profile.
Select the one that you think fits your dating criteria, and then, the twist: you're not going on a date. You're actually on a mission to assassinate this person, and you have 58 seconds to aim a drone their way, or change your mind and take out someone you think matches the profile better (or, let's be honest, maybe looks less beautiful?).
“This twist of context in the experience aims to highlight the fact that whether matchmaking, shopping, or hunting for terrorists, algorithms are based on similar logic but with a huge difference in applications and outcome,” says Harik. “Google, Amazon, Tinder all profile us, but we don’t stop and think about it. We don’t think of the algorithms that do it as offensive. But in the kill list scenario, the outcome from the relationship between the data analyst and the algorithm is life-altering.”
Indeed, few people realize that the charming face-identifying tech of companies like Facebook was actually born from military research. Or that the algorithms that match us with potential mates are not so different from those that select military targets. And with a simple shift in intent, those algorithms can go from classifying terrorists for automated drone strikes to hooking up two drunken college kids on a Saturday night.
That's one reason that Harik sourced everyday people, rather than terrorist photos, for the project. She filmed 150 different people in the U.K., Lebanon, Italy, the UAE, and the U.S. to build fiftyEight's fictional target list, specifically to defy stereotypes of terrorists. Some smile a bit, others scowl. But it's hard to imagine any of them as a criminal, let alone one planning an act of mass terror.
Harik wants her audience to stew in this chilling mix of precision and ambiguity inside the data-constructed persona. The name itself, fiftyEight, is inspired by the 58 days it takes U.S. authorities to order a drone strike on a suspected, algorithmically generated terrorist target, strikes that, by some reports, have mistakenly led to numerous civilian deaths.
“We tend to consider technology as a neutral point of reference, putting our trust into it. But algorithms are programmed by experts that unintentionally inject their bias into the system, also often remolded to correspond to people’s aggregated preferences and opinions,” says Harik. “The ubiquity of algorithm-dependent technologies surely facilitates our lives. But is this dependency causing us to over trust flawed systems?”
Indeed, whether it’s Tinder on your phone or the drone flying overhead, we’re surrounded by calculated bias. And the most bothersome part of it all is that we don’t necessarily self-correct from that bias when we have the choice. “I have already noticed that quite a few players who have selected a beautiful woman as a match . . . went instead for a bearded male target [as a kill],” says Harik. “It would be interesting to see if this bias would be a trend.”
All Photos: via Joanne Harik