Our phones rarely leave our sides. They travel from pocket to hand to tabletop to bed, all the while measuring our movements, tracking how we touch them, and listening to our voices.
One example maps a device's location data, with each vertical line representing a single location ping and the width of that line corresponding to the accuracy of the reading. The result is a constantly scrolling, brightly lit mosaic of blues, magentas, and oranges. In another GIF, Albrecht takes data captured by a device's microphone and maps it by frequency over time, creating a visualization with the grainy quality of an X-ray and the aura of a far older sensing technology.
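The location mapping described above can be sketched roughly in code. This is an illustrative reconstruction, not Albrecht's actual implementation; the function name, accuracy range, and pixel scale are all assumptions made for the sake of the example.

```python
# A hypothetical sketch of the location visualization: each ping becomes a
# vertical line whose horizontal position reflects its timestamp and whose
# width scales with the reported accuracy radius.

def pings_to_lines(pings, max_accuracy_m=100.0, max_width_px=20.0):
    """Map (timestamp, accuracy_in_meters) pings to drawable vertical lines.

    A less accurate fix (larger radius) yields a wider line, mirroring how
    the visualization encodes uncertainty as thickness.
    """
    lines = []
    for t, accuracy in pings:
        # Clamp accuracy to the expected range, then scale to pixel width.
        frac = min(accuracy, max_accuracy_m) / max_accuracy_m
        lines.append({"x": t, "width_px": round(frac * max_width_px, 1)})
    return lines

# Hypothetical pings: (seconds since start, accuracy in meters).
sample = [(0, 5.0), (1, 65.0), (2, 100.0)]
print(pings_to_lines(sample))
```

Rendering those line records with any drawing library would produce the scrolling barcode-like mosaic the article describes, with blur standing in for GPS uncertainty.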
The visualizations, which were part of the recent exhibition Machine Experience at Harvard, also translate information from the microphone, gyroscope, motion sensor, and touch screen into striking GIFs. For Albrecht, who works as a researcher at Harvard's Metalab, they are a representation of how our machines see us, and to the artist, that vision is unsettlingly monotone.
“Seeing, hearing, and touching, for humans, are qualitatively different experiences of the world; they lead to a wide variety of understandings, emotions, and beliefs,” Albrecht writes on the project’s website. “For the machine, these senses are very much the same, reducible to strings of numbers with a limited range of actual possibilities.”
Ultimately, we must remember that machines understand the world in binary, and that the more we mediate human experience through that binary, the more limited it may become.