Now imagine if AI weren’t just in the background, but instead acted as an equal collaborator in the design process.
To explore this idea, Toronto-based design research company Radical Norms worked with a computer vision algorithm, powered by a Google API, to design a chair. For the project, called 100% Chair, the designers attached two-dimensional silhouettes of chair parts to a rotating machine. A camera watched the rotations and a computer vision algorithm assessed the percentage probability that it was looking at a chair.
The camera took screenshots of each composition and tagged each one with the percentage of chair the computer saw: one image might be only 6% chair, while another might be 42% and a third 96%.
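That capture-classify-tag loop could be sketched roughly like this. This is a minimal sketch, not the team’s actual code: the `classify_chair` stub stands in for the real Google vision API call (whose client setup and credentials are omitted), and the frame filenames are invented for illustration.

```python
import random

def classify_chair(frame):
    """Stand-in for the real computer vision API call: returns the
    model's confidence (0.0-1.0) that the frame contains a chair.
    A deterministic stub so the sketch runs without credentials."""
    random.seed(frame)  # seed on the filename so each frame gets a stable score
    return random.random()

def tag_frames(frames):
    """Tag each captured frame with the percentage of 'chair' the
    classifier sees, mirroring how the project labeled its
    screenshots (e.g. '78.2% chair')."""
    tags = {}
    for frame in frames:
        score = classify_chair(frame)
        tags[frame] = f"{score * 100:.1f}% chair"
    return tags

# One tag per rotation snapshot of the silhouette rig (hypothetical filenames)
print(tag_frames(["rotation_001.png", "rotation_002.png"]))
```

In the real setup, each tagged screenshot then became a design brief for a physical prototype.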
The designers then used these screenshots almost like design briefs, with the goal of designing a physical chair that would match the percentage in the image. One of their prototypes looks something like a stool, with a long spindly black leg jutting out from a plywood seat; it’s 78.2% chair. Another, which the computer reads as 24.6% chair, sits very low to the ground, with one hefty wooden leg and one very tall, thin yellow one.
This wasn’t just a wacky thought experiment. By turning AI into a design collaborator, Radical Norms hopes to demystify a sometimes obscure technology. “People tend to be afraid of these unknown processes, AI being one. We [think] mainly it’s because of a lack of understanding of how they operate,” says Daniel Daam-Rossi, a cofounder at Radical Norms who’s currently an artist-in-residence at the cultural institute Harbourfront Center in Toronto. “We said, let’s bring it to the forefront, and collaborate with it like we would another designer or another human person. It starts this dialogue and takes us on the road of actually understanding what these things are.”
For the designers, the project drove home that an AI is really just a pattern-detection machine built on a dataset, and how that dataset is categorized makes all the difference. For instance, when they first started playing around with the rotation machine and the computer-vision-enabled camera, the designers placed the machine against a white surface. But the algorithm couldn’t detect anything, because every image in the dataset was set against a real-world background. “It’s a complex process where [the outcome is] mixed with the bias of millions of people, with our own bias, which is influencing this moment in time where we’re deciding what is a chair and what is not,” says Koby Barhad, cofounder at Radical Norms and assistant professor in industrial design at OCAD University.
Ultimately, the designers have a more ambitious goal. In a not-too-distant future, robots will inhabit many of our spaces, including domestic ones. Today’s datasets are based on flat images of objects like chairs, but one day they’ll be able to incorporate views of chairs from all angles. According to the team’s third cofounder and OCAD University professor Angelika Seeschaaf Veres, Radical Norms wants to design chairs that will be recognizable in that environment. And there is a significant real-world advantage: The designers hope to create a small-scale production batch of AI chairs, all made of chair offcuts that would otherwise go to waste.
While the project is highly conceptual, it points toward a future where AI will be an integral part of the design of everyday objects, but it won’t be working in the background. Instead, the algorithms will guide the design process itself.