“It’s really unsatisfying to have to design something inside a computer screen,” says Madeline Gannon, a designer and member of experimental collective Madlab.
This dissatisfaction is what motivated Gannon to explore computational design methods that extend beyond what we traditionally think of as computers. Her project Tactum, a collaboration with Autodesk Research, moves design for 3-D printed accessories off the screen and onto the body. Using projection mapping and Kinect depth cameras, people can adjust the designs directly on their skin with intuitive gestures.
To make sure Tactum would produce accessories that fit properly, Gannon created a 3-D model of an arm in the computer and set parameters for the elements of the design that were non-negotiable. For example, a watch band needs enough space for the watch face to sit, and the clips that attach the face must be placed precisely. These requirements are built into the program and can't be changed. What the user can do is shape the rest of the design, pinching it closer together or pulling it in whatever direction they like.
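The constraint logic Gannon describes, fixed requirements the user cannot break alongside free parameters they can reshape, can be sketched roughly like this (all names, gestures, and dimensions here are illustrative assumptions, not taken from Tactum itself):

```python
# Hypothetical sketch of constraint-based parametric design in the
# spirit of Tactum; names and values are illustrative assumptions.
from dataclasses import dataclass

# Non-negotiable requirements, baked into the "program":
WATCH_FACE_WIDTH_MM = 36.0        # the face must have room to sit (assumed value)
CLIP_POSITIONS_MM = (0.0, 36.0)   # clip anchor points, fixed exactly (assumed)

@dataclass
class BandDesign:
    band_width_mm: float       # free parameter: the user can pinch this
    band_thickness_mm: float   # free parameter: the user can pull this

    def pinch(self, amount_mm: float) -> "BandDesign":
        """Narrow the band by a pinch gesture, but never below the
        non-negotiable width needed to seat the watch face."""
        new_width = max(self.band_width_mm - amount_mm, WATCH_FACE_WIDTH_MM)
        return BandDesign(new_width, self.band_thickness_mm)

# A user pinches an oversized band far past the limit...
design = BandDesign(band_width_mm=44.0, band_thickness_mm=3.0)
pinched = design.pinch(amount_mm=20.0)
print(pinched.band_width_mm)  # clamped to 36.0: the face still fits
```

However freely the user gestures, the clamp guarantees the exported geometry still satisfies the fixed requirements, which is what lets the result go straight to the printer.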
“You give agency for the designer to create and agency for the design tool to create,” Gannon says. When the design process is done, the accessory is exported as a printable file that can be sent straight to a 3-D printer, with no further editing necessary. Thanks to the algorithm, whatever choices the designer makes, the piece will be both wearable and functional as a watch band.
Right now, Tactum is just a prototype, but it has big implications for the future of personalization in consumer products. While on-body design tools are a natural fit for fashion, they could also revolutionize the making of medical equipment like splints. Gannon believes that as sensors proliferate, this style of creation will become more common. “Once [sensors] become cheap and scalable, I think we’ll have a more plastic relationship between what’s happening in the physical and digital environments,” she says.