3-D scanning is widely used to turn physical objects such as furniture into digital files. As its name suggests, Manhattan-based startup Body Labs wants to do the same with human bodies. But unlike a coffee table, your body changes, and a scan today won’t exactly match a scan made three weeks ago–which is exactly what Body Labs is counting on. By tracking your body’s subtle changes, Body Labs is exploring just one of its technology’s applications: building a happier, healthier you.
The company produces software that builds a digital body model from a 3-D scan taken by 3-D cameras. That can be an expensive proposition: The 3-D scanning rig shown in this video of Body Labs’ operations is an eight-camera model used by the U.S. Army that costs upwards of $30,000.
But Body Labs is developing software for scans made by the 3-D cameras coming out on smartphones, tablets, and laptops next year. And when people use their home devices to scan themselves and upload the results to BodyHub, Body Labs’ online portal, they can see for themselves how their bodies change from scan to scan.
“We will be among the biggest of big data companies,” says CEO and cofounder Bill O’Farrell. “Our long-term belief is that we will be the creator and curator of everyone’s body model. And we never want to charge for it.”
Like any biometric element, a body scan is data, and bulk data are valuable. Body Labs’ biggest customer (that it can talk about) is the U.S. Army, which has a nearly $1 million deal with the company to use 12,000 body scans (two-thirds of which are male, one-third female) in order to improve uniforms and body armor. Designers took the Body Labs scan data and created new patterns to improve fit and better spread weight–patterns fitting the maximum number of soldiers with a minimum number of sizes to save cost. The Army needs to know that up to the 99th percentile of its soldiers can properly wear its armored vests or get out of a tank hatch.
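Fitting the most soldiers with the fewest sizes is essentially an interval-covering problem. The greedy sketch below is a hypothetical illustration only–the measurements, the tolerance band, and the approach itself are assumptions for demonstration, not the Army’s or Body Labs’ actual method:

```python
def pick_sizes(measurements, tolerance=4.0):
    """Greedily choose the fewest size centers so every measurement
    falls within +/- tolerance of some size. All values are illustrative."""
    uncovered = sorted(measurements)
    sizes = []
    while uncovered:
        # Anchor a size so its band starts at the smallest uncovered measurement.
        base = uncovered[0]
        sizes.append(base + tolerance)  # size center covers [base, base + 2*tolerance]
        uncovered = [m for m in uncovered if m > base + 2 * tolerance]
    return sizes

# Hypothetical chest measurements in cm:
chests = [88, 90, 91, 95, 99, 104, 105, 112]
print(pick_sizes(chests))  # [92.0, 103.0, 116.0] -- three sizes cover all eight
```

For sorted intervals of equal width, this greedy strategy is optimal, which is one reason interval covering is a natural frame for sizing problems.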
“What we are doing for a lot of apparel companies is cutting down on the sample process. They still make paper patterns that they cut out and put on models. We can make a computer-aided design model instead,” says O’Farrell.
Even with a very trained eye, it’s time-consuming for fashion designers to test patterns on 10 or 100 bodies until they find the average human geometry, adds Body Labs cofounder and VP of product design, Dr. Eric Rachlin. And even then, they end up with averages that sit in the middle of the body range but don’t actually resemble real bodies: They just have representative models for size 4, size 6, and size 12.
“Who’s the size 12? Well, that’s a model. And someone will see that model and say, ‘Hey, wait, nobody looks like that,’” says Rachlin. “That’s just not how the body works.”
Body Labs’ scans are currently restricted to superficial tracking of the body. But with today’s–and tomorrow’s–3-D cameras, the company is developing the software to take scans from consumer devices and turn them into models for other companies to use. It’s made several demos showing how the technology can be used in fields such as digital animation, health, and fashion. Rachlin admits it’s tempting to consider developing software for each field the startup is eyeballing, but that would take Body Labs in a limiting direction–so it chose to release an API it calls the BodyKit for other companies to play with.
“We’d love to show the world what we’re capable of, but we’re not experts in style and clothing, for example,” says Rachlin. “It doesn’t make sense for us to do it when others can build it out.”
I met O’Farrell and Rachlin at a demo night back in February when they announced the BodyKit API and offered to scan guests with their giant Army scanner. What could I do but volunteer? (And then accede to ABC reporter Tina Trinh’s request to film the scan for a segment.) Weeks later, Body Labs loaded the scan into a MakerBot 3-D printer and printed up . . . me. A tiny 6-inch, roughly 1/12-scale version of me.
I brought Tiny Me to the Fast Company offices and, as expected, it (he?) was swarmed by my coworkers. Where could they get one? Is this really what I looked like? Did they scan me naked? They did not–I chose to scan in my underwear–but the crowd gawking at Tiny Me started to get uncomfortable.
This was not a carefully chosen triptych of photos for Tinder. This was not me flexing in a waist-high mirror selfie. When someone picks up Tiny Me, I have no control over how my body is viewed. I’m not framed from flattering angles or reshaped by stylish clothes. Like Hermione in Harry Potter and the Prisoner of Azkaban, I could not help obsessing over questions like “Is that how I really look from the back?”
I want to assure people that Tiny Me was scanned when I was weathering February by eating my feelings, and that I spent two weeks going to the gym before I demanded a second scan (you can see the comparison in the GIF below). I want to excuse how I look because, in a way, the honesty of a 3-D body scan is a nightmare for people with body anxiety–which is everyone, including me.
But there is something liberating about seeing your weird body in the objective, honest curves of a 3-D scan–and imagining how similarly everyone sees their own weird body. When we all get scanned and have our own detailed set of body dimensions, it will not matter that I keep failing to fit into old shirts: I will upload my stats to an online vendor and it will print me out a shirt that sits right on me and wrong on nearly everyone else. I will get scanned morning and night to see how my body shifts through the day. And finally we will all see that there are not size 0 women and medium men–there’s an infinite spectrum of weird, great bodies.
Rachlin was a graduate student in 2010 when he met professor Michael Black, who specializes in computer vision. O’Farrell, a graduate of Brown University, where Black taught, had started and sold several companies, including the Company of Science and Art, which created the After Effects special-effects software that is now sold by Adobe. Black brought O’Farrell back to meet Rachlin and Alex Weiss, the latter of whom would become the fourth cofounder and VP of software development.
Black was appointed to the Max Planck Institute of Intelligent Systems in Germany in 2010 with a $5 million grant to continue his research on 3-D scanning. With demand for 3-D body biometrics still nonexistent and work left to do on its software, Body Labs decided to remain in development, with teams in New York and Germany.
But since then, two things happened: An ecosystem emerged around implementing 3-D, and folks became obsessed with biometrics. Fitness trackers became a thing. In gaming and entertainment, Microsoft released the Kinect in 2010 and updated it in 2014; Facebook bought Oculus in March 2014. The market for 3-D scanning erupted overnight as Occipital’s Structure Sensor external iPad scanner blew through its Kickstarter goal in a day in September 2013, topping out at 1200% funded with $1.2 million raised. Now, Google’s Project Tango has leaped into the fray, and we should expect multiple-camera stereoscopic 3-D in our smartphones, tablets, and laptops as early as next year, O’Farrell says.
Consumer demand for both biometrics and 3-D capability has risen while the Body Labs team has been hunkered down to fine-tune its software–but now that the tech is feasible, is the company afraid of anyone catching up and competing with cheaper solutions?
“The scan is just a point cloud. While it’s something you could do with just any scanner, I think it’s complicated to take a scan and make it useful. We’re hopeful that between the technology we’ve developed, the training we’ve had, and the patents we’ve filed, we’ll keep our head start,” says O’Farrell.
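A point cloud really is just an unordered list of 3-D coordinates with no notion of anatomy. The sketch below illustrates that idea and one generic first step toward making such data useful–matching each vertex of a template mesh to its nearest scan point, as in iterative-closest-point-style registration. The data is random and this is a textbook illustration, not Body Labs’ patented pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# A raw scan: 5,000 unlabeled 3-D points -- no "elbow," no "waist," just geometry.
scan = rng.normal(size=(5000, 3))

# A (hypothetical) 100-vertex template body mesh to align against the scan.
template = rng.normal(size=(100, 3))

# For each template vertex, find the index of the nearest scan point.
# This nearest-neighbor step is the core of ICP-style registration.
dists = np.linalg.norm(scan[None, :, :] - template[:, None, :], axis=-1)
nearest = dists.argmin(axis=1)

print(nearest.shape)  # (100,) -- one matched scan point per template vertex
```

Real pipelines iterate this matching with pose and shape updates, which hints at why turning a raw scan into a usable body model is the hard part.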
Then there’s the security question: Is Body Labs afraid of getting hacked?
“We’re aware of the need to secure biometrics like you would any sensitive data. Could we be hacked? Sure. If the Pentagon can get hacked, sure,” says O’Farrell. “We’re hyperaware, but the value is just not the same as bank account data.”
“It’s sensitive, but are the body scans more revealing than Facebook data?” says Rachlin.
Body Labs is already hosting around 2,500 body scans on its BodyHub repository, which is currently restricted to Body Labs’ business-to-business contracts. But over time, that will change, says Rachlin, and the company hopes that developers working with its BodyKit API will build apps on top of the BodyHub portal. Ideally, an individual with a BodyHub account could go to a personal trainer, and the trainer could use the scan data to target specific parts of the body and track changes over time as the individual keeps uploading scans.
Gyms, health professionals, and fitness apps have a lot of potential to use Body Labs’ scan data (“Can you imagine when Equinox has it?” says O’Farrell), but the first field to harness its body scans will probably be clothing and consumer fashion, says Rachlin. Indeed, online clothing retailer Woodies has already built Body Labs scan tech into its website, allowing shoppers to fill in their own measurements to get a custom-made, fitted T-shirt. More retailers could follow within the next couple of months, says Rachlin.
There are lucrative avenues in entertainment, too. Rachlin believes that 3-D scanning will enhance character creation in video games, letting players scan themselves into the action. But more exciting is the possibility of animating those 3-D models. With accurate scans and software refined to reflect proper movement of muscles and parts, animators could animate body models directly instead of going through the laborious process of building skeletal wireframe models that work with a physics engine and then creating skins to put on top of the wireframes. While this would probably require countless hours creating new physics engines and software, it could be faster and cheaper than the current motion-capture process.
Body Labs has played around with rudimentary animation, taking existing body models scanned in and changing their poses. It’s difficult to get the bodies to behave like humans do. “It’s about getting motion and quantifying the jiggly bits of the body, so to speak,” says Rachlin. “The model’s going to get more sophisticated over time, and in the future, I definitely think that we’re going to be able to look inside and see the fat distribution and skeletal system.”
At the Max Planck Institute, Body Labs researchers were able to take a version of their body scanner into the gym and record how the body bounces as an indicator of health. Body fat and muscle are distributed in a particular way and should move in a particular way: If you punch a part of the stomach and it doesn’t move, that’s unhealthy, says Rachlin.
And then there’s health. Body Labs just released a consumer-facing demo product, a widget that lets people input their scans and calculate their body mass index (BMI), a common rough metric that relates weight to height.
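The calculation behind any such widget is the standard BMI formula–weight in kilograms divided by height in meters squared. A minimal sketch (the function name and inputs are illustrative, not Body Labs’ actual code):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / height_m ** 2

# Example: 70 kg at 1.75 m tall.
print(round(bmi(70, 1.75), 1))  # 22.9
```

The formula’s simplicity is exactly Rachlin’s complaint with it: two numbers can’t capture how that weight is shaped and distributed on a real body.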
“The reason people are obsessed with weight is that it’s something you can measure. We want something better for people to think about, and ideally better than the BMI,” says Rachlin. “We want people to think in terms of body shape and body geometry, its shapes and curves. The principal way in which body shapes vary is the kind of health tracking we had in mind.”
Similar to 23AndMe’s practice of selling anonymized chunks of biometric data to companies and researchers, Body Labs is considering selling anonymized sets of body scans. But at the moment, it doesn’t need to worry about being self-sustaining.
“It doesn’t make sense for us to be profitable. It makes sense for us to grow,” says Rachlin. “With emerging devices demonstrating the potential of the 3-D world, this is a good time to build more opportunities for consumers to make bodies.”