
In This Immersive “DataVRse,” You Can Explore Big Data Using Your Senses

What if you could physically interact with all the data that we consume every day?

Neo Mohsenvand thinks part of the key to understanding big data might be making better use of the human mind, and not relying as much on algorithms. Mohsenvand, a research assistant at MIT Media Lab, is building a machine that he has branded the “DataVRse”: a physical space where people can strap on a VR headset and use their own senses to explore data and find patterns.


“What we are doing is basically we are synthesizing new environments which are basically data-driven,” he says. “You step in a room while wearing your virtual reality goggles, and you’re in a new environment that is constructed from the data. You get to develop some sort of gut feeling for what is normal in that environment.”

Inside the DataVRse, for example, medical researchers might explore patient data through sound, taking advantage of how humans localize sound arriving at each ear.

“Just the pattern of me walking in the data, without me even knowing, it’s solving a very hard machine learning problem,” he says. “Because I always tend to face the source of the sound . . . Those paths that we take, by walking in data, it turns out that they efficiently cluster and partition the data.”
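How could simply walking toward sounds cluster data? A minimal sketch of the underlying geometry (a hypothetical illustration, not a description of Mohsenvand’s actual system; the embedding and waypoints below are made up) is to assign each data point, embedded somewhere in the room, to the nearest waypoint along the visitor’s recorded route, so the path itself partitions the data:

```python
# Sketch: treat a visitor's recorded path through a data space as an
# implicit clustering signal. Each embedded record is assigned to the
# nearest waypoint on the walked path, so the path's Voronoi cells
# partition the data.
import numpy as np

def partition_by_path(points: np.ndarray, path: np.ndarray) -> np.ndarray:
    """points: (n, 3) records embedded in the room; path: (m, 3) waypoints.
    Returns an (n,) array of labels, one cluster per waypoint."""
    # Distance from every point to every waypoint, then pick the closest.
    dists = np.linalg.norm(points[:, None, :] - path[None, :, :], axis=-1)
    return dists.argmin(axis=1)

# Hypothetical usage: 500 embedded records, a six-waypoint stroll.
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))
path = rng.normal(size=(6, 3))
labels = partition_by_path(points, path)
print(np.bincount(labels))  # how many records fall near each waypoint
```

A real system would presumably also weight points by gaze direction or by how long the visitor lingered, but the nearest-waypoint rule captures the core claim: the route implies a partition of the data.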

In an analysis of Trump followers on Twitter (red) and Clinton followers (blue), researchers could clearly see the bubbles each lived in. Journalists (yellow) had almost no overlap with Trump followers.

Researchers could, for instance, encode each patient’s heartbeat as a sound, with the sound’s location in the virtual space corresponding to differences in that patient’s genome.

“If I’m a heart surgeon, by just walking in that space, in that direction that I feel something abnormal is coming from that direction, I can identify a whole system of genes that take part in that disease,” he says. “So removing a lot of UI and analytics and statistics and mathematics that is involved in data analysis, and basically using the human brain in the right way, we can use our own faculties to process large amounts of data very quickly.”
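The heartbeat-and-genome mapping can also be sketched in code. The snippet below is a hypothetical illustration, assuming genome features arrive as plain numeric vectors: each patient’s sound source is placed at coordinates derived from a PCA of their genome, and a crude left/right panning cue is computed from the listener’s facing direction, the kind of binaural signal a real system would render with proper spatial audio:

```python
# Sketch (hypothetical data and names throughout): genetically similar
# patients end up as nearby sound sources in the room.
import numpy as np

def genome_to_positions(genomes: np.ndarray) -> np.ndarray:
    """Project (n_patients, n_features) genome vectors to 3-D room
    coordinates using a plain PCA via SVD."""
    centered = genomes - genomes.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:3].T  # first three principal components

def stereo_gains(source_xyz, listener_xyz, facing_xyz):
    """Crude binaural cue: pan a source left/right by its direction
    relative to where the listener is facing (assumes z is 'up' and
    the listener stands roughly upright)."""
    to_src = source_xyz - listener_xyz
    to_src = to_src / np.linalg.norm(to_src)
    facing = facing_xyz / np.linalg.norm(facing_xyz)
    right = np.cross(facing, np.array([0.0, 0.0, 1.0]))  # listener's right
    pan = float(np.dot(to_src, right))           # -1 (left) .. +1 (right)
    return (1.0 - pan) / 2.0, (1.0 + pan) / 2.0  # (left_gain, right_gain)

# Hypothetical usage: 8 patients, 100 genome features each.
rng = np.random.default_rng(1)
positions = genome_to_positions(rng.normal(size=(8, 100)))
left, right = stereo_gains(positions[0], np.zeros(3), np.array([1.0, 0, 0]))
print(left, right)
```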

It’s a much richer medium for data than basic visualizations or other current media, which can quickly become cluttered and overwhelming. “One of the problems with the future world is that the amount of information is growing exponentially,” says Mohsenvand. “So it is getting harder and harder to understand what we actually have in terms of information. All of the previous tools that we had–simple visual tools, writing, symbols–they’re not really sufficient and efficient to help us understand what’s going on.”


As the researchers build the new system, they’re beginning with a focus on data about food, for several reasons, both abstract and practical. The food chain is a fundamental organizing principle of the world. Food helps us defeat entropy. Food is likely part of what led human ancestors to learn to collaborate; cooking food made us human. Food data–from chemical composition to consumption patterns–is highly complex, making it ripe for analysis.

On a practical level, studying food is a way to understand, for example, how people make decisions. “Every time I want to eat, I have to make decisions which are basically an immersive physicalization of data,” he says. “We are hoping to find very interesting patterns about the dynamics of society, the dynamics of economy.”

To look at health issues, the researchers want to study the large-scale patterns about food that can be found in tweets, Instagram posts, and other social media, using that data to untangle the complex systems that govern why people choose to eat particular foods.

They’re also monitoring news about food–for example, how the news linking bacon to cancer affected bacon sales. While sales temporarily dropped, they went back up shortly afterward, in a pattern the researchers say looked similar to addiction: seeing news about bacon may have triggered people to crave it more.

The system is still under development, and the researchers hope to release it in the second half of 2017 or in 2018. They hope to build a deep learning engine inside the system that lets users make virtual tools, on the fly, filled with data-driven intelligence. “We call the idea ‘embodied machine learning,’” Mohsenvand says.

The next step, they say, is to optimize the DataVRse for people who are completely unfamiliar with the data they’re experiencing. “This is not really a tool for highly specialized people,” says Mohsenvand. “It’s a tool for very ordinary people. We actually imagine in the future we can have jobs that use immersive tools, because these immersive tools don’t need any education. They just need a person to use their simple vision and auditory and haptic skills–the things we are all equipped with, that we’ve evolved over tens of millions of years.”

About the author

Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.
