As a tech reporter at the Los Angeles Times, David Sarno found himself frustrated that newspaper stories only engage “one lousy sense,” as he puts it. That would be sight.
Why couldn’t they be as interactive and entertaining as a video game like Grand Theft Auto, where a player can walk around a virtual city, drive a car, walk into a store (and, yes, kill people), and essentially have some control over a re-created reality?
Even when the iPad came out in 2010 (an event that Sarno prolifically covered for The Times) and made print media more touchable, Sarno wasn’t impressed. “At that time, and still largely today, what news organizations and magazines are doing is reproducing the print version on the screen,” he tells Fast Company. “It’s like two steps better than scanning in the print version and putting it on the iPad screen.”
For the past year, as a John S. Knight Journalism Fellow at Stanford, Sarno has been working on technology that can vastly improve this scenario and, he hopes, bring news to life in a new, vibrant way. Through Lighthaus, his San Francisco-based startup, Sarno is applying video game design concepts to create touchable, interactive graphics for news stories as a way to help readers become more engaged and informed.
“The more you can give people a sense that you’ve created a reality that they can occupy and interact with, the more you’re going to engage their attention and their mind,” Sarno says. While news organizations like The New York Times are investing in multimedia platforms as a way to amplify their presentation of the news, Sarno wants to go further and create totally immersive information environments.
Here’s how it works: For a story about, say, hydraulic fracturing (or “fracking”), Sarno created a three-dimensional graphic that allows a reader to zoom in and rotate the layers of rock and shale destined for drilling, just by touching the screen. A few more taps and the reader can take a closer look at the equipment used to extract gases from shale and simulate the seismic testing that geologists do to see where to drill. Then, by dragging a finger along the screen, the user can even plot the path of the drill.
Looking at visual experiments like The New York Times’s Pulitzer Prize-winning “Snow Fall,” Sarno says, “I would call those lightly interactive, where what you as a reader do has a slight effect on triggering different kinds of media while you’re watching. As a reader you’re not putting in that much. You’re not controlling or influencing it that much. You’re maybe scrolling or clicking to activate a video or a cool, little animation.
“But that’s not taking it to this new level of interactivity that has been established and defined by the most sophisticated video games.”
The fracking video, which was Sarno’s final project at Stanford, cost him $5,000 of his own money. (He has yet to tap investors.) While not exorbitant, it’s not cheap, so Sarno foresees news organizations using Lighthaus’s technology a few times a year for important features rather than on a daily or weekly basis. Ultimately, the idea is to integrate the technology into a news site’s iPad app, though Sarno says there are technical hurdles preventing that at the moment.
Still, Lighthaus has already drummed up interest. The Dallas Morning News is working with Sarno to create its own fracking graphic, and Stanford Medicine Magazine has commissioned Lighthaus to create technology to explain the medical condition known as placenta accreta.
Sarno says his goal “is not to move away from journalism, but to explore the different fields that this makes sense for.”
“And I think that health care is a natural, because so much of it is taking place in 3-D reality, and there are complicated and interesting phenomena inside the human body that really lend themselves to being represented visually and in 3-D. Oftentimes, when we’re talking to our doctors about a medical condition that we may have, we can’t always understand the technical language they’re using, and maybe it’s a stressful moment when they’re explaining it to us. So having a 3-D representation of what’s going on can help patients understand what’s happening inside their own bodies and help doctors communicate important information about treatment,” Sarno says.
As for how an ink-stained wretch made the leap into technology, for Sarno it was fairly intuitive. As the son of a librarian and a software engineer, he says, “For my whole life I’ve always had the element of storytelling and technology intermingling.” Those threads merged in Sarno’s studies: He received an undergraduate degree from Yale in computer science and a graduate degree from the University of Iowa in creative writing. Working at the Times for six years honed his love of the news.
“At the Times I was writing stories about technology, and then I think at this point, doing business with Lighthaus, I’m now creating technology to tell stories. So there’s a little bit of symmetry there that is probably both unconscious and deliberate.”
[Image: Flickr user Dr. Wendy Longo]