During that harrowing week in July when two black men were killed by police, the nation erupted into protests, and a sniper assassinated five Dallas police officers, the events were covered by digital, social, print, and TV media. But the form of documentation people will remember most is the livestreaming: the Facebook Live video of Philando Castile’s death, and the chaotic scene at the Dallas protests captured on video.
“It is incredibly powerful to tune in to a livestream that had 80,000, 90,000 viewers, and to watch a scene in real time through someone’s eyes,” says Gordon Mangum, who focused on livestreaming as part of his graduate studies at MIT’s Center for Civic Media, where he graduated in June. But as useful and exciting as livestreaming is for raw, on-the-scene reportage, it also lacks context, which can make it confusing or easily misconstrued. DeepStream, a tool Mangum developed along with two colleagues at MIT, aims to solve that problem with a platform that allows users to embed information about a live video alongside it as it plays.
DeepStream is like a cross between annotation sites like Genius and the Twitter sidebars embedded into some websites. Users log into the platform, now in beta, and can search for livestreams or videos from a variety of platforms, including YouTube Live, Facebook Live, Twitch, and Meerkat (the last of which requires a login). Choosing a video loads it in the browser; users can then add news stories, original text, tweets, and SoundCloud files. When selected, the information is added to so-called “context cards,” which stack up just to the right of the video as it plays. If you are watching a video of a Black Lives Matter protest, for example, one context card might include a link to an article defining the movement, and another a tweet from an activist. Taken together, the cards provide context for what is happening in the video.
With DeepStream, livestreams can still be used to break news, but online publications will be able to add information alongside the video stream embedded on their sites. Beyond online media, Mangum says it could also be used on TV; broadcast journalists could show the streaming video alongside corresponding tweets and additional news as it comes in during the program. DeepStream has already partnered with several groups to develop use cases, such as Witness, an organization that trains and supports people using video to expose human rights violations. At the Rio Olympics, Witness used DeepStream to document favela residents who resisted eviction being removed from their homes, using a mix of videos from its own videographers on the ground and livestreams found online to tell stories in real time.
The website Fusion is another test case for DeepStream. The site used it internally during a Democratic presidential debate last spring, says Mangum, allowing journalists to add tweets and comments to the livestream while they watched remotely with colleagues. Mangum says that while he and his team developed the tool for journalists, they can also see it being useful for educational videos, live sports, and concerts.
DeepStream is still in public beta, and Mangum has no hard date for when the product will be fully finished. For now, the team is still iterating on the design, experimenting with the ratio of the context-card window to the video, and optimizing for mobile. In the meantime, they hope to get as many individuals and organizations using it as possible, to explore ways of using it they haven’t even thought of.
“I think we’re entering into a new phase of how people share information with each other,” Mangum says. “And while [livestreaming] is a powerful experience, it doesn’t capture some really important aspects of story. We wanted to create a tool to tell richer and more complex stories, and at the same time connect with what’s happening all around the world.”