This story may sound familiar: A tech company, basking in the glow of viral user growth, becomes an investor darling. Over the span of just a few months, the company’s valuation doubles. More than that, it becomes a cultural phenomenon, beloved by users of all ages and flooded with examples of feel-good creativity.
Today, the problem platform is Zoom, but there’s a reason this narrative feels like déjà vu. It’s because we’ve seen it before with YouTube, Facebook, and every other Silicon Valley platform that has spent the past decade monetizing users while evading regulation. Moreover, we’ll see it with every platform that follows, unless we finally make policy changes that protect users and hold platforms accountable.
There were problems with Zoom well before our current moment. A lifetime ago, way back in 2015, Pennsylvania courts sent a man to prison for broadcasting his rape of a 6-year-old boy to viewers on Zoom. Federal prosecutor Austin Berry referred to Zoom as “the Netflix of child pornography” in his closing remarks, according to The New York Times.
Zoom says it has improved its ability to police such content. But over the last several weeks, as people self-isolating at home have flocked to the service, many are seeing their happy hours and discussion groups “Zoombombed” by users broadcasting graphic pornography.
For adults, such disruptions can be mostly a laughing matter. But for children—one of Zoom’s fastest-growing constituencies, as schools and daycares shut down—the ramifications are far different. Will teachers powering up the video service for the first time realize that the onus is on them to switch on Zoom’s troll-prevention settings?
It’s also possible that Zoom has been violating the privacy protections children are legally owed by sending data from its iOS app to third parties, including Facebook, as Vice previously reported. Zoom has since deleted the lines of code that sent data to Facebook, but during the weeks when schools were starting to experiment with Zoom, the company was passing along each user’s time zone and the identifier used by advertisers to target ads, among other information. (Mashable previously reported on Zoom’s policy regarding sharing personal data but did not name Facebook.)
After this story was published, news broke that the office of New York Attorney General Letitia James is looking into Zoom’s security and privacy practices. In addition, The Intercept reported that Zoom does not provide end-to-end encryption for its services, despite what its marketing materials say. Vice reported that the video conferencing company has leaked the personal emails and images of thousands of users.
The government has taken a few steps toward regulating tech platforms that overstep when it comes to children’s privacy. Last year, the Federal Trade Commission fined YouTube a record $170 million for allegedly violating child privacy laws. In some ways, it felt like a watershed moment for protecting children online. But in the aftermath of the fine, there has been both anger and confusion on the part of YouTube creators and parents. YouTube agreed, as part of its FTC settlement, to label child-specific videos and channels. The distinction may sound simple enough, but in practice, it is extraordinarily difficult to police and enforce. If Zoom decides to step up its privacy game, especially when it comes to child users, it may run into a similar quagmire.
For those worried about Zoom’s lax privacy policies, there are plenty of options when it comes to videoconferencing services, just as there are plenty of alternatives to YouTube. Entrepreneurs critical of Silicon Valley, like Basecamp cofounder David Heinemeier Hansson, have taken to Twitter to advocate for options that they argue are more privacy-friendly and independent of Big Tech, like Whereby, BlueJeans Network, and Lifesize. Zoom is “fundamentally corrupt,” Hansson says. Other privacy advocates see the recent criticism as a wake-up call for Zoom.
“Zoom really has no serious value if it doesn’t protect personal privacy,” Doc Searls, an author and research director at Harvard’s Berkman Klein Center for Internet & Society, wrote in a blog post. “That’s why they need to fix this.”
That’s the optimistic view. The pessimistic view is that as people scramble to get up and running with remote work and school, they don’t have time to parse corporate privacy policies. Nor do they necessarily have the leverage to force organizations that use Zoom’s service to adopt a more privacy-focused option.
Ultimately, Zoom functions better than most other options for connecting groups via video—and given people’s tendency to opt for ease and convenience over privacy, that’s likely all that will matter, unless policymakers decide otherwise.