After a series of rough years, 2019 was yet another bad one for Facebook. From its toxic treatment of content moderators to the botched rollout of the cryptocurrency Libra to the much-derided policy to allow politicians to lie in ads, the social network has gone from scandal to scandal. It got so bad that Facebook even created a chatbot to give its employees PR-approved guidance on answering any tough questions they received from their families and friends during the holiday season.
And legislators were paying attention. Over the summer, the FTC fined Facebook a record-breaking $5 billion, the result of a years-long investigation into the company’s mishandling of consumer data. Facebook also faced a criminal investigation into its data-sharing practices with other large corporations. Top executives exited the company. As scrutiny increased, Facebook cofounder Chris Hughes—along with Democratic presidential candidate Elizabeth Warren—called the company a monopoly and urged that it be broken up.
It wasn’t all bad. Facebook also rolled out new products that have helped continue its dominance in social networking. It also moved toward a dramatically different vision of itself where all messages are encrypted and more communication happens in small groups. And even if some of its most recent gambles haven’t paid off yet, Facebook is working to diversify its business.
Here are Facebook’s five most wretched mistakes in 2019—and its five biggest wins.
Good: Launching an initiative to fight deepfakes
During the 2016 election, Facebook was home to an extensive Russian misinformation campaign—a mistake the company aims to avoid repeating in 2020. One part of that problem? The looming threat of deepfakes, which are algorithmically altered videos that can make it look like someone is saying something they didn’t (the most famous of which may be a deepfake showing Barack Obama calling President Trump a “dipshit”). With an eye toward this future, in 2019, Facebook launched an industry coalition that will fund the development of tools that can spot these altered videos. As part of this “Deepfake Detection Challenge,” it will also design a benchmark to evaluate these tools and hire actors to create thousands of videos on which to test the third-party deepfake detectors.
Bad: Allowing politicians to lie in ads
But while Facebook seems to recognize the problem inherent in a technology that allows anyone to create realistically altered videos, the company doesn’t seem to have any issues with politicians who present a completely altered version of reality in their targeted advertisements.
In 2019, CEO Mark Zuckerberg faced withering criticism over his decision to permit egregious lies in political ads, a stance he defended in a live-streamed speech at Georgetown University as an expression of Facebook’s commitment to free speech and democracy. Many argued that the problem with Zuckerberg’s stance is that politicians’ ads on Facebook aren’t simply a matter of free speech, since ads are paid speech that can be microtargeted to narrow slices of the population. In contrast, Google has eliminated microtargeting for political ads, and Twitter has banned political advertising altogether.
Good: Doubling down on Instagram Shopping
One positive development for Facebook this year was the launch of direct checkout for Instagram Shopping, a feature that makes it even easier to purchase whatever you see in the constant stream of ads on the photo-sharing platform. The feature formalized a way that users were already using Instagram—to discover stuff to buy, or to advertise and sell products—by letting them check out without ever leaving the app. The feature went into beta in spring 2019 with a handful of companies, including Nike, Warby Parker, H&M, and Prada. In December 2019, Instagram also launched shoppable videos, starting with a shoppable remake of Celine Dion’s 1996 classic “It’s All Coming Back to Me Now.”
According to Instagram product management lead Ashley Yuki, 130 million people engage with commerce-related posts on Instagram every month, and 80% of users follow a brand on the platform. Instagram Shopping is a prime example of Facebook’s general product strategy: see how people use a product organically, and then build dedicated tools to support them.
Bad: Rebranding Instagram and WhatsApp as ‘From Facebook’
While Instagram is going strong, Facebook has also made the mistake of trying to rebrand the app, adding “From Facebook” to the photo-sharing platform’s home screen. The same goes for WhatsApp, the company’s other flagship product, which many people don’t realize Facebook owns.
Facebook claims that the goal is to be clearer that it is the parent company of the two popular apps. But doing so will also saddle these apps with the company’s terrible public image. As Mordecai Holtz, the chief digital strategist of Blue Thread Marketing, wrote in Fast Company earlier this year: “Slapping a ‘from Facebook’ label onto Instagram is associating the burden of security issues, government oversight, and overall lack of cool with a social media platform that is already dealing with its own privacy breaches.”
As part of the same effort, the company gave itself an overarching corporate rebrand—”FACEBOOK,” in all caps—to differentiate the corporate giant from Facebook the product. It’s meant to be a confident statement about the company’s identity. But all in all, it just feels like a pointless design exercise that won’t stave off declining public trust.
Good-ish: Embracing encryption
WhatsApp has long been encrypted—and in 2019, Zuckerberg announced his vision to make the rest of Facebook’s messaging services, including Facebook Messenger, encrypted as well. The move is part of a general pivot toward private messaging and small-group conversation, rather than the semi-public newsfeed for which Facebook has been known for most of its existence.
When he announced this shift in March 2019, Zuckerberg framed it as the company transitioning to become privacy-first. That’s misleading: Facebook intends to bring encryption to more of its services, but has disclosed no plans to curtail the amount of data it collects about you. While encryption is generally a good thing when it comes to privacy and security, skeptics say that the transition might be more about avoiding responsibility for child pornography, misinformation on topics such as vaccination, and other horrifying content that is already passed around on the social network. In addition, Zuckerberg aims to integrate the encrypted backends of WhatsApp, Messenger, and Instagram Direct so users can send messages between all three platforms—a change that could also make it harder to break up the company, as critics are advocating.
Bad: Treating content moderators terribly
In the wake of 2016, Facebook started devoting a huge amount of resources to policing the content on its site. But in 2019, reports came out that exposed the shocking working conditions that human content moderators face when working for Facebook (or for a company that Facebook hires to do this dirty work). Several investigations by The Verge’s Casey Newton and others described a culture of fear, extremely low salaries, bed bug infestations, and a lack of mental health resources that led some workers to develop the symptoms of PTSD.
Ultimately, Facebook would love to completely automate its moderation process, so that it’s AI, not humans, that makes decisions about what stays on the site and what is removed. But in the meantime, the stories of content moderators—and Facebook’s decision not to treat them as full employees or provide adequate benefits—reveal the human collateral of the social network’s practices.
Good: Testing the removal of likes from Instagram
Despite Facebook’s lack of care about its content moderators, it has started to pay attention to the negative impacts of its interface on its users. In 2019, the company started testing the removal of likes from Instagram. That doesn’t mean you won’t know how many people liked your latest vacation shot—but everyone who scrolls past your post won’t instantly be jealous of how popular you are, because they won’t be able to see how many others liked it.
The change may also make it to the main Facebook platform. The company started testing it out after talking to mental health experts, but it may also help reduce the spread of misinformation.
Bad: Trying to put cameras and microphones in every home
In some cases, Facebook listens to the grievances of its users—like when it comes to how like counts make Instagram feel like a toxic popularity contest. But in others, it appears utterly tone-deaf. That has been the case with its Portal line of video-chat hardware, which expanded with the launch of Portal TV in September 2019. After years of data breaches and reports about how cavalier Facebook is with your personal information, the company decided it would be a good idea to sell a hardware product that puts a camera and microphone inside your home.
Facebook initially claimed that what Portal sees and hears wouldn’t be shared with advertisers—only data such as which Facebook Watch shows you stream through Portal and which users are making video calls. But it turns out the Portal is not as private as it might appear. A month after the company launched Portal TV, users discovered a vulnerability that let them access other people’s photos without their knowledge (Facebook has since removed this ability).
In general, Portal might not be working out too well for Facebook. In October, my colleague Mark Sullivan reported that sales of the devices, which the company first started selling in December 2018, are “very low,” according to supply chain sources.
Good: Betting on new types of businesses
While Portal might not be a strong business line thanks to Facebook’s lack of trustworthiness, the company is trying to diversify its business. Right now, 98.5% of Facebook’s revenue comes from ads, but it’s looking into new avenues that will either make it less dependent on advertising—or increase its ability to microtarget you.
That includes experimenting with augmented reality glasses, virtual reality, and even brain-computer interfaces. In 2019, Facebook acquired a company called CTRL-Labs, which claims its wristband can decode the neural signals your brain sends to your hand. But it also means other kinds of bets, like podcasts, travel apps, newsletter tools, and even email. According to the New York Times, Facebook has created a separate LLC to build some of these new products, so that the people creating them will be able to experiment more freely. Facebook might not ever become anyone’s choice for email, but the company’s dedication to trying new things could serve it well in the future.
Bad: Launching a cryptocurrency
Perhaps the wackiest of Facebook’s other bets? Libra, a global cryptocurrency that it announced plans to develop in 2019 after much speculation. The currency won’t be exclusively owned by Facebook—instead, it will be governed by the Switzerland-based Libra Association, a nonprofit that is supposed to be made up of 100 companies. But since Facebook declared its intentions, regulators have not been having any of it. While Zuckerberg won’t have ultimate control over Libra and says he will simply use it to provide an easier way for Facebook friends to swap cash, others—like Fed chair Jerome Powell, Financial Services Committee chair Maxine Waters, Democratic senator Sherrod Brown, and Treasury Secretary Steven Mnuchin—see it as a threat to the global financial system.
The scrutiny has already led to several of the big players that had signed up to be part of the Libra Association bowing out, including eBay, Stripe, Mastercard, and PayPal. Libra is still supposed to officially launch in 2020—maybe. How Facebook handles the launch after so much criticism might indicate what the company’s next decade could look like: a continued fight with regulators in a brash quest for world domination.