For all the heightened concern over deepfakes being used to manipulate elections and sow chaos in countries around the world, the vast majority of them are actually much more juvenile in nature—mapping the faces of female celebrities onto the bodies of porn stars (which shouldn’t surprise anyone familiar with the history of the internet).
In our recent story on full body deepfakes, an Amsterdam-based company called Deep Trace Labs, which identifies such AI-manipulated content, hinted at its extensive research into the synthetic media landscape. Today, the company published a much-anticipated report, The State of Deepfakes: Landscape, Threats, and Impact, in which it explores the deepfakes prevalent on websites, forums, and mobile apps.
As Deep Trace Labs’ Giorgio Patrini explains, the research began when the company formed in 2017 and involved a comprehensive examination of websites, forums, and services. Researchers combed through posts on forums and in deepfake creation communities, identifying obscure and niche corners of the ecosystem to build a full picture of the deepfake threat landscape. Deep Trace collected data from these websites, channels, forums, and communities via public APIs and ad hoc tools it developed internally, and also examined websites and YouTube channels where only some of the content was likely to be deepfakes.
Deep Trace found 14,678 deepfake videos currently online, 96% of which are pornographic. Most of those map the faces of famous actresses onto the bodies of porn performers. Indeed, most pornographic deepfakes target women, while most non-pornographic ones feature men. Over 90% of deepfake YouTube videos featured Western subjects, from actresses and musicians to politicians and corporate figures. But Patrini emphasizes that this is not just a Western phenomenon.
“Non-Western subjects featured in almost a third of videos on deepfake pornography websites, with South Korean K-pop singers making up a quarter of the subjects targeted,” says Patrini. “This indicates that deepfake pornography is an increasingly global phenomenon.”
“Our data showed that the majority were actually still of actors, but with a notable minority of corporate and political leaders,” says Patrini. “I think this can be attributed to the mechanism of virality that is associated with non-pornographic deepfakes. The creators are primarily hobbyists who are trying to create high-quality fakes. Choosing well-known figures such as Elon Musk, Donald Trump, or Nicolas Cage means your deepfake is more likely to be viewed and also adds a comedic element (i.e., Nicolas Cage’s distinctive visage on Lois Lane’s body in Superman).”
Over the past few years, there has been an explosion of papers on Generative Adversarial Networks (GANs), which pit two neural networks against each other, a generator that synthesizes deepfake images or video and a discriminator that scrutinizes their quality, in a feedback loop that runs until the final product is convincingly refined, according to the report. GANs are by no means the only technique behind deepfakes, but Patrini says they are certainly among “the most popular and effective generative methods powered by deep learning.”
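The adversarial feedback loop described above can be illustrated with a deliberately tiny sketch, far removed from any real deepfake pipeline: a one-dimensional "generator" (a linear map on noise) and a logistic "discriminator" take alternating gradient steps against each other until the generated samples resemble the real data. All names and parameters here are illustrative, not from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# "Real" data for this toy problem: samples from a Gaussian centred at 4.
def real_batch(n):
    return rng.normal(4.0, 0.5, n)

# Generator: turns noise z ~ N(0, 1) into a sample x = a*z + b.
a, b = 1.0, 0.0
# Discriminator: logistic classifier d(x) = sigmoid(w*x + c).
w, c = 0.1, 0.0

lr, n = 0.05, 64
for step in range(1000):
    # Discriminator updates: push d(real) toward 1 and d(fake) toward 0.
    for _ in range(5):
        z = rng.normal(0.0, 1.0, n)
        x_real, x_fake = real_batch(n), a * z + b
        d_real = sigmoid(w * x_real + c)
        d_fake = sigmoid(w * x_fake + c)
        w += lr * (np.mean((1 - d_real) * x_real) - np.mean(d_fake * x_fake))
        c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator update: push d(fake) toward 1 (non-saturating GAN loss).
    z = rng.normal(0.0, 1.0, n)
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

# Generated samples should have drifted toward the real data's mean.
fakes = a * rng.normal(0.0, 1.0, 1000) + b
print("mean of generated samples:", round(float(np.mean(fakes)), 2))
```

Real deepfake generators and discriminators are deep convolutional networks operating on images rather than scalars, but the division of labour is the same: one network fabricates, the other critiques, and each improves by exploiting the other's weaknesses.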
Deep Trace’s research suggests that the noticeable growth in the volume of GAN publications can only be indirectly correlated with the rise in deepfakes; the company could not establish a direct causal link between more GAN research and more deepfakes.
“More people, and not just PhD researchers, are able to experiment with and come up with novel algorithmic variations, [and] we can indirectly measure this by the growth in papers,” says Patrini. “Publication of more experimental work and extension to new applications is indicative of these ideas potentially being transferred into reusable code and more reliable and efficient tools that can be used by nonexperts.”
Deep Trace also noted that deepfake creation communities are growing. GitHub, 4chan, 8chan, and other forum-based websites all share open-source deepfake code.
“A key point we noted in the report is that certain communities had very different motivations and activity compared to others,” Patrini explains. “If we are looking at negatives, it was clear that some communities and forums were primarily focused on using deepfake creation tools for non-consensual deepfake pornography.”
Some of these communities, Patrini says, are well known for hosting fringe content that is illegal or borderline, like 4chan and 8chan. The commodification of deepfake creation tools on these platforms will likely lead to the technology perpetuating harmful and malicious use cases, like cyberbullying or political propaganda.
But Patrini is quick to note that other platforms are populated by hobbyists and other individuals who appear more interested in making deepfakes in the vein of deepfake YouTubers like Ctrl Shift Face, who famously morphed Tom Cruise’s and Al Pacino’s faces onto comedian and impressionist Bill Hader.
“I wouldn’t say these uses are explicitly positive, but they certainly aren’t malicious at face value,” says Patrini. And because deepfakes are not just a Western phenomenon, Deep Trace believes it will require global action to counter malicious uses.
“There is significant opportunity in deepfakes for businesses, as observed by the number of different tools and websites that have become available,” says Patrini, noting that this feeds into the commodification and growing accessibility of the tools. “And the idea of deepfakes alone is enough to destabilize political processes.”