
On social media, child sexual abuse material spreads faster than it can be taken down

Child sexual abuse material runs rampant on the internet, propagated through popular social media platforms like Facebook despite attempts to crack down on its spread. We need scalable technology to address the problem.

[Source Photos: Kat J/Unsplash and Petro Bevz/iStock]

By Glen Pounder and Rasty Turek

Many of the sites and platforms that have done so much to democratize free expression around the world have also unfortunately spurred a rise in harmful and illegal content online, including child sexual abuse material (CSAM).

The internet did not create CSAM, but it has provided offenders with increased opportunity to access, possess, and trade child sexual abuse images and videos, often anonymously, and at scale. The National Center for Missing & Exploited Children (NCMEC) has seen a 15,000% increase in abuse files reported in the last 15 years. At the same time, a report from Facebook to NCMEC in fall 2020 found that only six videos accounted for more than half of reported CSAM content across Facebook and Instagram—indicating that vast networks of individuals are relentlessly sharing pre-existing CSAM.

While these criminal transactions were once confined to the darkest reaches of the web, the advent of social media platforms has unwittingly provided an efficient distribution pipeline. As a result, platforms and law enforcement agencies have struggled to contain the seemingly endless streams of CSAM. Google, Dropbox, Microsoft, Snapchat, TikTok, Twitter, and Verizon Media reported over 900,000 instances on their platforms, while Facebook reported that it removed nearly 5.4 million pieces of content related to child sexual abuse in the fourth quarter of 2020.

Facebook noted that more than 90% of the reported CSAM content on its platforms was the “same as or visibly similar to previously reported content,” which is the crux of the problem. Once a piece of CSAM content is uploaded, it spreads like wildfire, with each subsequent incident requiring its own report and its own individual action by authorities and platforms. It’s akin to an endless, unwinnable game of Whac-A-Mole, further complicated by offenders editing and distorting photos and videos to evade detection.

For victims of abuse, the impact is devastating. While these harmful images and videos are often the only inculpatory evidence of victims’ exploitation and abuse, rampant sharing causes revictimization each time the image of their abuse is viewed. In a 2017 survey led by the Canadian Centre for Child Protection, 67% of abuse survivors said the distribution of their images impacts them differently than the hands-on abuse they suffered; the distribution never ends, and the images are permanent.

Some offenders may never “spiral” to consuming more child abuse material in unregulated online spaces if they never access it on the major social platforms in the first place. Earlier this year, Facebook launched a feature to prevent users from searching for CSAM content on its platform. Surely, if users cannot find this illegal content, then it will be harder to spread. But a more surefire way to solve this problem is to prevent the uploading of CSAM in the first place. Digital platforms and law enforcement need technology that can identify all versions of a video, despite distortions, and do it in seconds, prior to the content being published online.
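The matching technology described above generally relies on perceptual fingerprinting: instead of an exact checksum, a piece of media is reduced to a compact signature that changes little under re-encoding, cropping, or brightness shifts, so edited copies still match a database of known material. As a minimal, hypothetical sketch of the idea (the toy 8×8 pixel grids and thresholds below are illustrative stand-ins, not how Pex, PhotoDNA, or any production system actually computes fingerprints):

```python
# Toy sketch of a perceptual "average hash": 1 bit per pixel of a small
# downscaled grayscale frame, set by whether the pixel is above the mean.
# Real systems (e.g. video fingerprinting) are far more sophisticated,
# but the matching principle is the same: small hash distance = likely match.

def average_hash(pixels):
    """Hash a small grayscale image as a list of bits (above/below the mean)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same content."""
    return sum(a != b for a, b in zip(h1, h2))

# Hypothetical 8x8 "frames" standing in for downscaled video stills.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
# A lightly distorted copy, as if re-encoded or brightness-shifted on upload.
distorted = [[min(255, p + 3) for p in row] for row in original]
# A completely different image.
unrelated = [[255 - (r * 8 + c) * 4 for c in range(8)] for r in range(8)]

d_same = hamming_distance(average_hash(original), average_hash(distorted))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # prints "0 64": the distorted copy still matches exactly
```

Because the signature survives minor edits, a platform can compare every upload against a list of fingerprints of known illegal content before the file ever goes live, which is exactly the at-upload screening the paragraph above calls for.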

Fortunately, solutions exist today to help tackle this problem and similar surrounding issues. Our organizations, Pex and Child Rescue Coalition, partnered earlier this year to successfully test Pex’s technology, typically used for copyright management and licensing, to identify and flag CSAM content at the point of upload. Other companies—including Kinzen, which is utilizing machine learning to protect online communities from disinformation and dangerous content, and Crisp, which offers a solution to protect children and teenagers from child exploitation groups online—are also aiding in the fight to create a safer internet.

Solutions like these help to ensure that victims of child abuse find care, community, and closure on social media and break the cycle of trauma. But as we continue building technologies that connect us to others, we must also prioritize keeping those communities safe. That’s the internet we all deserve.


Glen Pounder serves as the Director of Programs of the Child Rescue Coalition (CRC), a nonprofit organization devoted to curbing abusive material online. Rasty Turek is the Founder and CEO at Pex, the trusted global leader in digital rights technology, enabling the first real-time marketplace for copyrighted content.
