A woman on Twitter is abused every 30 seconds

And it’s even worse for women of color.


That shocking statistic comes from a study conducted by Amnesty International and AI software startup Element AI. In the study, called Troll Patrol, Amnesty International and Element AI looked at data from 288,000 tweets sent to 778 female politicians and journalists in the U.S. and U.K. in 2017. Using machine learning on the data, the group then extrapolated just how wide-ranging abuse toward women is on Twitter. The result: 1.1 million abusive or problematic tweets were sent to the women in the study during the year, or one abusive or problematic tweet every 30 seconds.

And it's even worse for women of color, especially black women, who were targeted more frequently than white women. As Milena Marin, senior advisor for tactical research at Amnesty International, explained in a blog post:

“Troll Patrol means we have the data to back up what women have long been telling us – that Twitter is a place where racism, misogyny and homophobia are allowed to flourish basically unchecked. We found that, although abuse is targeted at women across the political spectrum, women of color were much more likely to be impacted, and black women are disproportionately targeted. Twitter’s failure to crack down on this problem means it is contributing to the silencing of already marginalized voices.”

And it didn't matter what side of the political spectrum women sat on: the female journalists included in the study all faced similar levels of abuse despite writing for ideologically opposed publications, including the Daily Mail, the New York Times, the Guardian, the Sun, Gal-dem, PinkNews, and Breitbart. Other key findings from the study:

  • Black women were disproportionately targeted, being 84% more likely than white women to be mentioned in abusive or problematic tweets. One in ten tweets mentioning black women was abusive or problematic, compared to one in fifteen for white women.
  • 7.1% of tweets sent to the women in the study were problematic or abusive. This amounts to 1.1 million tweets mentioning 778 women across the year, or one every 30 seconds.
  • Women of color (black, Asian, Latinx, and mixed-race women) were 34% more likely to be mentioned in abusive or problematic tweets than white women.
  • Online abuse against women cuts across the political spectrum. Politicians and journalists faced similar levels of online abuse, and liberals and conservatives alike, as well as left- and right-leaning media organizations, were affected.

We’ve reached out to Twitter for comment about Amnesty’s report and will update this post when we hear back.

Update: We’ve been provided with some extracts from Vijaya Gadde’s response to Amnesty on December 12. Gadde is Twitter’s Legal, Policy and Trust & Safety Global Lead. Gadde highlighted how Twitter monitors and handles abusive tweets deemed to be in violation of its abusive behavior policy. Other extracts from Gadde’s response to Amnesty:

“Twitter has publicly committed to improving the collective health, openness, and civility of public conversation on our service. Twitter’s health is measured by how we help encourage more healthy debate, conversations, and critical thinking. Conversely, abuse, malicious automation, and manipulation detract from the health of Twitter. We are committed to holding ourselves publicly accountable towards progress in this regard…

“With regard to your forthcoming report, I would note that the concept of “problematic” content for the purposes of classifying content is one that warrants further discussion. It is unclear how you have defined or categorized such content, or if you are suggesting it should be removed from Twitter. We work hard to build globally enforceable rules and have begun consulting the public as part of the process — a new approach within the industry…

“As numerous civil society groups have highlighted, it is important for companies to carefully define the scope of their policies for purposes of users being clear what content is and is not permitted. We would welcome further discussion about how you have defined “problematic” as part of this research in accordance with the need to protect free expression and ensure policies are clearly and narrowly drafted.”

About the author

Michael Grothaus is a novelist, journalist, and former screenwriter. His debut novel EPIPHANY JONES is out now from Orenda Books. You can read more about him at