Under Instagram’s existing policy, accounts are disabled after a certain percentage of posted content violates its terms. The new policy goes further: Instagram will also remove accounts that rack up a certain number of violations within a set time frame—like if someone goes on a racist, homophobic, or violent Instagram rant—which is more in line with Facebook’s policy.
If that idea makes you worried that your edgy content will get flagged, you’re in luck: Instagram is also rolling out a new notification policy, in which it will warn users if their account is at risk of being disabled. This notification will also offer the opportunity to appeal the deletion of content, which is good news for anyone who had their, say, breastfeeding photo deleted or artwork tagged as pornographic.
According to Instagram, appeals will initially be available for content removed on the grounds of “nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism,” with more categories to follow in the coming months.
The news comes a few days after Instagram was in the headlines for all the wrong reasons when photos of a young woman’s brutally murdered body were posted on the site. The gruesome images were still available long after they were reported, as bad actors reuploaded them. Users resorted to trying to populate the hashtag and the victim’s handle with photos of pink clouds to drown out the gory images.
That horrifying situation underscores the challenges Instagram faces in pulling wildly offensive content even when the company is actively trying to do so. While Instagram isn’t saying whether the policy changes are a response to that incident, it’s a good time for the social media site to appear to be taking this issue seriously. This change could be a good first step.