Instagram is making another change as it strives for a kinder, gentler version of itself. Yesterday, it rolled out a new policy that could mean a lot more problematic accounts will be disabled.
Under Instagram’s existing policy, accounts are disabled once a certain percentage of their posted content violates the platform’s terms. The new policy keeps that rule and adds another: Instagram will now also remove accounts that rack up a certain number of violations within a certain time frame—say, if someone goes on a racist, homophobic, or violent Instagram rant—bringing it more in line with Facebook’s policy.
If that idea makes you worried that your edgy content will get flagged, you’re in luck: Instagram is also rolling out a new notification policy, in which it will warn users if their account is at risk of being disabled. This notification will also offer the opportunity to appeal the deletion of content, which is good news for anyone who had their, say, breastfeeding photo deleted or artwork tagged as pornographic.
According to Instagram, appeals will initially be available for content deleted on the grounds of “nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism,” with the option expanding to more categories in the coming months.
That horrifying situation underscores the challenges Instagram faces in pulling wildly offensive content even when the company is actively trying to do so. While Instagram isn’t saying whether the policy changes are a response to that incident, it’s a good time for the social media site to appear to be taking this issue seriously. This change could be a good first step.