Two months after a lone gunman live-streamed a mass shooting at two mosques in the city of Christchurch, New Zealand, in which 51 people were murdered, Facebook has announced a strict new policy aimed at fighting live-streaming abuses on the platform.
Starting today, any Facebook user who violates Facebook’s most serious community standards will be automatically barred from using Facebook Live for a set period of time. As Facebook explained in a blog post:
Today we are tightening the rules that apply specifically to Live. We will now apply a “one strike” policy to Live in connection with a broader range of offenses. From now on, anyone who violates our most serious policies will be restricted from using Live for set periods of time–for example 30 days–starting on their first offense. For instance, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time.
You’ll notice that the one-strike policy applies “for a set period of time”; it’s not a permanent Facebook Live ban. As Facebook says, the length of the Live bans will also vary. Facebook gives 30 days as an example, but bans could presumably be much shorter, or much longer, with the severity of the offense likely dictating how long the initial ban lasts.
Facebook unveiled the “one strike” policy yesterday, just a day before world leaders are set to meet in Paris to discuss the “Christchurch Call,” a voluntary framework under which countries commit to specific measures to prevent terrorist content from being uploaded online.
“Facebook’s decision to put limits on live-streaming is a good first step to restrict the application being used as a tool for terrorists, and shows the Christchurch Call is being acted on,” New Zealand Prime Minister Jacinda Ardern said in a statement emailed by her spokesperson, Reuters reports.