
Zuck’s manifesto reveals how Facebook aims to solve one of its biggest problems

More than 16.4% of all the humans alive use Facebook every single day. That means, of course, that community standards for content—what's allowed, what's not—are painted in strokes too broad for some people and not broad enough for others. Some are not offended by nudity but hate violence; for others, it's the reverse.

In the 6,000-word manifesto Mark Zuckerberg posted today, he said the company plans to recast how this core Facebook issue is handled, putting more control over what individual users see into their own hands and trying to apply standards more locally.

“The guiding principles are that the community standards should reflect the cultural norms of our community, that each person should see as little objectionable content as possible, and each person should be able to share what they want while being told they cannot share something as little as possible.”

The idea is that users will periodically be asked what they do or don't want to see when it comes to nudity, violence, graphic content, profanity, and the like. Those who don't respond would default to standards set by the majority of users in their geographic region, and anyone will be able to update their personal settings at any time. Applying those preferences at scale depends on artificial intelligence, and Zuckerberg said the project will take years to complete—one he ultimately seems to want to serve as a template for tackling similarly large-scale problems on Facebook, such as fake news, civic engagement, and safety. "Our hope is that this model provides examples of how collective decision-making may work in other aspects of the global community."
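To make the mechanism concrete: a minimal sketch of the fallback logic described above might look like the following. This is purely illustrative—Facebook has not published an implementation, and every category name, data structure, and threshold here is an assumption.

```python
from collections import Counter

# Hypothetical content categories a user can opt in or out of seeing.
CATEGORIES = ["nudity", "violence", "graphic_content", "profanity"]

def regional_default(region_responses):
    """Derive a default policy for a region: allow a category only if a
    majority of respondents in that region chose to allow it."""
    defaults = {}
    for category in CATEGORIES:
        votes = Counter(resp.get(category, False) for resp in region_responses)
        defaults[category] = votes[True] > votes[False]
    return defaults

def effective_policy(user_prefs, region_responses):
    """A user's own answers take precedence; unanswered categories fall
    back to the regional majority, as the manifesto describes."""
    defaults = regional_default(region_responses)
    return {c: user_prefs.get(c, defaults[c]) for c in CATEGORIES}

# Example: a user who only answered the nudity question inherits the
# regional majority for everything else.
region = [
    {"nudity": False, "violence": False, "graphic_content": False, "profanity": True},
    {"nudity": True, "violence": False, "profanity": True},
    {"nudity": False, "violence": True, "graphic_content": False, "profanity": False},
]
print(effective_policy({"nudity": True}, region))
# -> {'nudity': True, 'violence': False, 'graphic_content': False, 'profanity': True}
```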

[Photo: Flickr user Alessio Jacona]