Bullying, racism, and other toxic language have been problems on Xbox Live for as long as the service has existed. Now, instead of just moderating bad behavior after the fact, Microsoft is taking a proactive approach by adding four levels of text filters to Xbox Live. Depending on their tolerance, users can set the filters to Friendly, Medium, Mature, or Unfiltered, and can choose different levels for different types of communication. You might, for instance, want to allow unfiltered messages from friends but apply stricter filtering to messages from strangers.
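Microsoft hasn't published how these settings work internally, but the scheme described above, ordered severity levels applied per sender category, can be sketched roughly. All names and thresholds here are hypothetical, purely to illustrate the idea:

```python
from enum import IntEnum

class FilterLevel(IntEnum):
    """Hypothetical ordering of Xbox Live's four filter levels,
    from most to least restrictive."""
    FRIENDLY = 0
    MEDIUM = 1
    MATURE = 2
    UNFILTERED = 3

# Illustrative per-sender-category settings, matching the example in the
# article: unfiltered messages from friends, aggressive filtering of strangers.
settings = {
    "friends": FilterLevel.UNFILTERED,
    "strangers": FilterLevel.FRIENDLY,
}

def should_filter(sender_category: str, message_severity: FilterLevel) -> bool:
    # Hide a message when its severity exceeds the tolerance level
    # the user chose for that sender category.
    return message_severity > settings[sender_category]
```

Under these assumed settings, a Mature-rated message from a stranger would be filtered (`should_filter("strangers", FilterLevel.MATURE)` is `True`), while the same message from a friend would pass through.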
For now, the filters are only available with private messages and only for members of the Xbox Insider testing program. Microsoft says it will roll filters out to all users in the fall and will expand them to other forms of communication, such as “Looking for Group” requests and clubs, in the future. The Verge reports that Microsoft is also looking into filters for voice communications, but those plans are much further off.
Xbox Live launched all the way back in 2002 on the original Xbox, so on some level, it’s surprising that the service has gone this long without useful filtering tools. One way to read today’s announcement is that Microsoft finally grasps that moderation alone isn’t enough to deal with an increasing amount of toxic behavior online.
Still, building automatic filters that can interpret content in the right context isn’t easy. “It’s one thing to say you’re going to go on a killing spree when you’re getting ready for a multiplayer mission in Halo, and it’s another when that’s uttered in another setting,” Dave McCarthy, Microsoft’s head of Xbox operations, told The Verge. “Finding ways for us to understand context and nuance is a never-ending battle.”