
Here’s what Facebook is doing to ramp up groups (and fight their misuse)

At their best, groups are one of Facebook’s biggest assets. At their worst, they’re a problem for society.


[Photos: Facebook; Greg Rosenke/Unsplash]

BY Harry McCracken | 7 minute read

I’m not sure when it happened. But at some point in the last couple of years, my use of Facebook in its most familiar form—posting on my wall and those of other members—has dwindled. Instead, I spend most of my time in Facebook groups devoted to a variety of my interests, from old cartoons to e-bikes.

There’s no question which Facebook groups are the best. They’re the ones that are managed by administrators and moderators who care enough to have a strong point of view that manifests itself in how they cultivate conversation. That includes how they deal with trouble, from minor tiffs between well-intentioned members all the way up to full-blown troll attacks.

“Related Discussions” will raise the visibility of public groups. [Photo: Facebook]
And Facebook takes such folks and their needs seriously. Last week, in conjunction with its annual Communities Summit—normally an in-person gathering and this year a virtual event—it announced a bunch of features for groups. Some aim to make it easier for admins and moderators to do their work; others are about breaking down barriers that prevent users from finding and participating in groups they might like.

On the first front, a new set of tools called Admin Assist will tend to some of the heavy lifting of moderation. For example, a group’s creator will be able to automatically reject posts that use specific keywords or come from members who have recently joined or been troublemakers in the past. Also new are various features for real-time chat, ask-me-anything-style Q&As, and conversations driven by shared photos.
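To make the idea concrete, here is a minimal, hypothetical sketch of how keyword and member-tenure rules like these might be expressed in code. The rule names, thresholds, and data structures below are illustrative assumptions for explanation only, not Facebook's actual Admin Assist implementation.

```python
# Illustrative sketch only: rule names, thresholds, and data structures are
# assumptions made for explanation, not Facebook's actual Admin Assist code.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PendingPost:
    author: str
    text: str
    author_joined: datetime          # when the author joined the group
    author_prior_violations: int     # past rule-breaking flagged by moderators

BLOCKED_KEYWORDS = {"spamlink", "crypto giveaway"}   # admin-chosen terms
MIN_MEMBERSHIP = timedelta(days=7)                   # admin-chosen tenure rule
MAX_PRIOR_VIOLATIONS = 0                             # reject repeat offenders

def should_auto_reject(post: PendingPost, now: datetime) -> bool:
    """Return True if any configured rule says to reject the post automatically."""
    text = post.text.lower()
    if any(keyword in text for keyword in BLOCKED_KEYWORDS):
        return True
    if now - post.author_joined < MIN_MEMBERSHIP:
        return True
    if post.author_prior_violations > MAX_PRIOR_VIOLATIONS:
        return True
    return False

# Example: a post from a member who joined yesterday gets held back.
post = PendingPost("new_member", "Check out this crypto giveaway!",
                   datetime(2020, 10, 1), 0)
print(should_auto_reject(post, datetime(2020, 10, 2)))   # True
```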

Facebook is also introducing a way for admins to make money from their groups—not through advertising, but by giving them access to an existing service called Brand Collabs Manager. In the past, influential individual members have been able to use this service to strike promotional deals with brands that want to reach specific audiences; now, groups will be able to do so as well.

The Brand Collabs Manager will help group admins strike marketing deals that leverage their audiences. [Photo: Facebook]
Then there are the tweaks designed to get more users into more groups. “Related Discussions” will push material from groups into users’ news feeds, exposing them to new groups and conversations. And public groups will now let new members join without being approved by an administrator or moderator. (People who run groups can still use techniques such as asking screening questions to vet newcomers before allowing them to post.)

What all of these diverse changes have in common, VP of the Facebook app Fidji Simo told me, is that they reflect the evolution of Facebook Groups since the feature’s introduction a decade ago.

“When you go back to when we created Groups, it was really meant to be a space for you to connect to groups of people already in your life—your family, your soccer team, your book club,” she says. “But since then, Groups has evolved massively to be a place that is not just about connecting with people you already know, but with people all over the world on any topic that is important to you.” Today, 1.8 billion people use the feature each month, and half of all members belong to at least five active groups.

Admin Assist will help groups automate some of the basics of running a group, such as dealing with spammy posts. [Photo: Facebook]

The dark side of community

Now, it must be noted that there’s nothing inherently ennobling about groups on Facebook. Indeed, they have proven an efficient way to spread hate and misinformation. The most appalling recent example: Facebook’s failure to swiftly act against a group that coordinated plans to shoot protesters in Kenosha, Wisconsin, as reported by BuzzFeed’s Ryan Mac and Craig Silverman. More recently, The New York Times’s Ben Decker wrote about an 1,800% increase in membership for anti-mask groups on the service. And Mother Jones’s Kiera Butler has explained how QAnon hoaxes seep into Facebook groups about parenting.

When I spoke to Simo, she devoted a sizable chunk of her time to Facebook’s measures to fight hateful and otherwise dangerous use of groups. “I want to be clear that none of what we are doing matters until we keep people really safe in groups,” she told me.

She rattles off stats relating to the company’s efforts on that front: “We’ve actually removed about 1.5 million pieces of content for violating our policies on organized hate, 91% of which we actually found proactively before people reported that to us. We also removed about 12 million pieces of content in groups for violation of policies on hate speech, 87% of which we found proactively. And when it comes to groups themselves, we will take down an entire group if it repeatedly breaks our rules, or if it was set up with the intent to violate our standards.”


Facebook is also tightening the screws on administrators who have violated its community standards in the past. For instance, a new policy prevents such people from creating any new groups for 30 days—which, though it certainly falls short of zero tolerance, might discourage them from further misbehavior.

In addition, the company is removing health-related groups from its recommendations—not because there aren’t valuable health groups on Facebook, Simo says, but because “we want to make sure that people get their health information from authoritative sources.” Presumably, the bottom line is that the company isn’t trying to police the quality of health groups on the service, but at least doesn’t want to actively steer people to sources of dubious advice.

As with everything else on Facebook, the scale of Facebook Groups is such that the company can make lots of progress and still have major problems on its hands. For example, while Facebook announced in August that it’s banned or restricted more than 2,800 QAnon-related groups, multiple reports have said it was slow to take action and that such groups have proven adept at evading the crackdown. Moreover, there is no blanket ban on QAnon groups, just on ones that run afoul of policies such as bans on “coordinated inauthentic activity” and calls for violence. (You don’t have to dig deep to find groups that self-identify as being dedicated to QAnon advocacy; actually, all you have to do is search for “QAnon.”)

And then there’s the positive side

For all the alarming stories relating to Facebook Groups, there are also inspiring ones. In 2015, Latasha Morrison wanted to spark a healthy dialogue about racial disparities and injustices in the U.S. “I wasn’t an organization,” she says. “I was just a person who saw the brokenness in the world and wanted to create some conversations and some solutions around the racial problem in America. I didn’t have a website, and I wanted to gather people where they could continue to learn. And Groups was the best option for that.”


Almost six years after its inception, Morrison’s group, Be the Bridge, has 72,000 members, having undergone particularly rapid growth in recent months as the issues it covers have been among 2020’s biggest news stories. It’s spun off smaller, more focused groups, as well as country-specific ones around the world. Morrison has also turned Be the Bridge into a full-fledged nonprofit organization and a New York Times best-selling book.

Along with being an example of a group that’s a thriving force for good, Be the Bridge shows that creators have quite a bit of ability to mold their groups to fit unique needs and desires. It provides new members with a reading list; requires them to lurk, learn, and listen for their first three months before posting; and observes Sunday as a day of rest with a no-posting policy (except for one thread that members can comment on). Morrison has also participated in a residency program for Facebook community leaders, which gave her input on feature development.

Early on, Morrison says, she spent 20 or more hours a week personally running her Facebook group. Now, around 23 people are involved in managing it, mostly on a volunteer basis, allowing her to turn most of her attention to other Be the Bridge activities. Between the 2020 election and cases such as the deaths of Breonna Taylor and Ahmaud Arbery, she says, ensuring that the group stays on its positive track has “been quite challenging, and it’s exhausting.”

Even the way Be the Bridge deals with problematic members—some of whom simply take a while to figure out how to be constructive participants—is a reminder that the best groups are expressions of a particular vision. One of Morrison’s favorite administrative tools lets group managers mute a member for 30 days, preventing that person from posting or commenting. “We don’t just want to throw people out,” she says. “Because we want to approach this conversation with compassion and grace and empathy—just like we’re asking people to come into the conversation with.”

Not all of Facebook Groups’ new features will have any impact on Be the Bridge—for one thing, as a private group, it won’t be surfaced into users’ feeds like public groups will be. But could the new Admin Assist tool set help reduce the workload for Morrison’s team? Faitth Brooks, director of programming for Be the Bridge, says it’s worth considering, but remains cautious: “We would try it out in a smaller group just to see. Automation can’t always pick up and understand the nuances of some of the things that we have to deal with as a group focused on racial justice.” For both better and worse, it’s human beings, in their infinite complexity, that make Facebook’s communities what they are.



ABOUT THE AUTHOR

Harry McCracken is the global technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.

