
As Black users complain of censorship, LinkedIn faces a perception problem

Some Black users say their posts are stifled by the networking site. But while LinkedIn insists issues are due to an unfortunate mix of technical and moderation errors, the platform still has to convince users that’s the case.

[Photo: Sundry Photography/iStock]

By Steven Melendez

Walter Geer describes himself as one of the few Black creative directors in the ad industry. An executive creative director for experience design at the marketing agency VMLY&R, Geer is also a frequent and popular poster on LinkedIn, where he discusses issues faced by Black people and other people of color in the industry.

“I do this because I’ve been thrust into this role not by choice—I’ve become a voice for people of color in advertising,” he tells Fast Company. “It’s important to me because progress doesn’t happen unless we have these conversations.”

But Geer says LinkedIn hasn’t always felt welcoming to these discussions. He and other Black users of the platform have repeatedly had issues with material they post not appearing on the site or being taken down. In one case, Geer says, a video he uploaded to the site seemed to simply disappear, although LinkedIn says it was never properly uploaded in the first place, the result of a technical error on the company’s end.

He and other Black creatives, particularly in the ad industry, told Fast Company they see signs the platform is biased against them. They point to posts about being Black in the workforce and other racial issues that fail to appear or get removed, content about race and diversity that draws less traffic than expected, and cryptic messages from LinkedIn customer support representatives about their experiences. They say the platform, which since last summer’s protests after the murder of George Floyd has increasingly become home to discussions of race in the workplace, seems to be censoring discussions they consider critical to improving their professional lives and those of their colleagues.

LinkedIn representatives are adamant the issues Geer and others experienced aren’t the result of any kind of systemic bias, and instead stem from a mix of content-neutral technical bugs, the popularity among other users of particular posts, and the occasional moderation error. They also point to examples of other posts by the users, including some that discuss racial issues, that didn’t have these problems. “End of the day, LinkedIn is a members-first company,” says Paul Rockwell, head of trust and safety at LinkedIn. “So making sure that we’re creating a community where members can feel supported, and they can count on a safe, constructive professional environment, is of paramount importance to us.”

But the specific issues that users experienced contribute to a larger perception problem, tied to the reality that social media sites are governed by mysterious and opaque algorithms and moderation processes developed behind closed doors. It’s not limited to LinkedIn: Facebook users also complain when their content is taken down by the network for violating its often complex rules. Black TikTok users have expressed fears that their content has been shadowbanned, a term that refers to surreptitiously limiting the reach of a post, something TikTok has generally denied. Conservatives have also expressed concerns about being shadowbanned by Twitter, although that company also says it doesn’t engage in the practice.

“What I haven’t seen platforms do on the whole is adopt systems that really meaningfully involve user communities in those processes,” says Sarah Myers West, a researcher at the AI Now Institute at New York University, who has studied content moderation systems.


ABOUT THE AUTHOR

Steven Melendez is an independent journalist living in New Orleans.
