But with the power of this technology–especially at the scale of Facebook and its 2 billion users–comes responsibility, says Facebook data scientist Isabel Kloumann, and in particular the need to prevent bias from creeping into its AI systems. Of course, the need for Facebook to take more responsibility for its technology has been front and center in the wake of its many security- and privacy-related controversies in recent weeks and months.
Taking responsibility for handling AI can’t, and won’t, happen automatically. It’ll require Facebook, and the countless other companies increasingly relying on AI across many industries, to take proactive steps to, as Kloumann put it today at F8, Facebook’s developers conference, build fair and unbiased algorithms. And that begins with ensuring that the teams behind those systems are themselves as diverse as possible. “If AI only learned from a small group,” she says, “we will only see a narrow view.”
Even such efforts to increase diversity are difficult, she adds, because everyone has their own unconscious biases. No one, for example, can simply switch off the way they perceive other people–their skin color, their weight, their gender, and so on. And whatever associations we carry, they’re no doubt different from those of the people around us. “We need to understand and mitigate our biases,” Kloumann says, “so we don’t pass them on [and] so our AI can do better.”
That’s why Facebook uses an external review process to bring in a multitude of voices that help the company ensure there’s an ethical framework to its AI systems–people with expertise that ranges from technology to social science and beyond. The goal? To make sure Facebook’s AI systems have the most positive impact on people, Kloumann says.
Avatars And Jobs
As it advances its social virtual reality technology, Facebook is trying to build more realistic avatars, and it’s relying on AI to assist with that. But in order to offer the most diverse range of avatars possible, it needs to train its AI on a huge diversity of actual faces, she explains.
That means doing substantial manual labeling work–adding labels by hand for the many attributes our faces have: hair, skin color, mouth shape, and so on. Facebook needs to make sure those labels are accurate and unbiased.
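One simple way to catch the kind of label bias Kloumann describes is to audit the distribution of hand-applied labels before training. The sketch below is purely illustrative–the attribute names and data are hypothetical, not Facebook’s actual labeling pipeline–but it shows the basic check: if one label value dominates an attribute, the resulting model will learn mostly from that slice of faces.

```python
# Illustrative audit of a hand-labeled face dataset for skew.
# Attribute names and example data are hypothetical.
from collections import Counter

def label_distribution(labels):
    """Fraction of examples carrying each value of one labeled attribute."""
    counts = Counter(labels)
    n = len(labels)
    return {value: count / n for value, count in counts.items()}

# A hypothetical, heavily skewed set of skin-tone labels.
skin_tone_labels = ["light", "light", "light", "medium", "dark"]
dist = label_distribution(skin_tone_labels)
print(dist)  # {'light': 0.6, 'medium': 0.2, 'dark': 0.2}
```

A distribution this lopsided is a flag that the training set underrepresents some groups, which is exactly the situation diverse labeling is meant to avoid.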
The same is true of the way it applies AI to things like ranking news articles, and many other areas. And that’s why the company has developed what Kloumann calls bias mitigation guidelines–to figure out where bias creeps in and try to keep it from doing so. “You need to ask what has your AI learned from” all the data you feed it, she says.
One area she says is important to get right is job recommendations. The majority of job creation in America is in small businesses, and people use Facebook’s job hunting tools to find employment opportunities at many such businesses. Kloumann says the trick with AI was to make sure that the tools weren’t biased in favor of any demographic groups over others. “We want to ensure our job algorithms are providing opportunity equally,” she says, across all genders, ages, sexual orientations, and other demographics.
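The goal Kloumann describes–opportunity provided equally across demographic groups–can be quantified with a metric often called demographic parity: the gap between how often a recommendation system surfaces jobs to one group versus another. The sketch below is a minimal, hypothetical illustration of that check; the function names and data are invented for this example and are not Facebook’s actual tooling.

```python
# Hypothetical demographic-parity check for a job-recommendation model.
from collections import defaultdict

def recommendation_rates(predictions):
    """predictions: list of (group, recommended) pairs, where `recommended`
    is True if the model surfaced a job opportunity to that user."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for group, recommended in predictions:
        total[group] += 1
        if recommended:
            shown[group] += 1
    return {g: shown[g] / total[g] for g in total}

def parity_gap(predictions):
    """Largest difference in recommendation rate between any two groups.
    A gap near 0 suggests the model treats groups similarly on this metric."""
    rates = recommendation_rates(predictions)
    return max(rates.values()) - min(rates.values())

# Invented example: group "a" is recommended jobs 2/3 of the time,
# group "b" only 1/3 of the time.
preds = [("a", True), ("a", True), ("a", False),
         ("b", True), ("b", False), ("b", False)]
print(round(parity_gap(preds), 3))  # 0.333
```

A large gap on a metric like this is one concrete signal that an algorithm is favoring some demographic groups over others, though in practice fairness evaluation involves several such metrics rather than just one.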
One thing Facebook has done is build fairness into tools that are available through FBLearnerFlow, a system that all company engineers use to find libraries for their AI projects. That means any engineer can use preexisting tools to evaluate the fairness of their projects and draw on existing best practices.
But these are still early days, Kloumann says, and it will take much more work, both inside Facebook and elsewhere, to solve these problems. The most important questions that have to be answered sit at the intersection of different communities–mathematics, social science, ethics, and others. Everyone will have to work together to solve them, she says.
AI is a powerful and transformative technology, Kloumann says, and harnessing it for social good requires that everyone work together. “AI isn’t exactly our child,” she says, “but it is our responsibility, and that belongs to all of us. So let’s work together to teach it.”