
This logo is like an “organic” sticker for algorithms

The food we eat has quality certifications. Why shouldn’t the algorithms that shape our world?

[Image: Katie Falkenberg/courtesy ORCAA]

For fruits and veggies, there’s organic. For coffee and clothes, there’s fair trade. Now, algorithms have their own certification mark: a seal of approval that designates them as accurate, unbiased, and fair.


The seal is the brainchild of Cathy O’Neil, a statistician and author who has written extensively about how biased algorithms exacerbate inequality in society. Her writing–both on her blog Math Babe and in her influential 2016 book Weapons of Math Destruction–has become a touchstone in conversations about the way algorithms used in hiring, insurance, criminal justice, and credit can negatively impact people’s lives. Right before the election, O’Neil launched her own company, O’Neil Risk Consulting and Algorithmic Auditing, or ORCAA, with the aim of helping organizations that rely on algorithms, including the rental startup Rentlogic, ensure that they’re not accidentally harming people.

“People don’t really check that things are working,” she explains. “They don’t even know how to ask the question.”

[Image: Katie Falkenberg/courtesy ORCAA]
O’Neil’s new venture tackles the problem inherent in using algorithms that are optimized for certain metrics over others. For instance, Facebook optimizes for engagement to maximize profits, rather than for factuality or civil discourse. Of course, only companies that care about fairness and accuracy will submit to an audit, which means that the worst perpetrators of algorithmic injustice would likely never agree to O’Neil looking at their code. But ORCAA offers companies and organizations that do care a service designed to ensure their algorithms aren’t inaccurate, discriminatory, or unintentionally (or not) breaking civil rights laws.

Along with its audit, the company bestows a visual seal designed to act as a signal to users that a company is trustworthy–and that, on a basic level, its products use algorithms in the first place. The seal is a visual emblem that translates the audit into branding and marketing value, taking a step toward a world where companies proudly display their dedication to honest algorithms.

To test a particular algorithm, O’Neil creates what she calls an “ethical matrix”: a complete list of the company’s concerns, like profit, efficiency, and data quality, alongside the concerns of anyone the algorithm could impact, such as people of different races, genders, or abilities. Then she methodically tests the algorithm against each concern and color-codes the matrix: Green means all good, yellow means there could be a problem, and red means that harm is being done in some capacity. Once she’s had a conversation with the company’s leaders about the ethical matrix and where they stand, O’Neil works with the company’s coders and data scientists to adjust its algorithms in a way that removes the red boxes from the matrix. The length of time it takes to certify an algorithm depends on how many people it impacts; one recent client took four months. O’Neil says she charges reasonable hourly consulting rates, with a separate cost for the stamp of approval.
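To make the structure concrete, here’s a minimal sketch of an ethical matrix as a data structure: stakeholders as rows, concerns as columns, each cell graded green, yellow, or red. All of the names and ratings below are hypothetical; ORCAA’s actual audit is a consulting process, not a script.

```python
from enum import Enum

class Rating(Enum):
    GREEN = "all good"
    YELLOW = "there could be a problem"
    RED = "harm is being done"

# Rows: stakeholders the algorithm could affect. Columns: their concerns.
# These names and ratings are invented for illustration only.
stakeholders = ["company", "applicants_by_race", "applicants_by_gender"]
concerns = ["profit", "efficiency", "data_quality", "error_rate"]

# Start every cell green, then record what testing actually finds.
matrix = {(s, c): Rating.GREEN for s in stakeholders for c in concerns}
matrix[("applicants_by_gender", "error_rate")] = Rating.RED

def red_boxes(matrix):
    """Return the cells flagging harm -- the ones the company's
    data scientists would need to fix before certification."""
    return [cell for cell, rating in matrix.items() if rating is Rating.RED]

print(red_boxes(matrix))  # [('applicants_by_gender', 'error_rate')]
```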

While the crux of her company is the actual auditing process, the certification itself is just as important. O’Neil gives organizations that have passed the audit a certificate and the seal of approval to put on their website. The seal is a statement about a company’s values–and its brand.


So far, ORCAA has certified Rentlogic and one other company and has a dozen or so more that are interested. Right now, O’Neil is working with a law firm on analyzing how a recidivism algorithm was used in a parole hearing, and helping Siemens build an internal auditing system. ORCAA’s seal of approval has turned into real monetary value for Rentlogic, which received funding from an investor in part because the VC was so impressed with the startup’s initiative to get the certification, as Wired reports.

“If [the certification is] trustworthy, my seal of approval will do exactly what it’s supposed to do, which is transfer trust from people deploying algorithms to people who have a stake in the algorithm,” O’Neil says. “They’ll trust that it’s been examined, been fairly adjudicated, that their concerns have been balanced with the concerns of the people building it.”

The seal also has value because it’s a recognizable format: We’re used to seeing food items and products that have been certified for environmental friendliness and safety.

“For better or worse, it’s something people are used to, like the organic certification,” says Katie Falkenberg, the designer who created ORCAA’s logo and seal. “From a design standpoint, it takes up a lot less real estate than 17 pages of text explaining what that means.”

Falkenberg says that O’Neil requested an image of a killer whale for her company’s logo that looked “fat and fierce.” [Image: Katie Falkenberg/courtesy ORCAA]


These systems alone don’t engender trust. Falkenberg points out that “organic,” despite its ubiquity, is a loaded term, with a host of somewhat arbitrary requirements and negative side effects, like farmers going out of business. Even O’Neil points to another example of how certifications aren’t always the answer: After all, the “AAA” ratings bestowed on risky, opaque mortgage-backed securities helped lead to the 2008 financial crisis. To guard against that, she plans to make her seal of approval completely transparent and open-source, with extensive writing on her website that details exactly what it means and what the process is.


Right now, the seal is a simple ring design with ORCAA’s killer whale logo and text that reads, “Algorithm audited for accuracy, bias, and fairness,” along with the date. Falkenberg hopes to one day update it so that it’s timestamped with the date it’s uploaded to a company’s website. Because algorithms are constantly changing, Falkenberg wants the seal to let users know when an algorithm was last certified. O’Neil says algorithms should be regularly audited–perhaps once every two years or so, depending on the complexity of the code. Falkenberg also hopes to link the seal to O’Neil’s website so users can understand exactly what it means when they see it.
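Falkenberg’s timestamp idea maps onto a simple freshness check. The sketch below is purely hypothetical: it assumes a seal record that carries its certification date and flags one that has aged past the roughly two-year re-audit window O’Neil suggests.

```python
from datetime import date, timedelta

# Hypothetical seal metadata; the real seal today is a static image, not data.
seal = {
    "issuer": "ORCAA",
    "claim": "Algorithm audited for accuracy, bias, and fairness",
    "certified_on": date(2018, 5, 1),
}

# "Once every two years or so," per O'Neil's suggested cadence.
REAUDIT_WINDOW = timedelta(days=2 * 365)

def is_current(seal, today=None):
    """True if the certification still falls within the re-audit window."""
    today = today or date.today()
    return today - seal["certified_on"] <= REAUDIT_WINDOW

print(is_current(seal))  # False once the seal is more than ~2 years old
```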

[Image: Katie Falkenberg/courtesy ORCAA]
Beyond giving companies the chance to prove to their users that they take algorithmic fairness seriously, the seal also has an educational component. “It flags something for people to say, ‘Oh, I didn’t realize they were using an algorithm, I guess I should consider what other websites might be using algorithms,'” Falkenberg says. She imagines one day being able to install a browser extension that blocks all sites that haven’t had an algorithm audit.

O’Neil hopes the ethical matrix and the seal are a way to bring problems of bias and fairness to companies’ attention and start a conversation that the wider public can participate in. However, she’s clear that widespread algorithmic fairness is still a long way away. Her current clients have come to her because they want to engender trust with users–they’re already aware of how algorithms can do harm. These aren’t the companies building truly nefarious software. She doesn’t expect those organizations to allow her to audit their tech unless these kinds of audits are required by law.

“I didn’t start this business to only deal with algorithms that are pretty good,” she says. “I want to eventually nail the truly terrible, destructive algorithms to the wall and say, this is not good enough, we deserve better than this.”


About the author

Katharine Schwab is an associate editor based in New York who covers technology, design, and culture.
