“On the Internet, nobody knows you’re a dog.” That’s the caption from a 1993 New Yorker cartoon by Peter Steiner, spoken by one dog seated at a computer to another dog. The cartoon spoke volumes about the perceived anonymity that had come to define large swaths of the early internet. Perhaps nowhere was the ability to interact and transact anonymously more transformative than in the first generation of e-commerce platforms. Hypothetically speaking, an espresso drinker from Cambridge, Massachusetts, could go online and buy a used Jura Impressa J9 espresso machine, sight unseen, from a stranger living in Berkeley, California. (If you’ve spent time in both towns, you would likely be unsurprised if these two cities were involved in such a transaction.)
In the internet age, life insurance got cheaper. Travel became easier to plan. A wider variety of books was sold. In each of these examples, the internet allowed markets to become more efficient and left consumers better off. Things were looking good in the early days of online shopping.
Beyond these efficiency gains, there was another, subtler implication of the rise of e-commerce.
By facilitating arm’s-length transactions that obfuscated and deemphasized markers of race and gender from the buying process, the internet created the potential to reduce the discrimination that had long plagued offline markets. Consider, for example, the case of car purchases, where discrimination has been documented for decades. Economists Fiona Scott Morton, Florian Zettelmeyer, and Jorge Silva-Risso found that car sales initiated online exhibited less of the racial price discrimination that persisted in offline, in-person transactions. Markets were not only becoming more efficient—they were becoming fairer as well.
But this techno-utopian vision was not to last.
The New Yorker updates its cartoon
Fast forward to the early 2010s. eBay and Amazon had been around for 15 years. A new crop of platforms was emerging—such as Airbnb, Uber, and Upwork—and making important design choices that would ultimately shape the efficiency and equity of the markets they were building.
But there was something different about Airbnb and some of the other second-generation platforms that were beginning to emerge. In contrast to the anonymity that marked earlier e-commerce platforms, personal profiles were central to many of the newer ones. Take Airbnb, for example, where names and photos were front and center on the platform. In addition, hosts at that time were allowed to reject guests whenever they wanted to, without having to explain why. For a while, Airbnb would penalize hosts for rejecting guests—for example, by placing their properties lower in search results. But after one host’s property was damaged in a high-profile incident, Airbnb removed the penalty and encouraged hosts to reject guests when they felt uncomfortable with them, even though hosts had little information about guests to go on. Over time, versions of the earlier penalty for rejecting guests have been reinstated.
Compare this situation to the travel site Expedia, where property managers (primarily of hotels) simply list room availability, and virtually anyone can book with a credit card. Clearly, Airbnb was bringing dramatic changes to the market. Rather than being a widespread feature of the internet (as suggested by the 1993 New Yorker cartoon), anonymity was becoming a design choice that platforms could make or not make.
With his colleague Ben Edelman, one of us (Mike) wrote a case study about Airbnb with the goal of understanding the platform’s approach to building trust and getting people to feel comfortable allowing strangers into the intimacy of their homes. Initially, we were thinking about the general problem of building trust (this was before we had begun empirical research on discrimination). Through this process, we noticed how prominent personal profiles were on the platform, coupled with the flexibility the site gave hosts to reject users—and we wondered whether this would open the door to discrimination that would be difficult to practice in other marketplaces. Would hosts be unwilling to rent to people of other races and ethnicities? Starting with the landmark Fair Housing Act of 1968, the U.S. government had spent half a century battling discrimination in offline rental markets. Through regulation and enforcement, these efforts had succeeded in reducing rates of discrimination, both in hotels and in long-term rentals. Airbnb now raised the prospect of erasing some of these hard-won gains.
The idea of anonymous arm’s-length transactions continued to fade from new e-commerce platforms, and with it, the promise of a more equitable internet age. In 2015, the New Yorker published an updated cartoon, this one by Kamraan Hafeez, featuring the anonymous dogs from the 1993 cartoon. The caption? “Remember when, on the Internet, nobody knew who you were?”
Airbnb’s moral wiggle room
In 2014, Reed Kennedy, a successful tech entrepreneur and investor, found himself being repeatedly rejected on Airbnb. Reed is black, and he suspected that discrimination played a role. His profile included a photo of himself, so his race was likely the first thing hosts saw when deciding whether or not he was worthy of staying in their rental properties. Reed reached out to the company with his suspicions about the rejections, thinking they would want to know.
He received an email from a company representative in reply: “I can assure you that the incidents where you’ve been declined by hosts has absolutely nothing to do with your race or ethnicity.” The message continued, “You’ve been doing a great job reaching out to multiple hosts to find the best place suitable for you. I appreciate your concern and I want you to know that Airbnb takes discrimination seriously. If there was any cause for concern, we would have reached out to the hosts immediately. However, hosts have the freedom to decline requests for any reason.”
The rep then suggested Reed look for a place that would allow him to book automatically. “I would suggest using our Instant Book feature,” she wrote. “If a listing has Instant Book turned on, you can book it without having to wait for the host’s confirmation.”
She also encouraged him to “reach back out to the hosts once you’ve had a few references created on your behalf.” In the end, the representative offered Reed a $100 voucher for his troubles. But the real coup de grâce came in the closing of the email: “I wish you lots of success with booking reservations through Airbnb, Reed. I can tell by your picture that you’re a nice guy.”
By this time, there was plenty of data that was suggestive—but not definitive—of discrimination against guests by hosts. For instance, it was known that African American hosts were earning less money per night on the platform than white hosts with similar listings. It seemed not much of a leap to expect that there might also be discrimination against African American guests by hosts.
Working with Ben Edelman and Dan Svirsky, one of us (Mike) set out to run an experiment to answer the question at hand: Was discrimination against guests a widespread problem on Airbnb?
The experiment sent rental inquiries to some 6,400 Airbnb hosts in the United States. All the inquiries were identical except for one trait: half came from (fictitious) guests with names that birth records show are more common among white Americans (such as Brett and Todd), while the rest came from guests with names that are statistically more common among African Americans (such as Darnell and Jamal). This was similar to the approach that economists Marianne Bertrand and Sendhil Mullainathan had used to understand discrimination in labor markets in 2001 and 2002, and evocative of audit studies that governments had conducted in offline housing markets, dating back to at least the 1970s.
The results were striking. Inquiries from guests with distinctively African American–sounding names were 16% less likely to get a yes from hosts than otherwise identical inquiries from guests with white-sounding names. We found discrimination across a range of neighborhoods and listing types, from inexpensive to costly, from separate apartments to guest rooms, and from small-time landlords to larger ones who may have been breaking the law by violating the Fair Housing Act.
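To make the audit-study logic concrete, here is a minimal sketch of how such a comparison of acceptance rates can be estimated and tested. The counts below are hypothetical round numbers chosen to illustrate a 16% relative gap, not the study’s actual data:

```python
import math

def acceptance_gap(accepts_a, total_a, accepts_b, total_b):
    """Compare acceptance rates for two groups of otherwise identical
    inquiries, using a simple two-proportion z-test."""
    p_a = accepts_a / total_a
    p_b = accepts_b / total_b
    # Pooled acceptance rate under the null hypothesis of no difference.
    p = (accepts_a + accepts_b) / (total_a + total_b)
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical counts: 3,200 inquiries per group.
p_white, p_black, z = acceptance_gap(1600, 3200, 1344, 3200)
print(f"white-name acceptance: {p_white:.1%}")              # 50.0%
print(f"black-name acceptance: {p_black:.1%}")              # 42.0%
print(f"relative gap: {(p_white - p_black) / p_white:.0%}")  # 16%
print(f"z-statistic: {z:.1f}")
```

With samples of this size, a gap of eight percentage points yields a z-statistic above 6—far past conventional significance thresholds—which is why an audit design with thousands of inquiries can detect discrimination so decisively.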
The results of our study became public in December 2015. Government lawyers and policymakers began weighing Airbnb’s culpability in the matter. Some users began to wonder whether they really wanted to support a company like Airbnb. The NPR show Hidden Brain aired an episode on the topic, then hosted a Twitter chat in which people shared their stories of discrimination on Airbnb using the hashtag #AirbnbWhileBlack. The Congressional Black Caucus wrote to the CEO of Airbnb, urging the company to take action.
Airbnb hired a task force made up of high-profile civil rights activists. The task force included former attorney general Eric Holder; Laura Murphy, the director of the ACLU’s legislative office; and a number of academics. From a business perspective, it was clear that Airbnb’s design choices had facilitated discrimination and that the company was well positioned to make changes. Now the company needed to decide how to proceed.
Airbnb makes design changes
Airbnb’s task force could have chosen to advocate for one of three broad directions.
At one extreme, Airbnb could continue with the status quo and do nothing to reduce discrimination. At the other extreme, the company could completely eliminate names, pictures, and all other identifying information from user profiles. This option would be sure to purge the platform of most of this discrimination, but it also carried risks. Personal profiles offered an easy way to build trust between Airbnb users. Taking them away could sacrifice some of that trust.
The third option would be to keep users’ names and pictures in their profiles, but to implement other changes in the hopes of reducing discrimination. These changes could include making pictures less salient (for example, by making them smaller or putting them in a less prominent location), getting more hosts to adopt Instant Book (which allowed guests to sign up for available dates on the spot), or updating the terms and conditions to more explicitly prohibit discrimination.
The company decided to take the middle road. Hosts could continue to view a guest’s name and picture, and then decide whether to reject the guest. But the company also committed to nibbling around the edges to try to reduce discrimination. The proposals ranged from optional bias training for hosts (though the company doesn’t report how many have taken it) to helping guests who were denied a booking on the basis of race find alternative accommodations. The most promising proposal was a commitment to increase the number of hosts who would accept qualified guests without looking at their profile information beforehand. In its report, the company committed to “making one million listings bookable via Instant Book by January 2017.”
Airbnb also thought about discrimination against hosts. To this end, the company ultimately removed the pictures of hosts from the main search results page, which meant that guests needed to click one page down to see a host’s picture. At the time, guest pictures remained very salient on the platform.
Evaluating the changes
Following the experiment, Airbnb also created a data science team to study the issue and to explore potential solutions. For example, they committed to “experiment with reducing the prominence of guest photos in the booking process.” In short, Airbnb would run experiments to test the effects of possible changes designed to reduce discrimination.
Regulators have also continued to take notice. In the spring of 2017, Airbnb and the state of California reached a deal that will allow the state to conduct tests for discrimination. This will allow the state to identify discrimination on an ongoing basis and step up enforcement where necessary.
While the changes Airbnb made go in the right direction, new research has found evidence of continued discrimination on the platform. Experiments have now documented discrimination on Airbnb against same-sex couples, against guests with Arabic-sounding names, and against guests with disabilities. The company’s design has also continued to evolve, and in 2018, Airbnb announced that hosts would not be able to see the pictures of guests until after the host decided whether or not to accept the guest. From our discussions with employees at the company, it’s clear that there are people working there who are deeply dedicated to reducing bias. Other tech companies have also begun to set up working groups around discrimination. It’s also clear that a lot needs to be done to create a more equitable tech sector.
The evolution of discrimination in online marketplaces also highlights important lessons for companies thinking about how to use experiments to inform managerial decisions.
Airbnb had been running thousands of experiments per year, but it hadn’t been taking the potential for discrimination into account in those experiments. As companies use data to inform decisions, experiments run the risk of optimizing for goals that are too narrow, emphasizing growth and short-run profit above all else. But by taking a broader view of metrics (in this case, considering the potential for discrimination), experiments also provide a new opportunity to have richer discussions about the many goals companies have, and how to evaluate success on each.
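One concrete way to take that broader view is to report group-level outcome gaps alongside the headline metric in every experiment readout. The sketch below illustrates the idea with entirely hypothetical data and field names: a change that lifts overall bookings while widening the acceptance-rate gap between demographic groups becomes visible rather than hidden.

```python
from collections import defaultdict

def experiment_report(records):
    """Summarize an A/B test on overall acceptance rate AND on the
    acceptance-rate gap across demographic groups within each arm."""
    stats = defaultdict(lambda: [0, 0])  # (arm, group) -> [accepted, total]
    for arm, group, accepted in records:
        cell = stats[(arm, group)]
        cell[0] += accepted
        cell[1] += 1
    report = {}
    for arm in {arm for arm, _, _ in records}:
        rates = {g: a / n for (a2, g), (a, n) in stats.items() if a2 == arm}
        report[arm] = {
            # Headline growth metric: overall acceptance rate in this arm.
            "overall": sum(stats[(arm, g)][0] for g in rates)
                       / sum(stats[(arm, g)][1] for g in rates),
            # Equity metric: largest gap in acceptance rate between groups.
            "gap": max(rates.values()) - min(rates.values()),
        }
    return report

# Hypothetical records: (experiment arm, guest group, host accepted?).
records = (
    [("control", "A", 1)] * 50 + [("control", "A", 0)] * 50
    + [("control", "B", 1)] * 42 + [("control", "B", 0)] * 58
    + [("treatment", "A", 1)] * 55 + [("treatment", "A", 0)] * 45
    + [("treatment", "B", 1)] * 53 + [("treatment", "B", 0)] * 47
)
report = experiment_report(records)
```

In this made-up example, the treatment both raises the overall acceptance rate (0.54 vs. 0.46) and shrinks the between-group gap (0.02 vs. 0.08)—exactly the kind of two-metric comparison a dashboard tracking only growth would miss.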
Adapted from The Power of Experiments: Decision Making in a Data-Driven World by Michael Luca and Max H. Bazerman. Reprinted with permission from the MIT Press. Copyright 2020.