Just when you thought biases were a completely human construct, more evidence suggests that both algorithms and interfaces could be biased, too.
The latest example comes from a study by researchers at the University of Washington and the University of Maryland, which reveals how gender bias works its way into web searches when people look for images to represent careers and jobs.
First, they ran a comparative analysis to see whether the prevalence of men and women in image search results for professions corresponds to their actual representation in those professions. The researchers compared the number of women who appeared in the top 100 Google image search results in July 2013 for 45 occupations, ranging from bartender to chemist to welder, with 2012 U.S. Bureau of Labor Statistics figures on how many women actually worked in those fields. Then they ran a qualitative analysis of how men and women are portrayed in the image results.
The goal was to answer some compelling questions:
- Are there systemic over- or under-representations of women in preferred results?
- Do biased image search results lead people to perpetuate that bias when they choose images to represent a profession (i.e., through stereotype exaggeration)?
- Do differences in representation in image search results affect viewers’ perceptions of the prevalence of men and women in that occupation?
- Can we shift those opinions by manipulating results?
The answers were equally compelling. For instance, according to their study, more than half of U.S. authors are women (56%), yet the image search shows only about 25% women authors.
On the flip side is telemarketing, an industry where men and women are equally represented, but the Google image results would have you believe that 64% of telemarketers are female.
Not all the results were so skewed. The research uncovered that, in nearly half of the professions, the actual gender representation and the image search numbers were within 5 percentage points of each other.
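The comparative step above amounts to a simple gap calculation per occupation. As a minimal sketch, here it is using only the two figures the article cites (author and telemarketer); the study's full dataset of 45 occupations would be handled the same way, and the 5-point threshold mirrors the finding that nearly half the professions fell within 5 percentage points:

```python
# Illustrative sketch of the study's comparative analysis: for each
# occupation, compare the share of women in top image-search results
# with the share of women actually employed in it (per BLS data).
# Only the two figures cited in the article are included here.

occupations = {
    # occupation: (% women in search results, % women per 2012 BLS data)
    "author": (25.0, 56.0),        # figures cited in the article
    "telemarketer": (64.0, 50.0),  # figures cited in the article
}

def representation_gap(search_pct, actual_pct):
    """Positive = women over-represented in results; negative = under-represented."""
    return search_pct - actual_pct

for job, (search_pct, actual_pct) in occupations.items():
    gap = representation_gap(search_pct, actual_pct)
    within_5 = abs(gap) <= 5  # nearly half the 45 occupations met this
    print(f"{job}: gap {gap:+.1f} points; within 5 points: {within_5}")
```

Authors show a 31-point under-representation of women, telemarketers a 14-point over-representation, matching the skews described above.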
How men and women looked in those images was another story. When the researchers asked participants to rate professionalism, images showing a person who matched the majority gender for the job were viewed as more competent, professional, and trustworthy. Those that didn't match were rated provocative or inappropriate.
“A number of the top hits depicting women as construction workers are models in skimpy little costumes with a hard hat posing suggestively on a jackhammer. You get things that nobody would take as professional,” says Cynthia Matuszek, a co-author of the study.
None of this would matter if people weren't then nudged into making assumptions about men and women in particular roles in the real world. When the researchers manipulated the search results, however, participants' opinions, not surprisingly, shifted to conform with the stereotypes shown. Though they stressed that this was a short-term observation, other research bears out that incremental exposure to such images over time contributes to unconscious bias.
It has also already been revealed that Wikipedia's entries, a supposed bastion of diversity and editorial neutrality, skew heavily toward men both in the articles themselves and in the links between them: articles about women tend to link to those about men.
Part of this is due to Wikipedia's community, most of whom are educated, English-speaking men from predominantly Christian countries.
In addition to image searches being gender-biased in some cases, Google has also been taken to task for a lack of diversity within its own ranks and for disproportionately featuring white men in its doodles.
While Google may not be aware of the results of this latest study and the researchers' recommendations, the search giant has recognized that it's tough for anyone, even its own cadre of emotionally intelligent staff, to process the 11 million bits of information we are bombarded with at any given moment, let alone identify the biases that might spring from them.
The researchers hope the findings will influence the designers of search engines to create algorithms that more accurately represent reality. Sean Munson, UW assistant professor of human-centered design and engineering and a co-author of the study, says: "[Search engine designers] may come to a range of conclusions, but I would feel better if people are at least aware of the consequences and are making conscious choices around them."