Amazon Mechanical Turk (MTurk for short), an online marketplace for digital piece-work, is not somewhere you’re going to get rich any time soon. Most of its tasks are minuscule, and so are the wages, usually measured in cents. But one of its most intriguing possibilities is that it could provide employment for some of the world’s poorest people.
However, a study by Microsoft researchers shows that MTurk’s user interface is too complicated for workers with limited computer literacy to navigate. The researchers also propose a new, more navigable interface that could fulfill MTurk’s promise as a way to improve the lot of low-income workers worldwide.
Amazon calls these micro-jobs “HITs” (human intelligence tasks), and bills the site as offering “artificial artificial intelligence”–jobs that seem like the automated stuff a computer would normally do, but that are best done by humans. The jobs are, by many standards, fairly mind-numbing–earn $0.02 for copying text from a business card, or earn a penny by taking a brief poll.
But by executing the same task many times, you can rack up a few bucks in a couple of hours. That’s hardly chump change if you’re living in poverty in an Indian slum. The microtasks posted on MTurk dole out about $2,000 per day, and the idea has been successful enough that at least 50 other companies are said to be developing similar online task marketplaces.
But what the Microsoft researchers found (PDF) is that MTurk might not be helping the workers who could most use it. The researchers first observed seven low-income workers attempting to use MTurk. These workers were not uneducated: they averaged 11 years of schooling, had some knowledge of English (though most had been educated in an Indian language), and possessed basic IT skills.
But none of them could quite figure out MTurk. The researchers had them try image labeling tasks (putting a bounding box around a human, for example), verifying addresses, and decoding CAPTCHAs (those anti-spammer boxes featuring distorted letters). They struggled with the first, failed at the second, and didn’t do so well at the third.
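To make concrete what one of these microtasks involves, here is a minimal sketch, in Python, of what an image-labeling HIT and a worker’s answer might look like as plain data. The field names and validation rule are hypothetical illustrations, not Amazon’s actual task format or API.

```python
# Hypothetical sketch of an image-labeling HIT ("draw a box around the
# human"), with a simple sanity check on the worker's submitted box.
# None of these names come from Amazon's real API.

def box_is_valid(box, image_width, image_height):
    """Return True if the bounding box has positive size and lies inside the image."""
    x, y, w, h = box["x"], box["y"], box["width"], box["height"]
    return (
        w > 0 and h > 0
        and 0 <= x and x + w <= image_width
        and 0 <= y and y + h <= image_height
    )

# The task, as a requester might describe it.
hit = {
    "instructions": "Draw a bounding box around the human in the image.",
    "image": {"width": 640, "height": 480},
    "reward_usd": 0.02,
}

# A worker's answer, in pixel coordinates.
answer = {"x": 120, "y": 60, "width": 180, "height": 360}

print(box_is_valid(answer, hit["image"]["width"], hit["image"]["height"]))
```

Even a task this simple requires the worker to understand the instructions, manipulate an on-screen drawing tool, and submit in the expected format, which is where the interface problems the researchers observed come in.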
The researchers observed the workers as they plodded along, and decided to design a solution.
The Microsoft team’s interface was much simpler. It gave each task clear, illustrated instructions, divided into numbered steps. It cleared out much of the visual junk that had distracted the workers, eliminating a complex banner of unrelated links. And it fixed a glitch in which hitting the backspace key outside of a text box reset the entire task.
Finally, the Microsoft team tried out the new design on 49 new participants in two locations in Bangalore. They assigned an image annotation task, a type of which hundreds are posted on MTurk at any moment. None of the workers were able to execute a single task correctly with the Amazon interface, but they got it right 66% of the time, on average, with the simplified interface.
The study concludes: “There exist tasks on MTurk for which the primary barrier to low-income workers is not the cognitive load of the work itself; rather, workers are unable to understand and navigate the tasks due to shortcomings in the user interface.”
The paper is at once a criticism and an endorsement of MTurk. Given that its interface is so problematic, MTurk’s success to date is all the more remarkable. And if MTurk and competing sites take the Microsoft team’s lessons to heart and make their interfaces more intuitive, microtasks could begin to reach a much broader base of low-income workers, improving conditions abroad and serving more companies in need of “artificial artificial intelligence.”
Here’s hoping this doesn’t just turn MTurk into a more productive spam engine. But that’s another story.
[Image via Wikipedia]