How do you assess the impact of a large oil spill? Not easily, says Sabrina McCormick. If it’s like the Gulf disaster, the oil will have spread far and wide, and the effects will be lingering and insidious. It’ll take scientists years to take readings, and even then, their conclusions are likely to miss a lot.
McCormick, an associate professor at George Washington University, has studied the scientific response to the Deepwater Horizon disaster. She says most reports tend to downplay the effects on the environment and human health because of a lack of good data in many areas.
Instead of traditional data-gathering, McCormick thinks disaster responders should turn to citizen-scientists, who can provide more timely, granular information. “The data collected through crowd-sourcing is generally more time-sensitive. It’s able to fill a gap in the data-collection process that otherwise wouldn’t get filled,” she says.
“Even three months after the disaster, experts aren’t on the ground. Citizen scientists are able to collect a wide variety of data that experts can’t get to, because it takes a long time to get up a new piece of research, or plan a project, or even get to a location.”
In a paper published in the journal Ecology and Society, McCormick compares the official account of Deepwater with that provided by various local citizen-scientists. Much of the information included in 2,600 citizen reports never made it into studies by the U.S. Environmental Protection Agency and others, she says.
For example, the Louisiana Bucket Brigade (LABB), a local environmental group, collected thousands of data points for its iWitness project. The data comes from people in the region, who are equipped with air "buckets": little bags that draw in sample air using a small motor. The samples are sent to a lab, and the results are then posted on a map. Other groups took readings of Gulf sediment, looking for traces of oil and chemical dispersant (more than 2.1 million gallons of Corexit were pumped into the ocean).
Citizens “naturally recognize patterns, often in a local, immediate way that is more accurate than individuals recollecting the past, and more precise than judging impacts from a distance,” says McCormick, in the study.
McCormick is not saying traditional science is useless, just that it tends to take a long time and is inevitably impressionistic. Citizen science can improve responders' understanding of the wider problem, including in neighborhoods they may not know about.
The biggest challenge is verifying the data, McCormick says. "The oil might not be oil from the spill, and it may not be oil, or they may say they feel sick or dizzy and just have flu, or something. The question is, 'How do we make crowdsourced data more reliable?'"
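One common way to add reliability to crowdsourced reports, not described in the article but widely used in practice, is corroboration: trusting an observation more when independent reports cluster nearby in space and time. The sketch below is a hypothetical illustration of that idea; the `Report` structure, thresholds, and function names are all assumptions, not LABB's or McCormick's actual method.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Report:
    lat: float
    lon: float
    hours: float  # hours since the spill began (illustrative timestamp)

def km_between(a: Report, b: Report) -> float:
    """Great-circle (haversine) distance between two reports, in kilometres."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def corroborated(report, others, radius_km=10.0, window_hours=48.0, min_neighbors=2):
    """Flag a report as corroborated when enough independent reports fall
    within the same area and time window. Thresholds are arbitrary examples."""
    near = [o for o in others
            if o is not report
            and km_between(report, o) <= radius_km
            and abs(o.hours - report.hours) <= window_hours]
    return len(near) >= min_neighbors
```

A real pipeline would also weigh reporter history and lab confirmation, but even this simple spatial-temporal clustering separates isolated, hard-to-verify claims from patterns many people saw at once.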
McCormick says one solution is to take the human element out of the crowd. Data collection could be done remotely, via a smartphone fitted with a device that would sense, say, air quality.
“There are various sensors and new technologies that you can add to smartphones, so you can sense different things and upload to a crowd-sourcing interface. That might be more reliable than ‘this is what I smell and this is what I see.'”
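In practice, a crowdsourcing interface like the one McCormick describes would still sanity-check each automated reading before accepting it. The sketch below is a minimal, hypothetical example of that gatekeeping step: the field names, plausible-value ranges, and functions are all invented for illustration, not any real platform's API.

```python
import json

# Illustrative plausibility bounds only -- not regulatory thresholds.
VALID_RANGES = {
    "benzene_ppb": (0.0, 10000.0),
    "pm25_ugm3": (0.0, 1000.0),
}

def validate_reading(reading: dict) -> list:
    """Return a list of problems; an empty list means the reading looks plausible."""
    problems = []
    for field in ("sensor_id", "timestamp", "lat", "lon"):
        if field not in reading:
            problems.append("missing " + field)
    for key, (lo, hi) in VALID_RANGES.items():
        if key in reading and not lo <= reading[key] <= hi:
            problems.append(key + " out of range")
    return problems

def package_reading(reading: dict):
    """Serialize a reading for upload only if it passes the checks."""
    return json.dumps(reading) if not validate_reading(reading) else None
```

Automated range and completeness checks like these are one reason sensor data "might be more reliable than 'this is what I smell and this is what I see'": a miscalibrated or incomplete record can be rejected before it ever reaches the map.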