
The hunt for red tide relies upon AI and retirees

Microscope-equipped iPods and a NASA-funded app allow volunteers to assist in warning beachgoers, tourists, and researchers of harmful algae blooms.

[Photo: NASA]

Red tide has turned long stretches of Florida’s coastline into a rotting fish carcass dump, and worse. For the past 10 months, the algae bloom has killed off wildlife, kept tourists away, and caused health problems in locals. The damage grew so widespread this summer that Florida’s governor declared a state of emergency last month. A computer vision algorithm could help limit some of the effects by determining red tide hot spots faster than current methods.


The culprit behind this particular algae bloom is a single-celled dinoflagellate called Karenia brevis, which releases neurotoxins after dying or being broken down by ocean movements. Those neurotoxins can kill animals (the bloom is deadly enough that it’s being blamed for the death of a whale shark) and cause health issues in people. The toxins can also go airborne, causing respiratory problems in beachgoers.

K. brevis cell abundance shown on an ocean color satellite image from the IRIS system. Warmer colors indicate higher levels of chlorophyll, an indicator of algae. Cloudy areas are gray. Circles indicate locations where officials tested water samples on the ground. [Image: FFW/USF/IRIS]
Fortunately, K. brevis has a very distinct swimming pattern, darting quickly in corkscrew-shaped routes. With funding from NASA, researchers with the National Oceanic and Atmospheric Administration (NOAA), Mote Marine Laboratory, and the Gulf of Mexico Coastal Ocean Observing System (GCOOS) built a tool called HABscope that can spot the cells.

After obtaining a water sample and putting it on a slide, volunteers, all of them retirees trained by Mote Marine Laboratory, take a close-up video using a $500 setup made up of an iPod, a microscope, and a specially designed 3D-printed adapter that connects the two. The footage is uploaded to the cloud, where a computer vision model detects whether K. brevis cells are present in the clip. If they are, HABscope counts them to estimate the concentration of red tide at that location.

That information, along with other data points like wind conditions and water currents, goes into another model created by NOAA that predicts breathing conditions for the next few days at a particular beach. The sped-up data collection and analysis will allow NOAA to issue daily, rather than weekly, alerts, which it hopes to roll out later this year. While the warnings won’t do much for sea life, tourism is a huge part of Florida’s economy, and improved forecasts can help people at risk of health problems plan their beach trips.

Chris Holland of NOAA examines a water sample using HABscope, an iPod with an attached microscope. [Photo: courtesy of Bob Currier]
“The idea isn’t to keep people off our beaches,” says Barb Kirkpatrick, HABscope’s co-principal investigator and GCOOS’s executive director. It’s to “find what beach they could go to and keep their family healthy.”

The biggest challenge was gathering data. There was no data set for K. brevis, so HABscope engineer Bob Currier had to collect 50,000 images to feed into a model built using the open-source TensorFlow framework.


“It’s not like training an algorithm on cat photos,” Currier says. “There were zero images out there.”
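To give a rough sense of what that entails, here is a minimal sketch, not HABscope’s actual code, of training a binary image classifier with TensorFlow’s Keras API. The directory layout, image size, and network architecture are all assumptions for illustration.

# Minimal sketch (assumptions, not HABscope's code): train a binary
# classifier on labeled cell crops using TensorFlow/Keras.
import tensorflow as tf

IMG_SIZE = (64, 64)  # assumed size of the cropped cell images

# Hypothetical folder with two subdirectories, e.g. "kbrevis/" and "other/"
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cell_crops/",
    labels="inferred",
    label_mode="binary",
    image_size=IMG_SIZE,
    batch_size=32,
)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # K. brevis vs. not
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)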

HABscope uses Google’s TensorFlow deep learning framework, implemented with Keras. HABscope cuts each region of interest from a frame and runs a binary classification test (K. brevis or not), using the output to determine the color of the target marker: green for K. brevis, red for not. [Animation: Robert Currier]
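As a rough sketch of that per-frame step (again an assumption-laden illustration, not the project’s code), the cropped regions of interest could be batched through a trained classifier and tallied like this:

# Minimal sketch: classify each region of interest cut from a video frame
# and tally the cells predicted to be K. brevis.
import numpy as np
import tensorflow as tf

# Hypothetical saved model from the training sketch above.
model = tf.keras.models.load_model("kbrevis_classifier.keras")

def count_kbrevis(frame_rois, threshold=0.5):
    """frame_rois: list of 64x64x3 image crops (regions of interest)."""
    if not frame_rois:
        return 0, []
    batch = np.stack(frame_rois).astype("float32")
    probs = model.predict(batch, verbose=0).ravel()
    # Green marker for predicted K. brevis, red for everything else.
    colors = ["green" if p >= threshold else "red" for p in probs]
    return int((probs >= threshold).sum()), colors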
HABscope could end up doing more than just improving respiratory forecasts. Kirkpatrick says additional data suggests that red tide is “patchy,” varying from one beach to the next. It’s something scientists have suspected for years and could now prove definitively with regular HABscope sampling.

HABscope could also be trained to spot other bloom-causing algae cells, which have different shapes, vary in size, and have unique attributes that computer vision can differentiate. Other affected communities could then gather their own data sets and put the technology to work.

“Moving science forward has to be collaborative efforts,” Kirkpatrick says.


About the author

Jackie Snow is a multimedia journalist whose work has appeared in National Geographic, Vanity Fair, The Atlantic, Quartz, the New York Observer, and more.
