Climate change has a NIMBY problem. That’s short for “not in my backyard,” and while the threat of a warming world may finally be getting more social and political traction than ever, for most people it’s still something that happens far away, whether it’s at polar ice caps or on distant islands.
That’s why some AI researchers from Montreal decided to create a system for generating hyper-personalized visuals about the impact climate change is likely to have in 50 years. How personalized? Try the most literally “IMBY” visual you can imagine: a picture of your own house, flooded out by rising sea levels.
“It is difficult for people to mentally simulate the complex and probabilistic effects of climate change,” the researchers write. To drive the reality home, they imagined an interactive tool that could automagically combine imagery from Google Street View with a photorealistic rendering of those “complex and probabilistic effects,” based on the best and most updated climate models. It would be like a climate-change version of NUKEMAP, the online tool that uses Google Maps to let you see what various kinds of nuclear weapons would do to your hometown. But by using actual photographs from Google Street View, the researchers hope that the effect will create “a more visceral understanding of the effects of climate change, while maintaining scientific credibility.”
Interactively modifying photos in real time is a tall order, which is why the researchers turned to machine learning. They built a proof-of-concept system that uses a pair of trained neural networks to intelligently replace the front yard of a house with an expanse of water, in a way that accurately reflects flooding forecasts for the year 2050 based on a global temperature rise of 4.5°C. (That’s three degrees past the threshold scientists hope will keep catastrophic climate change at bay.) The result is a prototype for generating before-and-after visualizations of any address on Google Street View.
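The article doesn’t detail how the networks’ output gets composited back onto the photo, but the final rendering step can be sketched as a simple blend: wherever the system decides a pixel is underwater, mix in a water color. In this illustrative sketch the flood mask is supplied by hand; in the real prototype it would be predicted by the trained networks.

```python
import numpy as np

def apply_flood_mask(image, mask, water_rgb=(60, 90, 110), opacity=0.8):
    """Blend a semi-transparent water layer into a photo wherever the
    mask marks flooded pixels.

    image: (H, W, 3) uint8 street-view photo
    mask:  (H, W) boolean array -- hypothetical here; in the actual
           system this would come from the trained neural networks.
    """
    out = image.astype(float)
    water = np.array(water_rgb, dtype=float)
    m = mask[..., None]  # broadcast the 2-D mask over the RGB channels
    out = np.where(m, (1 - opacity) * out + opacity * water, out)
    return out.astype(np.uint8)

# Toy 4x4 "photo" of uniform gray; flood the bottom two rows.
photo = np.full((4, 4, 3), 200, dtype=np.uint8)
flood = np.zeros((4, 4), dtype=bool)
flood[2:, :] = True
result = apply_flood_mask(photo, flood)
```

Dry pixels pass through untouched, while flooded pixels shift toward the water color, which is roughly the effect visible in the researchers’ before-and-after pairs.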
Because it’s still a proof of concept, the system isn’t live yet. (It also only generates low-resolution pictures; training the neural networks on higher-resolution images would take much longer.)
According to Alexandra Luccioni, one of the lead authors on the paper demonstrating the system, the team hopes to have a working version in September. “We are working on getting it to raise the level of water, so that it actually floods the house instead of just surrounding it,” she says. This more fleshed-out version won’t just replace grass with water. Instead, it will “incorporate other climate-related events (fires and droughts, etc.), varying time horizons, and ‘decision knobs’ allowing the viewer to choose actions and make decisions and see their impact on the projected consequences of climate change,” the researchers write.
But even if the machine learning models can be trained up, turning scientifically accurate climate projections into understandable visuals still isn’t a straightforward process. “Typically, meteorological climate models output probabilities of various variables like humidity and wind speed,” Luccioni explains. “What does 2mm of rain translate into, visually? It’s a matter of defining the right thresholds to realistically represent what a given quantity of water will result in at a given place.”
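The thresholding Luccioni describes can be sketched as a lookup from projected water depth to a rendering class. The cutoff values below are illustrative placeholders, not figures from the researchers’ climate models.

```python
def flood_visual_class(depth_m):
    """Map a projected standing-water depth (in meters) to a visual
    rendering class. Cutoffs are hypothetical, for illustration only.
    """
    if depth_m < 0.02:
        return "dry"           # under ~2 cm: no visible change
    if depth_m < 0.15:
        return "wet ground"    # darkened, reflective pavement
    if depth_m < 1.0:
        return "yard flooded"  # water replaces grass, as in the prototype
    return "house flooded"     # water rises above the first floor

print(flood_visual_class(0.002))  # 2 mm of rain -> "dry"
```

The hard part, as Luccioni notes, is not the lookup itself but choosing thresholds that realistically reflect what a given quantity of water looks like at a given place.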
For maximum visceral impact, a scientifically accurate time-lapse transformation from the present-day “normal” Street View of your home to a forecasted future version might be even better than a before-and-after still-photo swap. That will take even more AI magic than the researchers are currently working with. But just imagine if you could generate your own personal, GIF-ready version of An Inconvenient Truth on command. Whatever visualization these researchers end up pursuing, it’s heartening to know they’re facing another inconvenient truth head on: that seeing is believing.