While the bulk of the funding will go toward networking equipment and services, the cash could also be used to accelerate climate change research and, ultimately, to bolster funding for preventative measures. The Berkeley team hopes to eventually build a network capable of moving 1 terabit (1,000 gigabits) of data per second. That’s fast enough to accommodate the “next-generation” climate modeling data archive, which is expected to grow to 650 terabytes as massive computer models of storm and hurricane patterns come online. The current climate modeling data archive at Lawrence Livermore National Laboratory tops out at 35 terabytes.
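To put those figures in perspective, a quick back-of-the-envelope calculation shows what a 1 terabit-per-second link means for a 650-terabyte archive. This is just illustrative arithmetic using decimal units (1 TB = 10^12 bytes); the function name is our own, not anything from the Berkeley project.

```python
# Rough transfer-time arithmetic for the figures quoted above.
# Decimal units: 1 TB = 10**12 bytes, 1 Tbit/s = 10**12 bits/s.

def transfer_seconds(size_terabytes: float, link_terabits_per_s: float) -> float:
    """Seconds to move an archive of the given size over the given link."""
    bits = size_terabytes * 1e12 * 8          # archive size in bits
    return bits / (link_terabits_per_s * 1e12)

# The full 650 TB next-generation archive over the planned 1 Tbit/s network:
seconds = transfer_seconds(650, 1.0)
print(f"{seconds:.0f} s (~{seconds / 3600:.1f} hours)")  # 5200 s (~1.4 hours)
```

In other words, the planned network could ship the entire next-generation archive in well under two hours, which is what makes routine sharing of such datasets between labs plausible.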
Building new Ethernet networks is one way to move massive amounts of data quickly, but it’s not the only way. The Intel-backed Climateprediction.net project uses distributed computing (i.e., processing power borrowed from otherwise idle computers) to run climate change models. The more people join the project, the faster scientists can track weather patterns and gauge the severity of coming climate change.
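The volunteer-computing pattern behind a project like Climateprediction.net can be sketched in a few lines: a coordinator splits a parameter sweep into independent work units, and each volunteer machine runs one unit and reports a result. The sketch below is purely illustrative; every name and formula in it is hypothetical, the real project distributes full climate-model runs across the internet rather than threads on one machine, and the "sensitivity" calculation is a stand-in, not actual science.

```python
# Toy sketch of the volunteer-computing pattern: independent work units
# farmed out to workers. Real volunteer computing spans many machines;
# a thread pool here just stands in for that fan-out.
from concurrent.futures import ThreadPoolExecutor

def run_work_unit(params: dict) -> dict:
    """Stand-in for one climate-model run (hypothetical toy formula)."""
    sensitivity = 0.5 * params["co2_ppm"] / 280.0  # not real climate science
    return {"unit": params["unit"], "warming_c": round(sensitivity, 2)}

def run_sweep() -> list:
    # The coordinator's queue: one work unit per parameter combination.
    work_units = [{"unit": i, "co2_ppm": ppm}
                  for i, ppm in enumerate([280, 420, 560])]
    # More volunteers (workers) means more units finish in parallel.
    with ThreadPoolExecutor(max_workers=3) as pool:
        return list(pool.map(run_work_unit, work_units))

if __name__ == "__main__":
    for result in run_sweep():
        print(result)
```

The key property the paragraph describes falls out of the structure: each work unit is independent, so throughput scales roughly with the number of participating machines.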