
How NOAA Went From Hopeless to High Tech

Hurricane Sandy was a disaster. NOAA's new supercomputer isn't.

In 2012, when Hurricane Sandy was barreling down on the East Coast, the National Oceanic and Atmospheric Administration dropped the ball.

Its weather-predicting supercomputer, which runs the Global Forecast System model, was hideously out of date, far behind the capabilities of the European Centre for Medium-Range Weather Forecasts, whose model correctly predicted the storm. When Sandy took an abrupt left turn and headed straight for the Eastern Seaboard, NOAA undersold the storm's danger, thanks mostly to poor predictions from its weather computers. The storm wreaked havoc across New York, New Jersey, and other heavily populated areas, and the public called for a change.

Now, NOAA has a new computer: a massive, school-bus-sized machine capable of making 3 quadrillion calculations per second, roughly the processing power of 12,500 high-end laptops duct-taped together (note: taping laptops together does not actually increase their speed). After Sandy, Congress approved a huge infusion of emergency funds for NOAA, and by the time the upgrade was finished, the agency had spent $44.5 million to bring the U.S. system up to par with the ECMWF.
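For a rough sense of where that 12,500-laptop figure comes from, here's the back-of-the-envelope arithmetic as a quick Python sketch. The per-laptop number is an assumption for illustration; a high-end laptop of the era sustained on the order of a couple hundred gigaflops.

```python
# Back-of-the-envelope check of the "12,500 laptops" comparison.
# Assumption: a high-end laptop sustains about 240 gigaflops
# (2.4e11 floating-point operations per second). This is an
# illustrative figure, not a measured benchmark.

LUNA_FLOPS = 3e15        # 3 petaflops: 3 quadrillion operations per second
LAPTOP_FLOPS = 2.4e11    # assumed high-end laptop throughput

equivalent_laptops = LUNA_FLOPS / LAPTOP_FLOPS
print(f"{equivalent_laptops:,.0f} laptops")  # prints: 12,500 laptops
```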

Weather forecasting requires supercomputers because of the sheer volume of atmospheric data and variables pouring in; the problem is so information-dense that only the fastest computing technology available can crunch all the numbers in time. NOAA's new machine was built by Cray, the Seattle-based supercomputer manufacturer. David Michaud, director of the Office of Central Processing at the National Weather Service, told USA Today that NOAA's computer systems process more than twice as much information every day as the entire printed archives of the Library of Congress.
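To put that comparison in concrete terms, here's a tiny sketch of the implied daily data volume. The Library of Congress figure is an assumption here: its printed collection is commonly ballparked at around 10 terabytes of text.

```python
# Rough scale implied by the Library of Congress comparison.
# Assumption: the Library's printed collection is about 10 terabytes
# of text, a commonly cited ballpark estimate (not an official figure).

LOC_PRINTED_TB = 10                   # assumed size of printed archives (TB)
implied_daily_tb = 2 * LOC_PRINTED_TB

print(f"Implied NOAA intake: more than {implied_daily_tb} TB per day")
```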

The primary unit, known as "Luna," is in Reston, Virginia, but in case Mother Nature gets personal and decides to shoot the moon, Luna has a back-up unit named "Surge" in Orlando, Florida. Together, Luna and Surge can handle about 5.78 petaflops, or roughly 5.78 quadrillion calculations per second (Luna's 3 petaflops account for a little over half the total), which puts them roughly on par with the "Hazel Hen" supercomputer in Germany, another Cray creation.
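The split between the two machines is simple arithmetic: if Luna supplies 3 of the combined 5.78 petaflops, Surge accounts for the remaining 2.78, and Luna's share works out to roughly 52 percent, a little over half. A quick sketch:

```python
# Verifying the Luna/Surge capacity split described above.
TOTAL_PETAFLOPS = 5.78    # combined Luna + Surge capacity
LUNA_PETAFLOPS = 3.0      # Luna's contribution

surge_petaflops = TOTAL_PETAFLOPS - LUNA_PETAFLOPS
luna_share = LUNA_PETAFLOPS / TOTAL_PETAFLOPS

print(f"Surge: {surge_petaflops:.2f} petaflops")  # prints: Surge: 2.78 petaflops
print(f"Luna's share: {luna_share:.0%}")          # prints: Luna's share: 52%
```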

Cliff Mass, a meteorologist at the University of Washington, was one of the most vocal early critics of the outdated system. According to USA Today, Mass called Luna and Surge a "huge improvement over what they had."

Still, Luna and Surge pack only a fraction of the power of the world's largest supercomputers. China's Tianhe-2, which the Chinese government uses for "simulation, analysis, and government security applications," clocks in at a staggering 33.9 petaflops, and the American Titan, used for scientific research, can handle 17.59 petaflops. President Obama wants to build an even bigger one, but NOAA will have to be content with what it's got for the time being, as Luna and Surge are the only U.S. supercomputers fully devoted to weather forecasting.
