Supercomputer aids thunderstorm prediction

A research team from the University of Oklahoma and the US federal government is poised to dramatically improve weather forecasting with supercomputer analyses of the individual cells that make up severe thunderstorms and tornadoes.

The numerical weather predictions widely used today suffer from coarse resolution, focusing on geographical areas of 10 kilometres or more, says Ming Xue, director of Oklahoma's Center for Analysis and Prediction of Storms (CAPS).

Greater computational power than is generally available to forecasters is necessary to observe the progress of individual storm cells, which can be as small as a few kilometres. Major storm systems are composed of many such cells.

"Without such resolution, you can't really tell whether you will get thunderstorms or not," Xue says. A typical forecast "does not explicitly predict individual cells. From the routine forecasts all we get is three-hour accumulated precipitation. You can't really tell within the three hours when the precipitation actually falls."

CAPS has teamed with the National Oceanic & Atmospheric Administration (NOAA) to run analyses at 2-kilometre resolution across two-thirds of the United States, using a Cray XT supercomputer at the Pittsburgh Supercomputing Center.

They run 10 models simultaneously in what is known as ensemble forecasting, which averages the output of numerous models to lessen the impact of uncertainties and errors in any single forecast. The forecast runs, which took place from mid-April through early June, were the first in which Xue's team applied ensemble forecasting to individual storm cells.
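The averaging at the heart of ensemble forecasting can be illustrated with a minimal sketch. The member forecasts and grid values below are purely hypothetical, not data from the CAPS/NOAA runs; the point is only how combining members point by point damps the error of any single model.

```python
def ensemble_mean(members):
    """Average a list of equal-length forecast grids point by point.

    Each member is a flat list of values (e.g. forecast precipitation in mm
    at each grid point); the result is the ensemble-mean forecast.
    """
    if not members:
        raise ValueError("need at least one ensemble member")
    n = len(members)
    return [sum(values) / n for values in zip(*members)]

# Three hypothetical ensemble members forecasting precipitation (mm)
# at four grid points. Real ensembles perturb initial conditions or
# model physics to generate each member.
member_a = [0.0, 2.0, 5.0, 1.0]
member_b = [0.5, 1.5, 6.0, 0.5]
member_c = [0.1, 2.5, 4.0, 0.0]

forecast = ensemble_mean([member_a, member_b, member_c])
```

Where individual members disagree, the mean tends to sit closer to the truth than a randomly chosen member, which is why the technique reduces the impact of forecast uncertainty.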

Each analysis run takes eight hours to complete and produces forecasts extending 33 hours into the future.

Eventually, the strategies CAPS and NOAA have developed could improve predictions of all types of weather. The biggest roadblock to widespread adoption is availability of computing power.

"In order to do what we did, we needed 700 processors to run overnight," Xue says. "The actual routine availability of this operationally is more than five years away."

Xue was satisfied with how the system tracked individual cells about two-thirds of the time. Forecasts should become more accurate next year, when his researchers begin using radar data rather than relying solely on information from weather balloons, satellites and aircraft.

"We are the group who is best at using radar data, but it's a very significant effort. This year is the first year, so it took a lot of setup," Xue says.

The University of Oklahoma and the NOAA have enough funding to continue the experiments over the next two spring storm and tornado seasons, he says.

Each day during the experiment, trillions of bytes of data were generated, archived and transferred from the Pittsburgh Supercomputing Center to the National Weather Center in Oklahoma.
