Wednesday, 12 September 2007 07:18

Supercomputer helps improve storm forecasting

Written by David Stellmack

[Image: Focus on individual storm cells]

Although official weather records have been kept for at least the past 150 years, forecasting thunderstorm activity is still far from reliable.

The University of Oklahoma's Center for Analysis and Prediction of Storms (CAPS) has teamed up with the National Oceanic & Atmospheric Administration (NOAA), using a Cray supercomputer at the Pittsburgh Supercomputing Center in Pittsburgh, Pennsylvania, to improve storm forecasting through supercomputer analysis of the individual cells that make up tornadoes and severe thunderstorms.

Numerical weather predictions are imprecise because they rely on grid areas of 10 km or larger, which provide only a coarse picture of storm activity. Focusing on individual cells, and using a supercomputer to analyze thousands of them, yields a more precise measure of a storm's intensity.
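To put the resolution difference in perspective, here is a rough back-of-the-envelope sketch in Python. The 1,000 km domain size is an illustrative assumption, not a figure from the article; the point is simply how quickly the cell count grows as the grid is refined from 10 km to 2 km.

# Hypothetical sketch: grid cells needed over an assumed 1,000 km x 1,000 km domain.
DOMAIN_KM = 1000          # assumed square domain edge length

def grid_cells(resolution_km: float, domain_km: float = DOMAIN_KM) -> int:
    """Number of grid cells needed to cover the domain at a given resolution."""
    cells_per_side = domain_km / resolution_km
    return int(cells_per_side ** 2)

coarse = grid_cells(10)   # about 10,000 cells at 10 km resolution
fine = grid_cells(2)      # about 250,000 cells at 2 km resolution
print(f"10 km grid: {coarse:,} cells; 2 km grid: {fine:,} cells "
      f"({fine // coarse}x more)")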

CAPS and NOAA are using the Cray XT supercomputer to analyze 2 km grid areas in various parts of the U.S., running ten models in an "ensemble forecast" to help cancel out errors and produce more accurate results. The data is being collected and archived at the National Weather Center in Norman, Oklahoma.
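For readers unfamiliar with the term, ensemble forecasting means running several slightly different model instances and combining their outputs so that the errors of individual runs tend to cancel. The toy Python sketch below illustrates only that basic idea with made-up numbers; it is not the CAPS/NOAA workflow.

# Toy sketch of ensemble forecasting: perturbed members, ensemble mean and spread.
import random
from statistics import mean, stdev

def member_forecast(base_rainfall_mm: float, perturbation: float) -> float:
    """One hypothetical ensemble member: the base forecast plus random error
    standing in for perturbed initial conditions or model physics."""
    return base_rainfall_mm + random.gauss(0, perturbation)

base = 42.0                      # assumed underlying rainfall signal, in mm
members = [member_forecast(base, perturbation=5.0) for _ in range(10)]

ensemble_mean = mean(members)    # errors of individual members tend to cancel
ensemble_spread = stdev(members) # a large spread flags an uncertain forecast

print(f"members: {[round(m, 1) for m in members]}")
print(f"ensemble mean: {ensemble_mean:.1f} mm, spread: {ensemble_spread:.1f} mm")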

Read more here.

Last modified on Wednesday, 12 September 2007 10:42
