New supercomputers to improve forecast accuracy

Published 12:11 pm Saturday, July 2, 2022

By Skip Rigney

Last week’s scorching heat wave peaked on Friday and Saturday (June 24-25) with high temperatures of 100 and 99 degrees at Picayune Municipal Airport and 98 and 99 degrees at the Poplarville Experiment Station. Fortunately, as predicted by NOAA’s National Weather Service (NWS), the massive high pressure system in the middle levels of the atmosphere responsible for the unusual heat began to break down on Sunday, ushering in a week of increased clouds and showers in the Gulf South and a return to more typical daytime temperatures topping out near 90 degrees.

How were forecasters able to accurately predict the end of the heat wave several days in advance? The key, as with most modern weather forecasting, was the accuracy of predictions from computer weather models.

The basic physics equations that describe atmospheric motion have been known since the 1800s, although scientists have continued to refine them and add new equations for processes such as precipitation and energy transfer. The number of calculations needed to solve the equations and predict future states of the atmosphere remained impossibly large until the advent of digital computers in the 1950s.

Meteorologists who write computer software to build virtual models of the atmosphere have taken advantage of increasingly speedy computers to gradually improve their various models. Better weather observations, including from satellites, are input to the models and have been essential to increased accuracy. But without the incredible growth in computer speeds and storage capacity over the past 50 years, the improvements in model predictions would have been modest at best.

That’s why the news from the NWS this week warmed the hearts of professional forecasters and amateur weather geeks alike. On Wednesday the NWS made its first operational calculations on two new supercomputers that are three times faster than NWS’s previous system.

The twin supercomputers, one in Arizona named Cactus and the other in Virginia dubbed Dogwood, are capable of computing at speeds of 12.1 petaflops. A single petaflop is one quadrillion (that’s one thousand trillion) calculations per second.
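It is hard to picture numbers that large. A quick back-of-the-envelope calculation, using only the 12.1-petaflop figure above (the comparison to a person doing the math by hand is purely illustrative), gives a sense of the scale:

```python
# Rough sense of scale for 12.1 petaflops, the stated speed of the
# new NWS supercomputers. Illustrative arithmetic only.

PETAFLOP = 10**15  # one quadrillion calculations per second

speed = 12.1 * PETAFLOP  # calculations per second

# If a person did one calculation per second, how long would it take
# to match what the supercomputer does in a single second?
seconds_in_a_year = 60 * 60 * 24 * 365
years = speed / seconds_in_a_year

print(f"About {years / 1e6:,.0f} million years of one-per-second work")
```

In other words, one second of the new machines’ work would take a human calculator hundreds of millions of years.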

Although the new systems are capable of generating model predictions faster, that’s not the main goal. Instead the primary objectives are improved accuracy and better estimates of how much confidence forecasters can have in that day’s model runs.

According to the press release announcing the supercomputing milestone, “Enhanced computing and storage capacity will allow NOAA to deploy higher-resolution models to better capture small-scale features like severe thunderstorms, more realistic model physics to better capture the formation of clouds and precipitation, and a larger number of individual model simulations to better quantify model certainty. The end result is even better forecasts and warnings to support public safety and the national economy.”

By this fall an improved version of the NWS’s Global Forecast System (GFS) will be running on Cactus and Dogwood. Forecasters, including the NWS meteorologists in Slidell who issue forecasts and warnings for our local area as well as National Hurricane Center forecasters, rely heavily on the GFS.

Although it won’t be available for this year’s hurricane season, a new hurricane forecast model called the Hurricane Analysis and Forecast System (HAFS) should be operational on the new computer systems by 2023.

The next time I’m annoyed by the amount of federal income tax I’m paying, I’ll try to remember that I’m contributing to better weather forecasts and warnings.