Meteorologists use the shapes and movements of clouds in satellite images as indicators of several noteworthy types of severe storms. However, as satellite imagery grows ever higher in spatial and temporal resolution, meteorologists cannot fully exploit the data in their forecasts.
Now, a team of scientists at Penn State, AccuWeather, Inc., and the University of Almería in Spain has developed a computer model that can help weather forecasters spot potential severe storms more quickly and accurately. The model is based on machine-learning linear classifiers that identify rotational movements in clouds from satellite images, patterns that might otherwise go unnoticed.
During the study, the scientists worked with Wistar and other AccuWeather meteorologists to analyze more than 50,000 historical U.S. weather satellite images. In them, the experts identified and labeled the shape and motion of 'comma-shaped' clouds. These cloud patterns are strongly associated with cyclone formation, which can lead to severe weather events including hail, thunderstorms, high winds, and blizzards.
Using machine-learning algorithms, the scientists then taught computers to automatically recognize and detect comma-shaped clouds in satellite images. The computers can assist experts by pointing out, in real time, where in an ocean of data they should focus their attention in order to detect the onset of severe weather.
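The workflow described above, in which human-labeled satellite frames train a linear classifier that then flags comma-shaped cloud regions automatically, can be illustrated with a minimal sketch. The feature names, feature values, and the simple perceptron learner here are hypothetical stand-ins for illustration only, not the team's actual descriptors or model.

```python
# Minimal sketch of the labeling-and-training loop described in the article.
# Each image region is reduced to a small feature vector (the values below are
# hypothetical stand-ins for real shape/motion descriptors), and a basic
# perceptron plays the role of the linear classifier.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b for the linear decision rule sign(w.x + b)."""
    n_features = len(samples[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):   # y is +1 (comma-shaped) or -1
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:              # misclassified: nudge the boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Classify a region: +1 for comma-shaped, -1 otherwise."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy training set: [curvature, rotation_strength] per region (hypothetical).
regions = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
labels = [1, 1, -1, -1]                     # +1 = labeled comma-shaped

w, b = train_perceptron(regions, labels)
print(predict(w, b, [0.85, 0.9]))           # a strongly comma-like region -> 1
```

Once trained, such a classifier can be slid over incoming imagery so that forecasters are alerted only to regions scoring above the decision boundary, which is the "pointing out where to look" role the article describes.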
Steve Wistar, a senior forensic meteorologist at AccuWeather, said that having this tool to point his eye toward potentially threatening formations could help him make a better forecast.
"The very best forecasting incorporates as much data as possible," he said. "There's so much to take in, as the atmosphere is infinitely complex. By using the models and the data we have [in front of us], we're taking a snapshot of the most complete look of the atmosphere."
The system detected comma-shaped clouds with 99 percent accuracy, at an average of 40 seconds per prediction. It also predicted 64 percent of severe weather events, outperforming other existing severe-weather detection methods.
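The two figures quoted above measure different things: overall classification accuracy across all examined regions, and the fraction of actual severe-weather events caught (known in forecast verification as the probability of detection). The sketch below uses made-up confusion counts, not the study's data, to show how each is computed and why the two numbers can differ so widely.

```python
# Confusion counts for a hypothetical detector (illustrative values only,
# not from the study).
tp = 64     # severe events correctly flagged (true positives)
fn = 36     # severe events missed (false negatives)
fp = 10     # false alarms (false positives)
tn = 9890   # non-events correctly ignored (true negatives)

accuracy = (tp + tn) / (tp + tn + fp + fn)  # share of all decisions correct
pod = tp / (tp + fn)                        # share of real events detected

# With non-events vastly outnumbering events, accuracy can be near-perfect
# even while the detection rate is much lower.
print(round(accuracy, 3), round(pod, 2))    # -> 0.995 0.64
```

This is why severe-weather systems are judged on both numbers: a detector that never fires would still score high accuracy on such imbalanced data, but its probability of detection would be zero.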
Rachel Zheng, a doctoral student in the College of Information Sciences and Technology at Penn State, said, "Our method can capture most human-labeled, comma-shaped clouds. Moreover, our method can detect some comma-shaped clouds before they are fully formed, and our detections are sometimes earlier than human eye recognition."
Wistar said, “The calling of our business is to save lives and protect property. The more advanced notice to people that would be affected by a storm, the better we’re providing that service. We’re trying to get the best information out as early as possible.”
James Wang, professor in the College of Information Sciences and Technology at Penn State, said, "We recognized when our collaboration began [with AccuWeather in 2010] that a significant challenge facing meteorologists and climatologists was in making sense of the vast and continually increasing amount of data generated by Earth observation satellites, radars, and sensor networks."
“It is essential to have computerized systems analyze and learn from the data so we can provide a timely and proper interpretation of the data in time-sensitive applications such as severe-weather forecasting.”
"This research is an early attempt to show the feasibility of artificial intelligence-based interpretation of weather-related visual information to the research community. More research to integrate this approach with existing numerical weather-prediction models and other simulation models will likely make weather forecasts more accurate and useful to people."
In addition to Zheng, Wang and Wistar, the research team included Yukun Chen, doctoral student in the College of IST; Jianbo Ye, former doctoral student in the College of IST and current applied scientist at Amazon Lab 126; Jia Li, professor of statistics in Penn State's Eberly College of Science; Jose Piedra-Fernandez, collaborating faculty member at the University of Almería; and Michael Steinberg, senior vice president at AccuWeather, Inc.
The researchers’ work was supported in part by the National Science Foundation, the Amazon AWS Cloud Credits for Research Program, and the NVIDIA Corporation’s GPU Grant Program, and was published in the June 6, 2019, issue of IEEE Transactions on Geoscience and Remote Sensing.