Machine learning (ML) and AI can be used to diagnose breast cancer, surveil large populations, or help make riding a motorcycle safer, but there’s another seemingly mundane task to which both are well-suited: predicting the weather. Google says it’s using ML to “nowcast” precipitation five or 10 minutes ahead of time, instead of the hours required by conventional weather modeling systems.
In a blog post on Monday, Jason Hickey, a senior software engineer at Google Research, says that machine learning systems trained on existing data significantly reduce the computational demands compared to traditional solutions, while also generating high-resolution forecasts.
Why does this matter? — Because if a conventional forecast takes six hours to compute, it’s only possible to run a handful of them each day, and they can’t predict potentially catastrophic events before they happen. Further, existing computational limits mean forecasts often have a resolution of around 5km (compared to Google’s 1km), which is insufficiently precise for dense urban areas.
To train its model, Google used weather data for the continental U.S. collected between 2017 and 2019. It split the data into four-week sections, using the first three weeks of each for training and the last to assess the quality of the model’s predictions.
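The four-week split described above can be sketched as a simple rolling window over time-ordered observations. This is a hypothetical illustration, not Google’s actual pipeline; the function name, the hourly sampling rate, and the placeholder data are all assumptions.

```python
# Illustrative sketch of a four-week train/eval split: the first three
# weeks of each window are used for training, the last week for
# evaluating predictions. Names and sampling rate are assumptions.

def four_week_splits(observations, obs_per_day=24):
    """Yield (train, eval) pairs from time-ordered observations."""
    window = 28 * obs_per_day      # four weeks of observations
    train_len = 21 * obs_per_day   # first three weeks for training
    for start in range(0, len(observations) - window + 1, window):
        block = observations[start:start + window]
        yield block[:train_len], block[train_len:]

# Example: two years of hourly readings (integer placeholders here,
# standing in for radar/precipitation measurements).
data = list(range(2 * 365 * 24))
splits = list(four_week_splits(data))
```

Splitting on time boundaries like this, rather than shuffling randomly, keeps each evaluation week strictly after its training weeks, which matters for forecasting tasks where the model must not see the future.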
Google compared the output from its model to that of three major existing models, and it outperformed them every time, in the short term at least. The High-Resolution Rapid Refresh (HRRR) model outperformed Google’s over longer timelines. But given that Google’s aim is shorter-term prediction, and its system is far simpler computationally, that’s both reasonable and to be expected.
If you’re interested in reading the whole research paper, you’ll find it over here.