The shifting landscape of numerical weather prediction
I work at a government-funded institution that delivers weather forecasts to the public, using a combination of numerical modelling and observational data. For the past 40 or 50 years, numerical weather models have traditionally been developed through collaborations between academia and national weather services. In Europe, examples of such collaborations include the COSMO consortium and ACCORD (which brought together the former HIRLAM and ALADIN consortia). This has driven a so-called "quiet revolution" in numerical weather prediction, with forecast skill improving slowly but steadily over the last few decades.
Unlike American models such as WRF and MPAS, which are fully open source, most European models are closed source and accessible only to partnering institutions. However, the recent surge in data-driven AI models is dramatically reshaping this landscape: the new AI models are typically open source and require far fewer computational resources than traditional physics-based models. I believe the open-source nature of many of these models has contributed to the massive increase in data-driven weather models. Their skill, meanwhile, seems to be improving at a much faster rate than that of the traditional physics-based models mentioned above. A good overview of how the models compare on standard scores can be found in Stephan Rasp's spreadsheet, updated regularly here.
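To make the comparison concrete, most of these scorecards boil down to a handful of headline metrics, the most common being the latitude-weighted RMSE of upper-air fields such as 500 hPa geopotential, verified against ERA5. Below is a minimal Python sketch of that metric; the file and variable names are hypothetical stand-ins, not tied to any particular model.

```python
# Minimal sketch of the headline metric behind most AI-model scorecards:
# latitude-weighted RMSE of a forecast field against the verifying ERA5
# analysis. File and variable names ("z500") are illustrative only.
import numpy as np
import xarray as xr

forecast = xr.open_dataset("forecast.nc")["z500"]  # hypothetical forecast file
analysis = xr.open_dataset("era5.nc")["z500"]      # verifying ERA5 field

# Grid cells shrink toward the poles, so squared errors are weighted by
# cos(latitude) before averaging -- otherwise polar points dominate the score.
weights = np.cos(np.deg2rad(forecast["latitude"]))
sq_err = (forecast - analysis) ** 2
rmse = float(np.sqrt(sq_err.weighted(weights).mean(("latitude", "longitude"))))
print(f"Latitude-weighted RMSE (z500): {rmse:.1f} m^2/s^2")
```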
I remember attending the EWGLAM conference a bit over two years ago, where people from all the major weather institutes were present. There was only one presentation on data-driven/AI models, given by a researcher from NVIDIA (Stan Posey). At the time I had the impression that most NWP researchers were not too interested in the talk, since the model was global, had a relatively coarse resolution, was trained on ERA5 data and produced only half a dozen output variables. The presentation received a single question from the audience. More than two years later, data-driven models based on graph, diffusion or transformer architectures are commonplace. This page lists more than a dozen, and the list keeps growing. Very few of these models come from public institutions; most are from private companies like Google, IBM and Microsoft.
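Despite the architectural variety, nearly all of these models share the same basic forecasting loop: a network is trained to map the atmospheric state at one time to the state a few hours later, and longer forecasts are produced autoregressively, feeding each prediction back in as the next input. Here is a toy sketch of that rollout; `model` is just a placeholder, not any published model's API.

```python
# A toy sketch of the autoregressive rollout shared by most data-driven
# weather models, whether graph-, diffusion- or transformer-based.
import numpy as np

def model(state: np.ndarray) -> np.ndarray:
    """Placeholder for a trained network: state (vars, lat, lon) -> next state."""
    return state  # a real model would apply learned weights here

state = np.zeros((6, 721, 1440))  # e.g. 6 variables on the 0.25-degree ERA5 grid
steps = 20                        # 20 steps x 6 h = a 5-day forecast

trajectory = [state]
for _ in range(steps):
    state = model(state)          # each output becomes the next input
    trajectory.append(state)
```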
Besides these models, there are dozens of new companies out there developing their own. All of them offer some form of general AI-based solution that aims to cover all possible scales. For example, tomorrow.io offers a variety of solutions for weather- and climate-related events using AI models. Similar offerings come from myzeus.ai and sirulian.ai; these fall into the general "Earth Intelligence" category. A Swiss company called jua.ai uses a foundation model for weather prediction, which seems to be focused mainly on solar forecasting and renewables. Finally, worldsphere.ai seems to have a more specialized focus on risk, especially the effects of extreme events like hurricanes, using a diffusion-based model.
Not all of the new data-driven models come from private companies, though. Public institutions like ECMWF are rapidly catching up with the new developments, having released their own AI version of the IFS (the AIFS), which is already running operationally. A good discussion can be found here.
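For the curious, AIFS forecasts are distributed as GRIB files through ECMWF's open-data service, so poking at one takes only a few lines of Python once a file has been downloaded. A small sketch, assuming a local GRIB file (the name is hypothetical) and the cfgrib engine for xarray:

```python
# Inspecting a downloaded forecast GRIB file with xarray's cfgrib engine.
# The file name is hypothetical; a file mixing several level types may need
# backend_kwargs={"filter_by_keys": {...}} to open cleanly.
import xarray as xr

ds = xr.open_dataset("aifs_forecast.grib2", engine="cfgrib")
print(ds.data_vars)  # forecast variables found in the file
print(ds["t2m"])     # e.g. 2 m temperature, if present in this file
```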
It seems the future of weather forecasting is not going to rest mainly in the hands of a few public governmental and academic institutions. It may end up being a mix of private and public collaborations, as recently speculated in this interesting discussion by Peter Bauer. In any case, given the fast pace of development in this area, it is also entirely possible that public weather services will end up, if not entirely obsolete, then very different from what they are today. The recent emergence of cheaper and faster models like DeepSeek-R1 seems to point in that general direction.