| Integrated Forecasting System | |
|---|---|
| Name | Integrated Forecasting System |
| Developer | ECMWF |
| Initial release | 1990s |
| Latest release | continuous development |
| Programming language | Fortran, C |
| Operating system | Unix, Linux |
| License | proprietary with partner agreements |
Integrated Forecasting System
The Integrated Forecasting System (IFS) is an operational numerical weather prediction platform developed and maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF), providing global atmospheric, oceanic, and coupled forecasts. It supports a wide range of services for organizations such as the Met Office, NOAA, the WMO, EU agencies, and national meteorological services including Météo-France and the Deutscher Wetterdienst. The system integrates model dynamics, data assimilation, ensemble forecasting, and post-processing used by stakeholders such as ICAO, ESA, and the UN.
The system comprises dynamical core models, coupled components, ensemble systems, and operational workflows maintained at ECMWF's headquarters in Reading and coordinated with partner centres including Met Éireann, IPMA, and KNMI. It delivers products via portals accessed by ECMWF member states, the European Flood Awareness System (EFAS), the Copernicus services, and research projects funded under the Horizon 2020 and Horizon Europe programmes. The architecture emphasizes reproducibility and high-performance computing, on systems supplied by vendors such as Atos and Cray and on cloud resources from AWS and national supercomputing centres.
Development began in the late 20th century as part of ECMWF's mission under the ECMWF Convention and evolved alongside milestones such as the adoption of spectral models in the 1970s and the introduction of operational ensemble prediction at ECMWF in the 1990s, building on Edward N. Lorenz's work on chaos and predictability. Collaborations with institutions including NCAR, the Max Planck Institute, and the Met Office drove advances in data assimilation, in a field pioneered by Lewis Fry Richardson and shaped by filtering theory associated with Rudolf Kalman. Major upgrades coincided with advances in satellite observing systems from NOAA and EUMETSAT missions and with international initiatives such as the Global Observing System.
The system's architecture integrates a spectral dynamical core, finite-volume components, land-surface modules, and ocean, sea-ice, and wave models. Core components interoperate with coupled systems such as the NEMO ocean model and the LIM sea-ice model, plus parametrizations developed in partnership with INGV and Scripps. Operational orchestration uses workflow managers compatible with HPC schedulers from vendors such as IBM and HPE, and software frameworks influenced by projects at ESA and JPL.
Observational inputs include data from polar-orbiting and geostationary satellites such as the Metop, GOES, and Meteosat instruments; radiosonde networks coordinated by the WMO; aircraft reports from commercial aviation; buoy data from the Global Drifter Program and other marine observing systems; and remote sensing from ESA and NASA missions. Data assimilation builds on 4D-Var and hybrid ensemble-variational schemes developed in collaboration with researchers at Princeton and Reading; these techniques assimilate observations from platforms such as COSMIC and GRACE and apply quality-control protocols used by ECMWF and national centres.
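As a minimal illustration of the variational idea behind these schemes, the sketch below solves a scalar analogue of the variational cost function, in which the analysis balances a background estimate against an observation according to their error variances. The function name and all numbers are illustrative; the operational IFS uses incremental 4D-Var over a time window with full covariance matrices and a nontrivial observation operator.

```python
def var_analysis(xb, b_var, y, r_var):
    """Minimise the scalar cost J(x) = (x - xb)**2 / b_var + (y - x)**2 / r_var.

    xb    : background (prior) estimate of the state
    b_var : background error variance
    y     : observation (identity observation operator assumed)
    r_var : observation error variance
    The minimiser has the closed form below (a Kalman-style update).
    """
    gain = b_var / (b_var + r_var)      # weight given to the innovation
    return xb + gain * (y - xb)         # analysis = background + gain * (y - xb)

# Equal background and observation error variances: analysis sits midway.
analysis = var_analysis(xb=280.0, b_var=1.0, y=282.0, r_var=1.0)
print(analysis)  # 281.0
```

With a perfect background (`b_var=0`) the analysis ignores the observation entirely, and with a perfect observation (`r_var=0`) it matches the observation; the full 4D-Var generalises this trade-off to millions of state variables and observations.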
Forecasting employs deterministic and ensemble prediction systems that use stochastic perturbation schemes, model-uncertainty parameterizations, and multi-model post-processing techniques influenced by research at the University of Washington and Imperial College London and by studies published in journals such as Tellus. The dynamical core is augmented by advanced microphysics, convection, and radiation schemes, and forecasts are intercompared with external models such as WRF and GFS. Probabilistic outputs draw on statistical post-processing, bias correction, and machine-learning collaborations with institutions including ETH Zurich and MIT.
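The effect of initial-condition perturbations in an ensemble can be sketched with a toy chaotic map standing in for the forecast model; `toy_model`, the perturbation size, and the member count below are hypothetical stand-ins, not IFS components, but they show how tiny perturbations grow into a measurable ensemble spread.

```python
import random

def toy_model(x, steps=10):
    """A chaotic logistic map standing in for a forecast model:
    small initial differences are amplified at each iteration."""
    for _ in range(steps):
        x = 3.7 * x * (1.0 - x)
    return x

def run_ensemble(x0, n_members=20, pert=1e-4, seed=42):
    """Perturb the initial state for each member, run the model,
    and return the ensemble mean and standard deviation (spread)."""
    rng = random.Random(seed)
    members = [toy_model(x0 + rng.gauss(0.0, pert)) for _ in range(n_members)]
    mean = sum(members) / n_members
    spread = (sum((m - mean) ** 2 for m in members) / n_members) ** 0.5
    return mean, spread

mean, spread = run_ensemble(0.3)
print(mean, spread)  # spread is far larger than the 1e-4 initial perturbation
```

In operational practice the perturbations come from singular vectors and ensemble data assimilation rather than white noise, and the spread is used as a flow-dependent estimate of forecast uncertainty.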
Outputs support aviation planning by ICAO and airlines; hydrological forecasting for agencies such as the European Flood Awareness System and national river authorities; emergency management by UN OCHA and civil-protection agencies; renewable-energy forecasting for offshore wind operators; and climate reanalyses used in IPCC assessments and in academic research at Columbia and UCL. The system underpins Copernicus Climate Change Service (C3S) products and supports numerical guidance for sporting events and for maritime operations coordinated with the IMO.
Verification protocols follow WMO guidelines and use metrics developed with partners such as ECMWF's verification teams, UK Met Office scientists, and research consortia at KIT. Limitations include sensitivity to sparse observation coverage in polar regions, biases tied to model parametrizations studied at Bergen and Reading, resolution constraints set by the supercomputing resources available from centres such as CINECA and PRACE, and difficulties in representing coupled feedbacks highlighted by researchers at the Scripps Institution of Oceanography and the Max Planck Institute for Meteorology. Continuous development relies on community collaboration through WMO panels, ECMWF workshops, and international research programmes.
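One standard metric in the WMO-style probabilistic verification toolbox is the Brier score, the mean squared difference between forecast probabilities and binary outcomes (lower is better). The sketch below uses made-up forecast/outcome pairs purely for illustration:

```python
def brier_score(probs, outcomes):
    """Brier score for probabilistic forecasts of a binary event.

    probs    : forecast probabilities in [0, 1]
    outcomes : observed outcomes, 1 if the event occurred else 0
    Returns the mean of (p - o)**2; 0 is a perfect forecast.
    """
    if len(probs) != len(outcomes):
        raise ValueError("probs and outcomes must have the same length")
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Illustrative data: four forecasts of a rain/no-rain event.
score = brier_score([0.9, 0.1, 0.7, 0.2], [1, 0, 1, 0])
print(score)  # (0.01 + 0.01 + 0.09 + 0.04) / 4 = 0.0375
```

Operational scorecards combine such metrics (Brier score, CRPS, anomaly correlation) across variables, lead times, and regions rather than reporting a single number.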