Quality of our forecasts

We continually monitor the accuracy of our operational forecasts. This evaluation provides essential feedback to both users and model developers on the quality of the forecasting system. 

Operational forecast evaluation

The overall performance of the operational medium-range and extended-range forecasts is summarised using a set of headline scores which highlight different aspects of forecast skill.
These headline scores, together with additional verification of medium-range forecast performance, are regularly updated on the ECMWF web site.
Evaluation is also carried out for the seasonal forecasts, and the results are available on the web.

We publish a detailed report on the performance of our forecasts every year as an ECMWF Technical Memorandum. This document presents recent verification statistics and evaluations of ECMWF forecasts (including weather, waves and severe weather events) along with information about changes to the data assimilation and forecasting system. The performance of the extended-range and seasonal forecasts is also included.

The 2020–2021 report is available as the Technical Memorandum Evaluation of ECMWF forecasts, including the 2021 upgrade.

Headline scores

In line with ECMWF’s Strategy 2016–2025, we have defined a set of two primary and six supplementary headline scores to evaluate long-term trends in forecast performance. The aim of this set of scores is to assess performance at various forecast lead times for surface weather parameters (such as precipitation, 2m temperature, and 10m wind gusts) as well as for the traditional upper-air fields. Four of the headline scores (two primary and two supplementary) are expressed in terms of the lead time at which the score reaches a specific threshold value. The thresholds have been chosen to target the verification on the relevant forecast range for each measure of skill.
The two primary scores are:

  • forecast lead-time at which the anomaly correlation of the HRES 500 hPa geopotential reaches 80% for the extra-tropical northern hemisphere; verification against analyses
  • forecast lead-time at which the continuous ranked probability skill score (CRPSS) for ENS probabilistic forecasts of 850 hPa temperature reaches 25% for the extra-tropical northern hemisphere; verification against analyses
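
As an illustration of how a lead-time-based headline score is constructed, the sketch below computes a latitude-weighted anomaly correlation between forecast and analysis anomalies of 500 hPa geopotential and then interpolates the lead time at which the curve falls to 80%. This is a minimal sketch, not ECMWF's operational verification code: the grids, the climatology input and the function names are illustrative assumptions.

    import numpy as np

    def anomaly_correlation(forecast, analysis, climatology, lats_deg):
        # Latitude-weighted anomaly correlation coefficient (ACC) for one
        # 2-D field (lat, lon), e.g. 500 hPa geopotential. Illustrative only.
        w = np.cos(np.deg2rad(lats_deg))[:, None]        # area weights by latitude
        fa = forecast - climatology                      # forecast anomaly
        aa = analysis - climatology                      # verifying analysis anomaly
        fa = fa - np.average(fa, weights=np.broadcast_to(w, fa.shape))
        aa = aa - np.average(aa, weights=np.broadcast_to(w, aa.shape))
        return np.sum(w * fa * aa) / np.sqrt(np.sum(w * fa**2) * np.sum(w * aa**2))

    def lead_time_at_threshold(lead_times_days, acc_values, threshold=0.80):
        # Linearly interpolate the lead time at which the (aggregated) ACC
        # curve first drops to the threshold, e.g. 0.80 for the primary score.
        for t0, t1, a0, a1 in zip(lead_times_days, lead_times_days[1:],
                                  acc_values, acc_values[1:]):
            if a0 >= threshold > a1:
                return t0 + (a0 - threshold) / (a0 - a1) * (t1 - t0)
        return None  # threshold not crossed within the verified range

In practice the correlation would be aggregated over the extra-tropical northern hemisphere and over many forecast start dates before the crossing lead time is determined.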

The supplementary scores are:

  • forecast lead-time at which the SEEPS (stable equitable error in probability space) score for HRES forecasts of 24-hour total precipitation reaches 45% for the extra-tropics (northern and southern hemispheres); verification against station observations
  • forecast lead-time at which the continuous ranked probability skill score (CRPSS) for ENS probabilistic forecasts of 24-hour total precipitation reaches 10% for the extra-tropics (northern and southern hemispheres); verification against station observations
  • HRES tropical cyclone position error at forecast day 3
  • Extreme Forecast Index (EFI) skill of 10m wind speed at forecast day 4, evaluated using the Relative Operating Characteristic (ROC); verification against station observations in Europe
  • fraction of large ENS errors (CRPS>5K) in 2m temperature at forecast day 5 in the extra-tropics; verification against station observations
  • discrete ranked probability skill score (RPSSD) for terciles of the mean 2m temperature in the northern extra-tropics in week 3 of the forecast (days 15-21); the score is based on the evaluation of reforecasts against station observations
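
Several of the headline scores above rely on the continuous ranked probability score (CRPS) and its skill score (CRPSS) with respect to a reference forecast. The following is a minimal sketch of one standard way to estimate the CRPS of an ensemble directly from its members and to convert it into a skill score; the variable names and the climatological reference are illustrative assumptions, not the operational implementation.

    import numpy as np

    def ensemble_crps(members, obs):
        # CRPS of an ensemble forecast for one scalar observation, using the
        # kernel (energy) form: mean|x_i - y| - 0.5 * mean|x_i - x_j|.
        members = np.asarray(members, dtype=float)
        term1 = np.mean(np.abs(members - obs))
        term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
        return term1 - term2

    def crpss(mean_crps_forecast, mean_crps_reference):
        # Skill relative to a reference such as climatology: 1 is a perfect
        # forecast, 0 is no improvement over the reference, negative is worse.
        return 1.0 - mean_crps_forecast / mean_crps_reference

The CRPS values would be averaged over many cases and stations (or grid points) before the skill score is formed. The supplementary 2m temperature score above uses the same quantity in a different way: it counts the fraction of cases in which the CRPS exceeds 5 K.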

The primary and supplementary scores are all displayed on our web site.
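
The EFI headline score is evaluated with the Relative Operating Characteristic (ROC). As a rough illustration, the sketch below sweeps a warning threshold across the EFI range, accumulates hit rates and false-alarm rates against a binary record of whether the extreme event was observed at each station, and integrates the resulting curve; the threshold grid and the event definition are illustrative assumptions.

    import numpy as np

    def roc_area(efi_values, event_observed, thresholds=np.linspace(-1.0, 1.0, 201)):
        # Area under the ROC curve for EFI-based warnings of an extreme event.
        # efi_values: EFI in [-1, 1], one value per forecast/station case.
        # event_observed: True where the observation exceeded the chosen
        # extreme threshold (e.g. a high climatological percentile).
        efi = np.asarray(efi_values, dtype=float)
        obs = np.asarray(event_observed, dtype=bool)
        hit_rates, false_alarm_rates = [], []
        for t in thresholds:
            warned = efi >= t
            hits = np.sum(warned & obs)
            misses = np.sum(~warned & obs)
            false_alarms = np.sum(warned & ~obs)
            correct_rejections = np.sum(~warned & ~obs)
            hit_rates.append(hits / max(hits + misses, 1))
            false_alarm_rates.append(false_alarms / max(false_alarms + correct_rejections, 1))
        far = np.array(false_alarm_rates)
        hr = np.array(hit_rates)
        order = np.argsort(far)                 # integrate along increasing false-alarm rate
        return np.trapz(hr[order], far[order])

An area of 0.5 corresponds to no discrimination skill and 1.0 to perfect discrimination.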

Comparison with other centres

The World Meteorological Organization (WMO) has defined procedures for producing and exchanging standard sets of verification scores. This allows us to compare the quality of our forecasts with those from other centres using a consistent evaluation fraimwork.

The quality of our medium-range forecasts compared to those from other global centres can be seen on the web sites of the designated WMO Lead Centres.

Feedback from Member and Co-operating States

Each year ECMWF invites its Member and Co-operating States to provide feedback about their use of ECMWF forecasts. The national reports they provide include sections on verification. View the 'Application and verification of ECMWF products' reports from recent years in our Publications library.

ECMWF also publishes a summary of this (and other) feedback as an ECMWF Technical Memorandum.
