PERFORMANCE METRICS
- Develop milestones for new / improved warm season precipitation forecasts
- Develop new performance metrics tied to these forecasts:
- Develop “quantitative” future performance goals
- Explicitly tie these to performance measures in operational meteorology (e.g. NWS HPC and CPC).
- QPF (quantitative precipitation forecast; day 1, day 2, day 3)
- 6-10 day forecast skill (Heidke skill score)
- monthly and seasonal forecast skill
- NAME SWG needs to develop a strategic plan for this. Workshop needed?
Notes:
- Message
- We are demonstrating skill for seasonal temperature forecasts
- Skill trends have been influenced by climate patterns (climate signals in 2001 and 2002 were weak compared to those in 1997-1998)
- There is room for improvement with added supercomputer capacity to run ensembles and coupled models; we anticipate improved skill in this area
- FY01 Goal: 20; FY01 Actual: 20
- Key Data Points
- The measure compares actual observed temperatures with forecast temperatures at locations around the country
- There are approximately 100 forecast points across the country
- Verification is done only at points where the forecast differs from climatology, i.e. points where the forecast calls for above- or below-normal temperatures rather than equal chances of below, near, or above normal
- This score measures how much better the predictions are than random forecasts
- The score for a random forecaster is zero
- A skill score of 20 is considered good: it means the forecast was correct at almost 50% of the locations forecast
- Expanded computing capacity on the new NWS supercomputer
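The notes do not spell out the scoring formula. As an illustrative sketch, the following implements one standard form of the Heidke skill score for three-category (below/near/above normal) forecasts, which reproduces the two statements above: a random forecaster scores zero, and a score of 20 corresponds to correct forecasts at almost 50% of points. The function name and this particular formulation are assumptions for illustration, not taken from the notes.

```python
def heidke_skill_score(n_correct: float, n_total: float,
                       n_categories: int = 3) -> float:
    """Heidke skill score, expressed as a percentage.

    Compares the number of correct categorical forecasts (n_correct)
    against the number expected by chance (n_total / n_categories):
    a random forecaster scores 0, a perfect forecaster scores 100.
    """
    expected = n_total / n_categories  # hits expected from random guessing
    return 100.0 * (n_correct - expected) / (n_total - expected)

# A random 3-category forecaster is right at about 1/3 of points, scoring ~0:
print(heidke_skill_score(n_correct=100 / 3, n_total=100))

# Being correct at 47 of ~100 forecast points ("almost 50%") yields a
# score of about 20, matching the FY01 goal/actual above:
print(heidke_skill_score(n_correct=47, n_total=100))
```

Under this formulation, a score of 20 with three categories works out to correct forecasts at 7/15 (about 47%) of verified points, consistent with the "almost 50%" figure in the notes.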