Measure of Forecast Challenge and Predictability Horizon Diagram Index for Ensemble Models
Source: Weather and Forecasting, 2019, volume 34, issue 3, page 603
DOI: 10.1175/WAF-D-18-0114.1
Publisher: American Meteorological Society
Abstract: Responding to the call for new verification methods in a recent editorial in Weather and Forecasting, this study proposed two new verification metrics to quantify the forecast challenges that a user faces in decision-making when using ensemble models. The measure of forecast challenge (MFC) combines forecast error and uncertainty information into a single score. It consists of four elements: ensemble mean error, spread, nonlinearity, and outliers. The cross correlation among the four elements indicates that each element contains independent information. The relative contribution of each element to the MFC is analyzed by calculating the correlation between each element and MFC. The biggest contributor is the ensemble mean error, followed by the ensemble spread, nonlinearity, and outliers. By applying MFC to the predictability horizon diagram of a forecast ensemble, a predictability horizon diagram index (PHDX) is defined to quantify how the ensemble evolves at a specific location as an event approaches. The value of PHDX varies between −1.0 and 1.0. A positive PHDX indicates that the forecast challenge decreases as an event nears (type I), providing credible forecast information to users. A negative PHDX value indicates that the forecast challenge increases as an event nears (type II), providing misleading information to users. A near-zero PHDX value indicates that the forecast challenge remains large as an event nears, providing largely uncertain information to users. Unlike current verification metrics that verify at a particular point in time, PHDX verifies a forecasting process through many forecasting cycles. Forecasting-process-oriented verification could be a new direction in model verification. The sample ensemble forecasts used in this study are produced from the NCEP global and regional ensembles.
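The abstract names the four MFC elements and the sign convention of PHDX but does not give their formulas. The sketch below is an illustrative toy only: the functions `toy_mfc` and `toy_phdx`, and every element definition in them (median-based nonlinearity, percentile-based outlier fraction, a lead-time correlation for PHDX), are placeholder assumptions, not the paper's actual definitions. It shows the general idea of combining error and uncertainty elements into one score per forecast cycle, then summarizing how that score evolves as the event nears.

```python
import numpy as np

def toy_mfc(ensemble, obs):
    """Toy 'measure of forecast challenge' for one forecast cycle.

    The real MFC combines ensemble mean error, spread, nonlinearity,
    and outliers; the element proxies below are simplified
    placeholders for illustration, not the published formulas.
    """
    mean = ensemble.mean()
    error = abs(mean - obs)                # ensemble mean error
    spread = ensemble.std()                # ensemble spread
    # Placeholder nonlinearity proxy: mean-median asymmetry.
    nonlinearity = abs(mean - np.median(ensemble))
    # Placeholder outlier proxy: fraction of members outside the
    # 10th-90th percentile range.
    lo, hi = np.percentile(ensemble, [10, 90])
    outliers = np.mean((ensemble < lo) | (ensemble > hi))
    return error + spread + nonlinearity + outliers

def toy_phdx(mfc_by_lead):
    """Toy PHDX-like index in [-1, 1] (NOT the paper's definition).

    Takes MFC values ordered from longest lead time to shortest and
    returns the correlation between lead time and MFC: positive when
    the forecast challenge shrinks as the event nears (type I),
    negative when it grows (type II), near zero when it stays flat.
    """
    leads = np.arange(len(mfc_by_lead), 0, -1)  # longest lead first
    return np.corrcoef(leads, mfc_by_lead)[0, 1]
```

A type I location would feed `toy_phdx` a sequence of per-cycle MFC values that declines toward the event, yielding a value near +1; a type II location yields a value near −1.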
| contributor author | Du, Jun | |
| contributor author | Zhou, Binbin | |
| contributor author | Levit, Jason | |
| date accessioned | 2019-10-05T06:44:30Z | |
| date available | 2019-10-05T06:44:30Z | |
| date copyright | 2019-01-21 | |
| date issued | 2019 | |
| identifier other | WAF-D-18-0114.1.pdf | |
| identifier uri | http://yetl.yabesh.ir/yetl1/handle/yetl/4263278 | |
| publisher | American Meteorological Society | |
| title | Measure of Forecast Challenge and Predictability Horizon Diagram Index for Ensemble Models | |
| type | Journal Paper | |
| journal volume | 34 | |
| journal issue | 3 | |
| journal title | Weather and Forecasting | |
| identifier doi | 10.1175/WAF-D-18-0114.1 | |
| journal firstpage | 603 | |
| journal lastpage | 615 | |
| tree | Weather and Forecasting; 2019; volume 034; issue 003 | |
| contenttype | Fulltext |