Timothy DelSole

and 2 more

Climate models initialized near the observed state tend to drift toward their own climatology. This drift is typically removed during post-processing by subtracting a lead-time- and start-month-dependent climatology estimated from a recent 30-year period. Unfortunately, this method cannot correct long-term trend errors. This paper proposes an alternative approach that corrects both mean and trend errors, as well as several other types of errors. The core idea is to fit observations and forecasts to separate forced autoregressive (ARX) models, and then use the ARX models to predict the forecast error, which may then be removed. This approach is illustrated with climate forecasts from the SPEAR model, a contributor to the North American Multi-Model Ensemble (NMME). The proposed method is shown to outperform traditional corrections. The superior performance arises because SPEAR has non-stationary errors, in the form of trend and initialization errors, that cannot be corrected by the traditional method. Comparison of the SPEAR and observation ARX models provides a novel process-oriented diagnostic and indicates that SPEAR's trend errors are due to an exaggerated response to radiative forcing. Because SPEAR is used to generate initial conditions via an ensemble data assimilation system, its trend errors propagate through the data assimilation system to create spurious trends in the initial conditions. Indeed, a significant trend error exists in the first month, and these errors can be replicated with a one-dimensional data assimilation system in which the first guess comes from an ARX model that emulates SPEAR.
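As a rough illustration of the idea, not the paper's actual implementation, the sketch below fits a simple ARX model of the form x_t = a·x_{t-1} + b·F_t + c + noise separately to observations and to forecasts at one lead time, then subtracts the difference between the two fitted models' predictions as an estimate of the forecast error. The single forcing covariate F, the synthetic data, the exaggerated forced response in the forecast emulator, and the least-squares fit are all assumptions made for illustration.

```python
import numpy as np

def fit_arx(x, forcing):
    """Least-squares fit of x[t] = a*x[t-1] + b*forcing[t] + c + noise."""
    y = x[1:]
    X = np.column_stack([x[:-1], forcing[1:], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # (a, b, c)

def arx_predict(coef, x_prev, forcing):
    a, b, c = coef
    return a * x_prev + b * forcing + c

# Hypothetical inputs: obs and fcst are anomalies at a fixed lead time;
# radiative_forcing is an external (exogenous) covariate.
rng = np.random.default_rng(0)
n = 480
radiative_forcing = np.linspace(0.0, 1.0, n)
obs = np.zeros(n)
fcst = np.zeros(n)
for t in range(1, n):
    obs[t] = 0.7 * obs[t - 1] + 0.3 * radiative_forcing[t] + rng.normal(0, 0.1)
    # Forecast emulator with an exaggerated forced response (a trend error).
    fcst[t] = 0.7 * fcst[t - 1] + 0.6 * radiative_forcing[t] + rng.normal(0, 0.1)

coef_obs = fit_arx(obs, radiative_forcing)
coef_fcst = fit_arx(fcst, radiative_forcing)

# Predicted forecast error = (forecast ARX prediction) - (observation ARX prediction).
# Subtracting it removes mean and trend errors, unlike a fixed 30-year climatology.
err_hat = (arx_predict(coef_fcst, fcst[:-1], radiative_forcing[1:])
           - arx_predict(coef_obs, obs[:-1], radiative_forcing[1:]))
corrected = fcst[1:] - err_hat
```

Because the correction depends on the forcing covariate, it grows with the forced trend rather than staying fixed as a subtracted climatology would.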

Chia-Ying Lee

and 6 more

This manuscript discusses the challenges in detecting and attributing recently observed trends in Atlantic hurricanes and the epistemic uncertainty we face in assessing future hurricane risk. Data used here include synthetic storms downscaled from five CMIP5 models by the Columbia HAZard model (CHAZ), and directly simulated storms from high-resolution climate models. We examine three aspects of recent hurricane activity: the upward trend and multi-decadal oscillation of the annual frequency, the increase in storm wind intensity, and the downward trend in forward speed. Some datasets suggest that these trends and the oscillation are forced, while others suggest that they can be explained by natural variability. Future projections under warming climate scenarios also show a wide range of possibilities, especially for the annual frequency, which increases or decreases depending on the choice of moisture variable used in the CHAZ model and on the choice of climate model. The uncertainties in the annual frequency lead to epistemic uncertainties in future hurricane risk assessment. Here, we investigate the reduction of epistemic uncertainty in the annual frequency through a standard statistical practice: likelihood analysis. We find that the historical observations are more consistent with the simulations in which frequency increases, but we cannot rule out other possibilities. We argue that the most rational way to treat epistemic uncertainty is to consider all outcomes contained in the results. In the context of hurricane risk assessment, since the results contain outcomes in which hurricane risk increases, this view implies that the risk is increasing.
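As a toy illustration of the kind of likelihood comparison described, not the CHAZ analysis itself, the sketch below scores two hypothetical annual-frequency scenarios, one increasing and one stationary, against a series of observed annual storm counts using a Poisson log-likelihood. The counts, the scenario rates, and the Poisson assumption are illustrative only.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical observed annual Atlantic hurricane counts (illustrative only).
obs_counts = np.array([7, 8, 15, 9, 7, 5, 8, 12, 19, 10,
                       7, 8, 6, 10, 18, 14, 13, 7, 11, 21])
years = np.arange(len(obs_counts))

# Two competing scenarios for the expected annual frequency.
rate_increasing = 8.0 + 0.25 * years                       # upward trend
rate_stationary = np.full(len(years), 10.0)                 # no trend

def log_likelihood(rates, counts):
    """Poisson log-likelihood of the observed counts under a rate series."""
    return poisson.logpmf(counts, rates).sum()

ll_inc = log_likelihood(rate_increasing, obs_counts)
ll_sta = log_likelihood(rate_stationary, obs_counts)
print(f"log-likelihood, increasing scenario: {ll_inc:.1f}")
print(f"log-likelihood, stationary scenario: {ll_sta:.1f}")
# A higher log-likelihood marks the scenario as more consistent with the record,
# but a modest difference does not rule out the alternative.
```

The same logic extends to weighting the full set of downscaled simulations: scenarios that assign higher likelihood to the observed record are favored, while those with non-negligible likelihood remain in the set of possible outcomes.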