The Statistical DownScaling Model: insights from one decade of application


R. L. Wilby, Department of Geography, Loughborough University, Loughborough, Leicestershire LE11 3TU, UK. E-mail:


The Statistical DownScaling Model (SDSM) is a freely available tool that produces high-resolution climate change scenarios. The first public version of the software was released in 2001 and since then there have been over 170 documented studies worldwide. This article recounts the underlying conceptual and technical evolution of SDSM, drawing upon independent assessments of model capabilities. These studies show that SDSM yields reliable estimates of extreme temperatures, seasonal precipitation totals, and areal and inter-site precipitation behaviour. Frequency estimation of extreme precipitation amounts in dry seasons is less reliable. A meta-analysis of SDSM outputs shows a preponderance of research in Canada, China and the UK, whereas the United States and Australasia are under-represented. In line with the wider downscaling community, the most favoured sector of analysis is water and flood risk management, which accounts for nearly half of all output; research in other sectors such as agriculture, the built environment and human health is less prominent but growing. Over 50% of the studies are concerned with the production of climate scenarios, or with the comparison or technical refinement of downscaling methodologies. In contrast, there is relatively little evidence of application to adaptation planning and climate risk management. We assert that further attention to physically meaningful quantities such as wind speeds, wave heights, and phenological and hazard metrics could improve uptake of downscaled products. Chronic uncertainty in boundary forcing continues to undermine confidence in downscaled scenarios, so these tools are best used for sensitivity testing and adaptation options appraisal. Copyright © 2012 Royal Meteorological Society