Development of probabilistic climate predictions for UKCIP08
David Sexton, James Murphy, Mat Collins, Geoff Jenkins, Glen Harris, Kate Brown, Robin Clark, Penny Boorman, Simon Brown, Richard Jones, Jason Lowe, Ben Booth, B. Bhaskaran, David Hassell, Ruth McDonald, Tom Howard, Lizzie Kennett
UEA, October 19, 2007

Content
UKCIP08 probabilistic climate prediction system
Modelling uncertainty and perturbed physics ensembles
Weighting with observations
Time scaling
Other components of the Earth System
Downscaling
Assumptions

UKCIP02
Based on the state of the art at the time: HadCM3, HadAM3H time-slice, 50km HadRM3 experiments.
Used by many private and public-sector organisations to make decisions and spend money.
"Scenario" based, with no quantification of uncertainties (although plenty of caveats pointing this out).

Uncertainties in model projections
Effects of internal variability.
Modelling of Earth system processes, which includes how informative models are about reality.
Emission scenarios.

Modelling uncertainty
The set of international climate models are all "tuned" to observations, but there is no guarantee these are the actual optimal models. Other choices of values for the model input parameters could have provided equally plausible simulations of observations whilst producing a wide range of responses in the future. So tuning could affect the decisions planners make based on climate predictions.

UKCIP08 – Probabilistic predictions
To provide joint probability distribution functions (pdfs) of predicted changes in a selection of key UK climate variables at 25km resolution for 2010-2039, 2020-2049, ..., 2070-2099.
Results will be presented for each variable by month.
We aim to deliver the final report and the pdfs in October 2008.

UKCIP08 Products
Report.
Three types of output: probabilistic pdfs; weather generator (change factors from the pdfs); raw daily data from 17 regional climate models.
Web-based data delivery package (UI): will produce nice graphics, provide some analysis and provide some guidance.
Documentation on guidance.
Preparatory workshops.

Probabilistic climate predictions are ...
Not a probability distribution from which the real world samples what it does, so not an ensemble weather forecast for the future.
Just a representation of the degree to which each possible future climate is plausible given the evidence (climate models and observations). As the evidence changes, so will the prediction.
The underlying value is to reduce the risk of a user making a bad decision: instead of giving a policy maker all our modelled and observed data, we give them a summary statement of the extent to which various possible future climates are consistent with the evidence.

Production of UKCIP08 predictions
No computer in the world is big enough to run many variants of a 25km Earth system model, so we have developed a framework to combine lots of pieces (Murphy et al., Phil. Trans. Royal Society, 2007).
[Schematic: perturbed physics ensembles (equilibrium, ocean, aerosol, carbon cycle) feeding EBM time-scaling, then downscaling]

Perturbed physics ensembles

... use "perturbed physics ensembles" to sample systematically a space of possible model configurations
• Relatively large ensembles designed to sample modelling uncertainties systematically within a single model framework.
• Executed by perturbing model input parameters controlling key model processes, within expert-specified ranges.
• Key strength: allows greater control over experimental design cf. multi-model "ensembles of opportunity".
• Key limitation: does not sample "structural modelling uncertainties", e.g. changes in resolution, or in the fundamental assumptions used in the model's parameterisation schemes; need to include results from other models to account for these.

First steps
• Take one climate model (in this case version 3 of the Hadley Centre model).
• Specify distributions for multiple uncertain model parameters controlling atmospheric physical processes.
• Run an ensemble of simulations (at 300km horizontal resolution) of the equilibrium response to doubled CO2.

... gives a large (~300 member) sample of possible changes (e.g. summer UK rainfall).

Making probabilistic climate predictions for the 2xCO2 response

Bayesian prediction – Goldstein and Rougier
The aim is to construct the joint probability distribution p(X, mh, mf, y, o, d) of all the uncertain objects in the problem: input parameters (X), historical model output (mh), model prediction (mf), true climate (yh, yf), observations (o) and model imperfections (d). It measures how all the objects are related in a probabilistic sense.

Best-input assumption
Physical and dynamical processes in a climate model are controlled by numbers called model input parameters. We assume that one choice of these values, x*, is better than all others.
[Schematic: true climate y = f(x*) + discrepancy, where f(x*) is the model output for the best choice of parameter values x*]
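Written out, the assumption links the objects above as follows (a minimal formalisation consistent with the Goldstein and Rougier framing; the observation-error term e is our addition for completeness, not stated on the slide):

```latex
y = f(x^{*}) + d, \qquad o = y_h + e
```

Here y is the true climate, f(x*) the output of the best model variant, d the discrepancy, and e the error on the observations o of the historical climate y_h.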
Best-input assumption (continued)
We only know the probability that any combination of parameter values is the best-input model. Exploring that directly would need millions of model variants, which is too expensive: we can only afford hundreds of runs, and they have to be sampled in a way that is consistent with your beliefs about where the best model is. Need a cheap alternative...

Emulators
Emulators are statistical models, trained on ensemble runs, designed to predict model output at untried parameter combinations.
[Figure: emulated sqrt(climate sensitivity) for each ensemble member; dots are actual runs, lines the 95% credible interval from the emulator]

Sampling different model variants with the emulator
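A minimal sketch of these two steps, assuming a Gaussian process as the emulator and a cheap stand-in function in place of the real climate model (the two-parameter toy model, the ranges and all the numbers are illustrative, not the HadSM3 setup):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(42)

# Toy stand-in for the expensive climate model: maps two input
# parameters to one output (e.g. sqrt of climate sensitivity).
def expensive_model(x):
    return 1.5 + 0.8 * x[:, 0] - 0.5 * x[:, 1] + 0.3 * x[:, 0] * x[:, 1]

# The "ensemble": a few hundred affordable runs at sampled
# parameter combinations.
X_train = rng.uniform(0.0, 1.0, size=(300, 2))
y_train = expensive_model(X_train) + rng.normal(0.0, 0.05, 300)

# Train the emulator on the ensemble runs.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

# Monte Carlo sampling of model variants: draw many parameter
# combinations from the prior and emulate the output at each,
# including the emulator's own uncertainty.
X_prior = rng.uniform(0.0, 1.0, size=(20_000, 2))
mean, std = gp.predict(X_prior, return_std=True)
samples = mean + std * rng.standard_normal(mean.shape)
```

The histogram of `samples` plays the role of the prior pdf on the next slides, before any weighting with observations.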
Climate sensitivity – before weighting with observations
[Figure: the prior; focus on the black curve]

Parameter constraints due to weighting
[Figure]

Weighting different model variants
[Two figure slides]

Climate sensitivity
"Truncation level" = amount of independent information from observations.
[Figure, shown twice: the posterior; focus on the red curve]

Weighting models with observations and discrepancy

Physics/dynamics matter...
Compare models against several observational variables: with just one variable you can simulate climate well for the wrong reasons.
Will compare with present-day mean climate.
This is an indirect assessment of the key processes for our climate prediction, but it adds confidence to our prediction of a one-off event.
We are not going to assume models are perfect, so using better models has an impact.

Best-input assumption (recap)
True climate y = f(x*) + discrepancy, where f(x*) is the model output for the best choice of parameter values x*.

Comparing models with observations
Use a likelihood function, i.e. the skill of a model is the likelihood of the model data given some observations:

\log L_o(m) = c - \frac{1}{2}\log|V| - \frac{1}{2}(m - o)^{T} V^{-1} (m - o)

V = obs uncertainty + emulator error + discrepancy.
Discrepancy is the "distance" between the real system and the "best" choice of input parameters.
Truncation level = dimensionality of m, o.
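A minimal numerical sketch of this weight calculation (the three covariance components below are illustrative placeholders; in the real system they come from observational error estimates, the emulator and the pooled discrepancy term):

```python
import numpy as np

def log_likelihood(m, o, V_obs, V_emul, V_disc):
    """Gaussian log-likelihood of emulated model output m given
    observations o, up to the additive constant c.
    V = obs uncertainty + emulator error + discrepancy."""
    V = V_obs + V_emul + V_disc
    _, logdet = np.linalg.slogdet(V)
    resid = m - o
    # (m - o)^T V^{-1} (m - o), via a solve rather than an explicit inverse
    maha = resid @ np.linalg.solve(V, resid)
    return -0.5 * logdet - 0.5 * maha

# Illustrative use, with a truncation level (dimensionality of m, o) of 10:
n = 10
rng = np.random.default_rng(0)
o = rng.standard_normal(n)              # "observations"
m = o + 0.1 * rng.standard_normal(n)    # emulated model output
I = np.eye(n)
weight = np.exp(log_likelihood(m, o, 0.1 * I, 0.05 * I, 0.2 * I))
```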
Discrepancy – a schematic of what it does
• Avoids observations over-constraining the pdfs.
• Avoids contradictions from subsequent analyses when some observations have been allowed to constrain the problem too strongly.

Specifying discrepancy
Use the multimodel ensemble from AR4 and CFMIP.
For each multimodel ensemble member, find the emulated model variant that is closest to that member.
There is a distance between the climates of this multimodel ensemble member and this "best" emulated model variant, i.e. the effect of processes not explored by the slab model variants.
Pool these distances over all multimodel ensemble members.

Four types of data...
[Figure]

Errors in predicting the multimodel ensemble
• Each dot is a member of the multimodel ensemble.
• Grey shading represents the 95% confidence interval from internal climate variability.
A choice: the number of observable quantities in the cost function used to find the "best input". Select 10, as this is as large as possible whilst still providing a robust estimate.

Climate sensitivity
[Figure repeated: the posterior at the chosen truncation level; focus on the red curve]

Joint probabilities

Time scaling

Production of UKCIPnext predictions
For the A1B, B1 and A1FI scenarios...
[Schematic repeated: equilibrium, ocean, aerosol and carbon cycle PPEs feeding EBM time-scaling, then downscaling]

Coupled atmosphere-ocean ensembles
Smaller ensembles of HadCM3 because of spin-up issues.
Perturbations to atmosphere-model parameters with equivalent HadSM3 versions.
Flux adjustments used to keep models stable and reduce SST biases.
[Figure: historical + A1B forcing against observations; Collins et al. 2006]

Pattern scaling to produce pseudo-transient ensembles – methodology

Some plumes... Wales August temperature
No carbon cycle feedback yet.
[Figure]
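A minimal sketch of the pattern-scaling idea under simple assumptions (the equilibrium pattern and the EBM temperature series below are illustrative placeholders, not UKCIP08 data):

```python
import numpy as np

# Equilibrium 2xCO2 response pattern for a region, expressed per
# degree of global-mean warming (illustrative grid-cell values).
pattern = np.array([1.2, 0.8, -0.3, 0.5])

# Global-mean temperature anomaly through time from a simple energy
# balance model (EBM) run under a given emissions scenario.
years = np.arange(2010, 2100)
dT_global = 0.03 * (years - 2000)        # illustrative EBM output

# Pattern scaling: the pseudo-transient regional response is the
# normalised equilibrium pattern scaled by the global-mean warming.
response = np.outer(dT_global, pattern)  # shape (time, grid cell)
```

Repeating this for each sampled model variant, with its own emulated pattern and EBM temperature series, yields plumes like the Wales August temperature example above.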
Other components of the Earth System

Production of UKCIPnext predictions
[Schematic repeated, as above]

Uncertainties in the transient response of global mean surface temperature
Ocean parameter perturbation experiments (a 17-member ensemble) run to quantify the effects of uncertainties in ocean transport processes.
Sulphur cycle parameter perturbation experiments (another 17-member ensemble) also run.
[Figure panels: atmosphere parameters perturbed; ocean parameters perturbed; sulphur cycle parameters perturbed]

Impact of terrestrial uncertainties on CO2
Total atmospheric CO2 concentration: standard HadCM3 with 16 variants of the terrestrial carbon cycle.
[Figure: black crosses are observations]

Downscaling

Production of UKCIPnext predictions
[Schematic repeated, as above]

Downscaling
• Have also run a 17-member, 25km resolution ensemble of perturbed physics regional model versions.
• Driven by boundary forcing from the HadCM3 A1B transient simulations (1950-2100).
• We will construct regression relationships between the 17 GCM and 17 RCM simulations of future climate.
• Use these to create regional response pdfs at the 25km scale; this will add further uncertainty to the regional responses (see the sketch after the next two slides).

Downscaling uncertainty
16 realisations of the difference in response of the regional model relative to its driving global model, for January precipitation (% change for 2071-2100 relative to 1950-79).
[Figure]

Downscaling relationships...
[Figure: RCM response against GCM response, with error]
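A minimal sketch of the regression step (the 17 paired GCM/RCM responses below are synthetic numbers; the real relationships are constructed per variable, month and location):

```python
import numpy as np

rng = np.random.default_rng(1)

# Future-climate responses from the 17 paired simulations (synthetic).
gcm = rng.normal(2.0, 0.6, 17)                  # driving global model
rcm = 1.3 * gcm - 0.4 + rng.normal(0, 0.2, 17)  # regional model

# Fit the RCM response as a linear function of the GCM response.
slope, intercept = np.polyfit(gcm, rcm, 1)
resid_sd = np.std(rcm - (slope * gcm + intercept), ddof=2)

# Downscale a large sample of (e.g. emulated) GCM responses, adding
# the regression residual as extra downscaling uncertainty.
gcm_sample = rng.normal(2.0, 0.6, 100_000)
rcm_sample = (slope * gcm_sample + intercept
              + resid_sd * rng.standard_normal(gcm_sample.shape))
```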
Assumptions

What are the main assumptions we cannot test?
Local feedbacks between the atmosphere and other components of the Earth System (carbon cycle, aerosol chemistry and ocean) are of second-order importance compared to effects linked to global temperature change.
Structural model uncertainty is a good proxy for the difference between the HadCM3 family of models and the real system.
Pattern scaling and downscaling relationships are applicable across parameter space.
Multimodel members have equal contribution to discrepancy.

THE END. ANY QUESTIONS?

UKCIPnext (Hadley Centre contribution) – Aims and objectives
To provide joint probability distribution functions (pdfs) of predicted changes in a selection of key UK climate variables at 25km resolution for each decade during the 21st century.
Results will be presented for each variable by month, indicating mainly mean outcomes but also extremes for e.g. max/min temperature and precipitation.
We aim to deliver the pdfs and final report in summer 2008.

Sensitivity to prior – climate sensitivity
[Figures: before and after observational constraint]

Sensitivity to prior – % change in UK summer rainfall
[Figures: before and after observational constraint]

Monte Carlo sampling
[Figure: emulated distribution and emulated sample at each iteration]

Monte Carlo iteration   Sampled value
1                       -0.4
2                        0.3
3                       -0.1
4                        0.9
5                       -0.2

Reducing uncertainty
Improve observational uncertainties.
Improve the model, i.e. reduce discrepancy.
Run larger ensembles.
Use more observational constraints independent of the ones used already.
Remove the pattern scaling and downscaling steps.
Remove assumptions about linking submodules.

Weather generators
We will make probabilistic predictions for the variables that are input into the weather generator.
Weather generators will be used to generate time series consistent with the probabilistic predictions.
If spatially coherent time series at high temporal and spatial resolution are needed, output from the 17 regional climate model runs can be used.

Ideal for future UKCIPs
Run 1860-2120 with fully coupled Earth System Models, perturbing parameters in all components simultaneously, and then downscale; that is, no equilibrium runs and no ensembles on individual components.
Would need other climate centres to run this experiment for their standard model, and ideally they would have these downscaled.

Response surface predicted by emulator
[Figure: climate sensitivity as a function of two parameters according to the mean prediction of the emulator; note the emulator also predicts the uncertainty of the response surface]

Summer UK % precipitation change
[Figure; focus on the red curve]
Another choice: what truncation level to choose...

Probabilistic climate prediction
A probabilistic prediction is a function of the model, the observations, choices and assumptions. Choices are guided by the principle that we think it is important to model the Earth System correctly.

Bayesian framework by Goldstein and Rougier: some terms
[Figure: "emulated" prior distribution; posterior distribution; histogram of the "perturbed physics" ensemble]
Murphy et al., 2004, Nature, 430, 768-772.

Ensemble simulations
"Bedrock" provided by a relatively large ~300-member ensemble of HadSM3 (atmosphere-slab ocean) run at 1x and 2xCO2.
Results are sensitive to how you select parameter combinations.
Murphy et al., 2004; Webb et al., submitted; Stainforth et al., 2005.

Weights
As the truncation level increases, you have to be luckier to land on a quality point in parameter space.

Precision of percentile estimates
[Figure: precision of the 95th percentile estimate against number of Monte Carlo samples; the chosen point is 0.5-1 million samples]

Emulators (recap)
Emulators are statistical models, trained on ensemble runs, designed to predict model output at untried parameter combinations.

Emulators and priors
Monte Carlo sampling of parameters, combined with an emulator, overcomes the dependency on sampling strategy and produces a prior prediction (blue line) consistent with beliefs about where the best input lies.
Prior distribution = prediction before any observations are used.

Discrepancy on a future variable
The model is not perfect, so there are processes in the real system but not in our model that could alter the model response by an uncertain amount. This places extra uncertainty on the prediction variable in the form of a variance.

Constraining predictions
Where is the "best" input? Observations reduce uncertainty about which points are best in parameter space. Most effective if a strong relationship exists.

Standard carbon cycle, 3 versions of the atmosphere GCM
[Figure: dashed, no carbon cycle; solid, with carbon cycle]

Estimating discrepancy
Four ways I can think of:
Elicitation
Observations
Super-parameterised models
Ensemble of international climate models
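For the multimodel route, which is the one used here (see "Specifying discrepancy" above), a minimal sketch under illustrative assumptions; interpreting "pooling the distances" as forming a covariance matrix is our reading of the slide, and all numbers are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins: outputs of many emulated model variants, and of a dozen
# multimodel (AR4/CFMIP) members, each a vector of 10 observable
# quantities (the truncation level chosen earlier).
emulated = rng.normal(0.0, 1.0, size=(5000, 10))
multimodel = rng.normal(0.3, 1.2, size=(12, 10))

residuals = []
for member in multimodel:
    # Find the emulated model variant closest to this member.
    d2 = np.sum((emulated - member) ** 2, axis=1)
    best = emulated[np.argmin(d2)]
    # The leftover distance reflects processes the perturbed-physics
    # slab variants cannot reach.
    residuals.append(member - best)

# Pool over members (each contributing equally, per the stated
# assumption) to estimate the discrepancy term used in V.
V_disc = np.cov(np.array(residuals), rowvar=False)
```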