Object-based Spatial Verification for Multiple Purposes
Beth Ebert (1), Lawrie Rikus (1), Aurel Moise (1), Jun Chen (1,2), and Raghavendra Ashrit (3)
(1) CAWCR, Melbourne, Australia; (2) University of Melbourne, Australia; (3) NCMRWF, India
www.cawcr.gov.au
Object-based spatial verification
[Figure: matched objects in the observations and forecast fields]
Other examples
• Vertical cloud comparison (HIRLAM cloud vs. AVHRR satellite)
• Climate features (SPCZ)
• Jets in a vertical plane
• Convective initiation
What does an object approach tell us?
• Errors in
  • Location
  • Size
  • Intensity
  • Orientation
• Results can
  • Characterize errors for individual forecasts
  • Show systematic errors
  • Give hints as to the source(s) of errors
• I will discuss CRA, MODE, "Blob" (not SAL, Procrustes, Composite (Nachamkin), others)
[Figure: forecast (FCST) and observed (OBS) objects]
Contiguous Rain Area (CRA) verification (Ebert & McBride, J. Hydrol., 2000)
• Find Contiguous Rain Areas (CRAs) in the fields to be verified
  • Choose a threshold
  • Take the union of forecast and observations
  • Use a minimum number of points and/or total volume of the parameter to filter out insignificant CRAs
• Define a rectangular search box around each CRA to look for the best match between forecast and observations
• Determine the displacement by shifting the forecast within the box until the MSE is minimized or the correlation coefficient is maximized
• Error decomposition: MSEtotal = MSEdisplacement + MSEintensity + MSEpattern
[Figure: observed and forecast rain areas]
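The CRA shift-and-decompose procedure can be sketched in a few lines of NumPy. This is a minimal illustration, not the operational code: it uses a wrap-around shift over a small search box and the standard decomposition in which the displacement error is the MSE reduction achieved by the best shift and the intensity error is the squared bias of the shifted forecast.

```python
import numpy as np

def cra_decomposition(fcst, obs, max_shift=3):
    """Minimal CRA sketch (after Ebert & McBride 2000): shift the
    forecast within a search box to minimize MSE, then decompose
    the total error into displacement, intensity, and pattern parts."""
    def mse(a, b):
        return float(np.mean((a - b) ** 2))

    # Search for the shift (dy, dx) that minimizes MSE.
    best = (0, 0, mse(fcst, obs))
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(fcst, dy, axis=0), dx, axis=1)
            e = mse(shifted, obs)
            if e < best[2]:
                best = (dy, dx, e)

    dy, dx, mse_shifted = best
    shifted = np.roll(np.roll(fcst, dy, axis=0), dx, axis=1)

    # MSE_total = MSE_displacement + MSE_intensity + MSE_pattern
    mse_total = mse(fcst, obs)
    mse_displacement = mse_total - mse_shifted          # removed by best shift
    mse_intensity = float((shifted.mean() - obs.mean()) ** 2)  # squared bias
    mse_pattern = mse_shifted - mse_intensity           # residual
    return dy, dx, mse_total, mse_displacement, mse_intensity, mse_pattern
```

For a forecast that is a pure translation of the observations, the best shift recovers the displacement exactly and the intensity and pattern terms vanish, which makes the decomposition easy to sanity-check.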
Heavy rain over India
Met Office global NWP model forecasts of monsoon rainfall, 2007-2012 (Ashrit et al., WAF, in revision)
Heavy rain over India
Errors in Day 1 rainfall forecasts
[Figure: CRA results for thresholds of 10, 20, and 40 mm/d]
Heavy rain over India
Error decomposition (%) of Day 1 rainfall forecasts
Climate model evaluation
Can global climate models reproduce features such as the South Pacific Convergence Zone (SPCZ)?
Delage and Moise (JGR, 2011) added a rotation component to the CRA decomposition.
Climate model evaluation
"Location error" = MSEdisplacement + MSErotation
"Shape error" = MSEvolume + MSEpattern
Applied to 26 CMIP3 models, etc.
Climate model evaluation
Correcting the position of ENSO EOF1 strengthens model agreement on projected changes in the spatial patterns of ENSO-driven variability in temperature and precipitation (Power et al., Nature, 2013).
Method for Object-based Diagnostic Evaluation (MODE) (Davis et al., MWR, 2006)
• Identification: convolution-threshold process
• Measure attributes
• Fuzzy logic approach:
  • Merging: merge single objects into clusters
  • Matching: compute interest values*, identify matched pairs
  • Comparison: compare forecast and observed attributes
• Summarize: accumulate and examine comparisons across many cases
*interest value = weighted combination of attribute matching
Comparison for tropical cyclone rainfall
[Figure: CRA vs. MODE object matching]
Chen, Ebert, Brown (2014) – work in progress
Westerly jets
"Blob" defined by a percentile of the local maximum of zonal-mean U in the reanalysis Y-Z plane
[Figure: blobs for the 5th, 10th, and 15th percentiles]
Rikus, Clim. Dyn., submitted
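The core idea of a jet "blob" can be illustrated with a simple stand-in for the thresholding step: flood-fill outward from the field maximum over grid points whose zonal wind exceeds a chosen fraction of that maximum. The fraction-of-maximum criterion here is an illustrative assumption; the method of Rikus uses a percentile-based threshold with its own details.

```python
import numpy as np
from collections import deque

def jet_blob(u, frac=0.85):
    """Illustrative blob extraction on a Y-Z grid of zonal-mean wind u:
    the connected region, grown from the field maximum, where
    u >= frac * max(u). The frac criterion is a simplifying assumption."""
    thresh = frac * u.max()
    seed = np.unravel_index(np.argmax(u), u.shape)
    blob = np.zeros(u.shape, dtype=bool)
    blob[seed] = True
    queue = deque([seed])
    while queue:                      # 4-connected flood fill
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < u.shape[0] and 0 <= nx < u.shape[1]
                    and not blob[ny, nx] and u[ny, nx] >= thresh):
                blob[ny, nx] = True
                queue.append((ny, nx))
    return blob
```

Because the fill starts at the global maximum, a second, weaker wind maximum that also exceeds the threshold but is not connected to the main jet is excluded, which is the behaviour one wants when isolating a single jet object.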
Westerly jets
Global reanalyses show consistent behaviour, except 20CR. The approach can be used to evaluate global climate models.
Future of object-based verification
• Routine application in operational verification suites
• Other variables
• Climate applications
Future of object-based verification
• Ensemble prediction – match individual ensemble members (e.g. an object present in 7 of 8 ensemble members gives Prob(object) = 7/8)
• Ensemble calibration approaches, Brier skill score (Johnson & Wang, MWR, 2012, 2013)
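The Prob(object) = 7/8 idea above reduces to counting, at each grid point, the fraction of ensemble members whose field exceeds the object-defining threshold. A minimal sketch (the threshold and toy fields are assumptions for illustration):

```python
import numpy as np

def object_probability(members, threshold):
    """Gridpoint object probability from an ensemble: the fraction of
    members whose field meets the object-defining threshold, so a point
    inside the object for 7 of 8 members gets probability 7/8."""
    exceed = np.stack([m >= threshold for m in members])  # (n_members, ny, nx)
    return exceed.mean(axis=0)                            # fraction per gridpoint
```

The resulting probability field can then be verified against observed objects with probabilistic scores such as the Brier skill score, as in the Johnson & Wang studies cited above.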
Future of object-based verification
Weather hazards (WWRP High Impact Weather Project):
• Tropical cyclone structure
• Fire spread
• Pollution cloud, heat anomaly
• Flood inundation
• Blizzard extent and intensity
Thank you
The Centre for Australian Weather and Climate Research
A partnership between CSIRO and the Bureau of Meteorology
www.cawcr.gov.au
Spatial Verification Intercomparison Project
• Phase 1 – understanding the methods
• Phase 2 – testing the methods
• "MesoVICT" – precipitation and wind in complex terrain
  • Deterministic & ensemble forecasts
  • Point and gridded observations, including ensemble observations
  • MAP D-PHASE / COPS dataset
Tiers:
• Core: Determ. precip + VERA anal + JDC obs
• Tier 1: Determ. wind + VERA anal + JDC obs; Ensemble precip + VERA anal + JDC obs; sensitivity tests to method parameters
• Tier 2a: Determ. precip + VERA ensemble + JDC obs; Determ. wind + VERA ensemble + JDC obs
• Tier 2b: Ensemble wind + VERA anal + JDC obs; Ensemble precip + VERA ensemble + JDC obs; Ensemble wind + VERA ensemble + JDC obs
• Tier 3: Other variables ensemble + VERA ensemble + JDC obs
MODE – total interest
• Attributes:
  • centroid distance separation
  • minimum separation distance of object boundaries
  • orientation angle difference
  • area ratio
  • intersection area
Total interest = Σ wi,j ci,j Fi,j / Σ wi,j ci,j, summed over the M attributes, where
Fi,j = value of object match (0-1)
ci,j = confidence, how well a given attribute describes the forecast error
wi,j = weight given to an attribute
Tropical cyclone rainfall
CRA:
• Displacement & rotation error
• Correlation coefficient
• Volume
• Median, extreme rain
• Rain area
• Error decomposition
MODE:
• Centroid distance & angle difference
• Total interest
• Volume
• Median, extreme rain
• Intersection / union / symmetric area