Precipitation Validation Hydrology Training Workshop University of Hamburg Chris Kidd …and many others…
Overview Precipitation characteristics Surface measurements: Gauges, Radar Validation: Case study: the European IPWG site Experiences – other analysis Results – statistical dependency Conclusions Hydrology Training Workshop: University of Hamburg, 12-14 October 2010
2008 floods 2009 floods Why? – essentially to improve estimates
Precipitation Characteristics • The ‘modal’ instantaneous precipitation value is zero • Rain intensities are skewed towards zero: at middle to high latitudes, heavily so! • Spatial/temporal accumulations will ‘normalise’ the data • 1 mm of rain ≡ 1 L m⁻² or 1 kg m⁻² (i.e. 1000 t km⁻²) Occurrence Accumulation
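This skew can be illustrated with a minimal Python sketch (entirely synthetic data; the 5% rain fraction and exponential intensities are assumptions for illustration): instantaneous rates are dominated by zeros, while daily accumulations are far less skewed.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical mid-latitude record: one value per minute for 30 days.
n = 30 * 24 * 60
raining = rng.random(n) < 0.05                           # rain ~5% of the time
rates = np.where(raining, rng.exponential(2.0, n), 0.0)  # mm/h while raining

def skewness(x):
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

# Accumulate to daily totals: mean rate (mm/h) * 24 h -> mm/day
daily = rates.reshape(30, -1).mean(axis=1) * 24.0

print(f"zero fraction (instantaneous): {np.mean(rates == 0):.2f}")
print(f"skewness, instantaneous rates: {skewness(rates):.1f}")
print(f"skewness, daily totals:        {skewness(daily):.1f}")
```

Averaging 1440 one-minute samples into a daily total pulls the distribution towards symmetry, which is exactly why validation statistics look better at coarser accumulations.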
Surface measurement • Clee Hill radars (C-band vs ATC) • Micro rain radar • ARG100 gauge (0.2 mm/tip) • Young’s gauge (0.1 mm/tip)
Conventional measurements Gauge data (rain/snow) • Simple measurements of accumulations • Quantitative sampling (tipping bucket gauges etc) • But, point measurements, under-catch errors, etc. Radar systems • Backscatter from hydrometeors (rain/snow/hail) • Spatial measurements • Potential to discriminate between precipitation type • But, range effects, anomalous propagation errors, Z-R relationships… Precipitation is highly variable both temporally and spatially: measurements need to be representative
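The imprecise backscatter:rainfall relationship is the Z-R conversion; a widely used form is Marshall-Palmer, Z = 200 R^1.6. A small sketch inverting it (the coefficients a and b are climate- and radar-dependent and would be tuned in practice):

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b (Marshall-Palmer defaults: a=200, b=1.6).

    dbz : radar reflectivity in dBZ
    returns rain rate R in mm/h
    """
    z = 10.0 ** (dbz / 10.0)      # reflectivity factor Z, mm^6 m^-3
    return (z / a) ** (1.0 / b)   # rain rate, mm/h

for dbz in (20, 30, 40, 50):
    print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):5.1f} mm/h")
```

Small errors in a and b translate into large rain-rate errors at high reflectivity, which is one reason radar needs gauge adjustment.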
Conventional Observations • 20,000 rain gauges • Radar duplicates rain-gauge coverage • Precipitation is highly variable both temporally and spatially: measurements need to be representative
Variance explained by nearest station Jürgen Grieser Variance based upon monthly data: shorter periods = lower explained variance
What is truth? Co-located 8 gauges / 4 MRRs
1st gauge…
2nd gauge…
2 more gauges
All gauges
plus the MRR…
Radar vs gauge measurements Cumulative rainfall: radar vs gauge agreement reasonable – but not quite 1:1 • 10 June 2009: 40 mm in 30 mins • MRR 24.1 GHz • Gauge, TBR • Tipping-bucket gauges provide quantised measurements (0.1 or 0.2 mm/tip) • MRR critical for light rainfall
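The quantisation of a tipping-bucket gauge can be sketched as follows (hypothetical per-interval depths; a tip is only recorded once 0.2 mm has accumulated, so light rain is under-recorded until the bucket fills, which is why the MRR matters for light rainfall):

```python
import numpy as np

def tipping_bucket(rain_mm, tip_size=0.2):
    """Quantise a series of per-interval rainfall depths (mm) into tips."""
    bucket, tips = 0.0, []
    for depth in rain_mm:
        bucket += depth
        n = int(bucket // tip_size)   # whole tips triggered this interval
        tips.append(n)
        bucket -= n * tip_size        # residual water stays in the bucket
    return np.array(tips) * tip_size  # recorded depth per interval

true = np.array([0.03, 0.07, 0.15, 0.02, 0.0, 0.3, 0.01])
rec = tipping_bucket(true)
print("true total     :", true.sum())  # ~0.58 mm fell
print("recorded total :", rec.sum())   # 0.4 mm (2 tips); rest still in bucket
```

Over a long accumulation the totals converge, but instantaneous light-rain intensities are systematically misrepresented.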
Clee Hill ATC radar and C-band • University of Helsinki C-band • Chilbolton C-band
National network radars: Doppler, dual-polarised, 100/210 km
Radar vs gauge data Radar (daily integrated) Gauge data
Helsinki Testbed • FMI Helsinki • Cold season – surface issues & mixed-phase precipitation to surface • Circles: 4 operational Doppler weather radars (FMI & EMHI), 1 dual-pol radar + 1 vertically pointing C-band radar for research (Vaisala & UH) • 2 vertically pointing POSS radars • Dots: 80 gauges • Big diamonds: FD12P optical scatterometers • Triangles: ultrasonic snow depth • Squares: weighing gauges
Ground validation - IPWG synergies GV=Ground Validation After Turk & Arkin, BAMS 2008 Both approaches are complementary
Summary: surface measurements Representativeness of surface measurements: • Over land generally good, but variable • Over oceans: virtually non-existent Measurement issues: • Physical collection – interferes with measurement (e.g. wind effects – frozen precip, etc) • Radar – imprecise backscatter:rainfall relationship (also clutter, range effects, bright band, etc) Satellites offer consistent, regular measurements, global coverage, real-time delivery of data
Observation availability * Resolutions vary greatly with scan angle, frequency, sensor etc.
Satellite observational scales • Observations made nominally at 1 km / 15 mins; estimates possible at 1 km / 1 min but inaccurate • Precipitation products generally available at 0.25 degree daily, or 0.25 degree 3-hourly • Accuracy of satellite precipitation estimates improves with temporal/spatial averaging [Figure: observation scales of GEO Vis/IR (rapid scan, 15 minutes), LEO Vis/IR/MW (3 hours) and Earth-resources sensors (Ikonos, SPOT, Landsat, MODIS) against the 1–25 km scales of precipitation systems]
LEO vs GEO satellite observations • Low-Earth-orbit sensors: SSM/I and TRMM • Geostationary sensors: Meteosat / MSG
Observations to Products (Observations → Retrievals → Products) • Data inputs: Visible, Infrared, Passive MW, Active MW • Resolutions (time/space): instantaneous / full resolution through to monthly/seasonal / climate resolution • Model outputs • Products: climatology, agriculture/crops, meteorology, hydrology
Global precipitation data sets Many different products at different spatial/temporal resolutions … and formats!
IPWG European validation • Radar used as 'ground truth' • Composite of radars over UK, France, Germany, Belgium and Netherlands • Nominal 5 km resolution • Equal-area polar-stereographic projection • Data and product ingest • Near real-time • Statistical and graphical output (SGI/Irix; f77/netpbm)
Processing setup Perceived requirements: • Daily inter-comparison → 00Z-24Z (also -06, -09, -12Z) • 0.25 degree resolution → 25 km resolution • Real-time → near real-time, dependent upon product • Validation data → radar data (gauge being added later) • Automatic → quasi-automatic (not ‘operational’) • Many products → limited number of products
Processing Schedule • 01Z: Global IR • 02Z: SSM/I data, GPI, FDA, ECMWF • 03Z: European radar data, PMIR • 04Z: 3B4x • 05Z: CICS data • Statistics at 20 km • EUMETSAT MPE • Web pages • 22Z
Processing system • Initial setup: setting of dates; cleaning out old/decayed data • Acquiring data: searching existing data; listing missing data; creation of .netrc file; ftp data sources • Remapping of data: … to regional grid or 5 km PSG projection … • Results generation: statistical analysis; graphical output • Web pages: generate HTML files; copying to server
Processing checks (flow):
• Set d0 = today; foreach day (d0 … d0−31), foreach product (p1 … pn) – set up list of past dates/days (usually okay; sometimes needs tweaking)
• If the product for a day does not exist, add it to the .netrc file and ftp the data source – ftp runs several times (4 KB buffer limit on macros)
• Foreach product & day: remap to PSG using LUTs, standardise format and filename – prepares products into a common format (usually okay…)
• Foreach product & day: generate statistics – okay if there are no results, but not if there is bad data
• Generate plots – okay if there is rain…
• Generate HTML files – occasional issues with the server
Automated systems they are NOT!
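The missing-data check at the heart of that loop can be sketched in Python (the archive path, product names and filename convention here are all hypothetical; the original system used shell scripts and f77 on SGI/Irix):

```python
import datetime as dt
from pathlib import Path

PRODUCTS = ["3B42RT", "PMIR", "GPI", "MPE"]   # illustrative product IDs
ARCHIVE = Path("/data/ipwg")                  # hypothetical archive root

def missing_products(days_back=31):
    """List (product, date) pairs with no file in the local archive."""
    today = dt.date.today()
    missing = []
    for d in range(days_back):                # d0 ... d0 - days_back
        day = today - dt.timedelta(days=d)
        for prod in PRODUCTS:
            f = ARCHIVE / prod / f"{prod}_{day:%Y%m%d}.dat"  # standardised name
            if not f.exists():
                missing.append((prod, day))
    return missing

# The real system would build an ftp job list from missing_products(),
# remap each fetched file to the 5 km PSG grid via LUTs, then regenerate
# statistics, plots and HTML pages.
```

Keeping a rolling 31-day window means late-arriving products are picked up automatically on a later pass.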
“Standard” layout: validation data vs precipitation product • Occurrence comparison: contingency tables, PoD/FAR/HSS • Accumulation comparison: scatter-plot, descriptive statistics, cumulative distribution
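The occurrence scores follow directly from the 2×2 rain/no-rain contingency table; a minimal sketch using the standard definitions of PoD, FAR and the Heidke Skill Score (the example counts are invented):

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Categorical scores from a 2x2 rain/no-rain contingency table."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    n = h + m + f + c
    pod = h / (h + m)                 # Probability of Detection
    far = f / (h + f)                 # False Alarm Ratio
    # Heidke Skill Score: accuracy relative to that expected by chance
    expected = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    hss = (h + c - expected) / (n - expected)
    return pod, far, hss

pod, far, hss = contingency_scores(50, 20, 30, 900)
print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")
```

Because no-rain pixels dominate, HSS is more informative than raw accuracy: a forecast of "never rain" scores high accuracy but zero skill.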
PMIR results: Europe 2009-01-11
PMIR results: Australia 2008-12-25
Results: Snow problems
Results: rain extent
IPWG inter-comparison regions Near real-time intercomparison of model & satellite estimates vs radar/gauge IPWG – International Precipitation Working Group (WMO/CGMS)
Monthly and seasonal validation: diagnostic validation summaries
Validation resolution (month / 5-day / day / 3-hour): at full resolution the correlation of estimated rain is low; averaging over time and space improves the picture. Fine-scale data are generated so users get to decide on the averaging strategy. VAR vs. HQ (mm/hr), Feb. 2002, 30°N-S. Huffman 2/10
Resolution vs Statistical Performance Performance can be improved just by smoothing the data!
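A synthetic demonstration of this effect (all data simulated; the noise level is an assumption): a noisy pixel-scale estimate of a smooth field correlates better with the truth once both are block-averaged, because the noise averages down while the signal survives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "truth": a smooth (spatially correlated) 1-D field
n = 256
truth = np.cumsum(rng.normal(size=n))
# Noisy pixel-scale "estimate" of the same field
estimate = truth + rng.normal(scale=10.0, size=n)

def block_mean(x, k):
    """Average consecutive blocks of k samples (coarsen resolution by k)."""
    return x[: len(x) // k * k].reshape(-1, k).mean(axis=1)

rs = {}
for k in (1, 4, 16):
    rs[k] = np.corrcoef(block_mean(truth, k), block_mean(estimate, k))[0, 1]
    print(f"averaging factor {k:2d}: r = {rs[k]:.2f}")
```

The improvement is a property of the averaging, not of the estimator, which is why resolution must always be quoted alongside any validation statistic.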
Validation through hydrology • Bacchiglione (1200 km²) • Posina (116 km²) • PMIR: 4 km/30 min • 3B42RT: 1 deg/3 hr [Maps at 0.5, 1, 2, 4, 8 and 16 km; colour scale low 1.6 – high 57.9] Anagnostou & Hossain: applications are resolution-critical
Instantaneous analysis • AMSR precipitation product (v10) • Instantaneous radar (3×5 scans averaged to 15 mins) • 5 km resolution averaged to 50×50 km • Regions of interest: North Sea, Atlantic, France, Germany, UK • January 2005 – September 2009
Mean rainfall (mm/d) 2005-2009: Radar vs AMSR [colour scale: 0, 0.1, 0.3, 0.5, 1, 2, 4, 8 mm/d]