
WP 5000 – Impact Assessment




  1. WP 5000 – Impact Assessment S. Labroue, T. Moreau, P. Thibaut (CLS) N. Picot, F. Boy (CNES)

  2. WP5000: Impact Assessment - Methodology
  • Round Robin exercise
  • Collaborative context:
    • In the consortium: iteration loops and review processes will be set up
    • Outside the consortium: open to other data set evaluations (contributions from NOAA, TAS, ENSEEIHT, UCL, …)
  • CLS should define the metrics and the methodology for the quality assessment
    • With the retracking experts (special sensitivities in SAR processing that are not present in LRM processing)
    • Taking into account the WP2000 conclusions/recommendations
    => Metrics and methodology have to be approved by the consortium
  • Comparison of one retracking taking another one as reference
    • Important that everyone agrees on the reference
  • Comparison focused on SLA and sea state (SWH and sigma0 when available in RDSAR)
  • Detection of differences between the algorithms
  • Work on metrics able to quantify the relative advantages/drawbacks of each of them
  • Feedback to the WP4000 retracking experts
  • Possibly recommend one algorithm?

  3. WP5000: Impact Assessment - Methodology
  • Objectives of this assessment for open ocean
    • Detect correlated errors for scales beyond 150 km
    • Confirm that the SAR processing allows retrieving the smallest spatial scales (20-70 km) thanks to 20 Hz noise reduction in the along-track direction (see the spectral sketch below)
  • The objective of WP5000 is to compare the algorithms on metrics agreed by everyone
  • The metrics are not able to detect everything => some subtle processing differences might not be detectable by the proposed metrics (an impact of a few mm is difficult to separate from the oceanic signal)
  • The multiplication of different metrics is important
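A minimal sketch of the kind of along-track spectral check this implies, assuming evenly sampled 20 Hz data (~350 m spacing); the synthetic signal and all variable names are illustrative, not taken from the actual WP4000 products:

```python
# Minimal sketch of an along-track SLA spectral analysis on synthetic data.
# The 20-70 km band is where the SAR processing is expected to lower the
# noise relative to LRM; here we simply report the mean PSD in that band.
import numpy as np
from scipy.signal import welch

DX_KM = 0.35                     # approximate 20 Hz along-track spacing (km)
rng = np.random.default_rng(0)

n = 8192
x = np.arange(n) * DX_KM
# Synthetic SLA: long-wavelength ocean signal plus white measurement noise.
sla = 0.10 * np.sin(2 * np.pi * x / 500.0) + 0.02 * rng.standard_normal(n)

# Welch periodogram in the spatial domain: frequency axis in cycles/km.
freq, psd = welch(sla, fs=1.0 / DX_KM, nperseg=1024, detrend="linear")

band = (freq > 1 / 70.0) & (freq < 1 / 20.0)      # 20-70 km wavelengths
print(f"mean PSD in 20-70 km band: {psd[band].mean():.3e} m^2/(cy/km)")
```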

  4. WP5000: Impact Assessment - Methodology
  • Each processing has been validated individually by each WP4000 task responsible
  • The objective of WP5000 is to go further in the geophysical validation of these algorithms for each sub-team, with a focus on retracking methods (assessment of the geophysical corrections is not foreseen)
  • Product Validation Reports are provided for each algorithm in WP4000
  • Data sets are produced by WP4000 contributors on agreed areas and periods
  • Analysis performed at 1 Hz and at 20 Hz depending on the objective => Data sets with 20 Hz sampling would be preferred (compression to 1 Hz can be performed by WP5000; see the sketch below)
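As an illustration of that compression step, here is a hedged sketch of how 20 Hz samples could be block-averaged to 1 Hz with simple MAD-based editing; the threshold and function name are assumptions for the sketch, not a documented WP5000 algorithm:

```python
# Illustrative 20 Hz -> 1 Hz compression: average 20 consecutive high-rate
# samples after editing outliers against the block median (MAD criterion).
import numpy as np

def compress_to_1hz(sla_20hz, nblock=20):
    """Average valid 20 Hz samples within each 1 Hz block (sketch only)."""
    n1hz = sla_20hz.size // nblock
    blocks = sla_20hz[: n1hz * nblock].reshape(n1hz, nblock)
    med = np.nanmedian(blocks, axis=1, keepdims=True)
    mad = np.nanmedian(np.abs(blocks - med), axis=1, keepdims=True)
    # Keep samples within ~3 sigma of the block median (1.4826 scales MAD).
    valid = np.abs(blocks - med) <= 3.0 * 1.4826 * mad + 1e-12
    edited = np.where(valid, blocks, np.nan)
    return np.nanmean(edited, axis=1)

rng = np.random.default_rng(1)
sla = 0.05 * rng.standard_normal(2000)
sla[::97] += 1.0                          # inject a few spikes to be edited
print(compress_to_1hz(sla)[:5])           # first five 1 Hz values
```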

  5. WP5000: Impact Assessment – Work Plan
  Inputs (WP4000 contributors are consulted to check and agree the outputs):
  • WP2000 recommendations
  • WP4400 data set
  • Data set user manual
  • WP4000 Product Validation Reports
  • WP4000 ATBDs
  • CNES/CLS database (L2 CPP SAR/RDSAR)
  • CNES/CLS database (other EO satellite data and geophysical corrections)
  These inputs feed the WP5000 Impact Assessment Round Robin exercise, which produces the overall impact assessment report.

  6. WP5000: Impact Assessment - Work Plan
  • We propose the CPP/CNES retracking as a reference for the LRM because we have a large data set (since January 2011) available and already validated
  • We recommend the use of L1 CPP data over SAR mode areas as inputs of the SAR and RDSAR retrackings
    • to perform direct differences between collocated SAR and RDSAR measurements
    • because SAR waveforms are consistent in time with unchanged IPF data processing (no over-sampling in range and truncation, and no azimuth Hamming weighting implemented)
  • RDSAR retrackings should be validated against the CPP LRM retracking
  • SAR retrackings should be validated against the RDSAR retracking to be chosen (in the most intelligent manner, to bring out the best of each algorithm)
  • No SSB corrections applied to SAR and LRM mode. This could potentially explain some observed discrepancies between SAR and LRM SLA

  7. WP5000: Impact Assessment - Methodology
  • How to validate SAR processing?
    • Difficult because of limited areas and no global coverage compared to LRM mode
    • No overlap between LRM and SAR zones
  • We propose a 2-step approach
    • SAR validated versus Reduced SAR = Relative Validation
      • The easiest one since the reference is collocated
      • Analysis of Cryosat-2 data alone
    • Reduced SAR validated versus LRM = Absolute Validation (by 3 means)
      • Analysis of Cryosat-2 data
      • Cross-calibration with J2
      • Analysis of the LRM/RDSAR transition
  • This 2-step approach could be completed by absolute validation between SAR and LRM (without using the RDSAR comparison)

  8. WP5000: Impact Assessment – Absolute Validation
  • Analysis of Cryosat-2 data
    • Maps of rejected data
    • Spectral analysis of SLA, SWH
    • Analysis of the parameters related to the retracking (histograms, maps)
    • Analysis of the parameters related to the 20 Hz => 1 Hz compression
    • Detection of dependencies (see the sketch after this list)
      • Focus on scales greater than 150 km
      • Sensitivity to radial velocity, SWH, mispointing, etc.
    • Performance at crossovers and SLA
      • Exploit the change of geographical mask that provides a few crossovers between LRM and SAR mode (before and after 7 May 2012)
    • Analysis of parameters with respect to coastal distance
    • Seafloor mapping over the Pacific ocean region operating in SAR mode:
      • To find new structures (using data in regions where none were previously available)
      • To evaluate the quality of the data (Cryosat SAR noise level and its qualities at different wavelengths) by comparing SAR retracked data to MSS or mean profiles
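A minimal sketch of the "detection of dependencies" step, assuming SLA is binned against SWH and the binned means are inspected for a trend; the synthetic arrays and the weak SWH leakage injected here are purely illustrative:

```python
# Sketch of a dependency check: bin SLA against SWH and inspect the binned
# means for a systematic trend (the same recipe applies with radial
# velocity or mispointing as the binning variable).
import numpy as np
from scipy.stats import binned_statistic

rng = np.random.default_rng(2)
swh = rng.uniform(0.5, 6.0, 20000)                        # wave height (m)
sla = 0.002 * swh + 0.03 * rng.standard_normal(swh.size)  # weak SWH leakage

mean_sla, edges, _ = binned_statistic(swh, sla, statistic="mean", bins=11)
for lo, hi, m in zip(edges[:-1], edges[1:], mean_sla):
    print(f"SWH {lo:4.1f}-{hi:4.1f} m : mean SLA {100 * m:+.2f} cm")
```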

  9. WP5000: Impact Assessment – Absolute Validation
  • Cross-calibration with J2
    • Qualitative, by comparing 2 along-track profiles on selected parallel tracks
    • Quantitative, with statistics at crossovers
  • Analysis of the LRM/RDSAR transition
    • Important to ensure the RDSAR validation
    • Histograms of parameters in LRM mode at the transition location
    • Histograms of parameters in SAR mode at the transition location (see the sketch below)
    • Analysis of the difference RDSAR-LRM at the transition location
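A hedged sketch of comparing a parameter's distribution on both sides of a mode transition; the Gaussian samples stand in for records selected around a real transition, and the injected 5 cm shift is illustrative:

```python
# Illustrative comparison of a parameter (here SWH) on both sides of an
# LRM/SAR mode transition: histogram each side and report the shift.
import numpy as np

rng = np.random.default_rng(3)
swh_lrm = rng.normal(2.00, 0.3, 5000)    # last LRM records before transition
swh_rdsar = rng.normal(2.05, 0.3, 5000)  # first RDSAR records after it

bins = np.linspace(0.5, 3.5, 31)
h_lrm, _ = np.histogram(swh_lrm, bins=bins, density=True)
h_rdsar, _ = np.histogram(swh_rdsar, bins=bins, density=True)

print(f"mean SWH shift at transition: {swh_rdsar.mean() - swh_lrm.mean():+.3f} m")
print(f"max |density difference| across bins: {np.abs(h_rdsar - h_lrm).max():.3f}")
```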

  10. WP5000: Impact Assessment – Relative Validation
  • Analysis of differences between 2 retrackings
    • Some of these metrics should certainly also be analysed in the WP4000 validation reports
    • Maps of gained or rejected data compared to the reference, based on our experience in data editing
    • Spectral analysis of the differences of SLA, SWH (cancels out the physical signal; see the sketch below)
    • Analysis of the parameter differences (histograms, maps)
    • Detection of dependencies in the difference
      • Focus on scales greater than 150 km
      • Sensitivity to radial velocity, SWH, mispointing, etc.
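A sketch of why the difference spectrum is useful, on synthetic data: the shared ocean signal cancels in the difference of two retrackings, leaving only processing differences and noise.

```python
# Sketch of the relative-validation spectrum: differencing two retrackings
# of the same waveforms cancels the common ocean signal, so the residual
# spectrum reflects processing differences and noise only. Synthetic data.
import numpy as np
from scipy.signal import welch

DX_KM = 0.35
rng = np.random.default_rng(4)
n = 8192
x = np.arange(n) * DX_KM
ocean = 0.10 * np.sin(2 * np.pi * x / 500.0)    # shared geophysical signal
sla_a = ocean + 0.020 * rng.standard_normal(n)  # retracker A noise
sla_b = ocean + 0.015 * rng.standard_normal(n)  # retracker B noise

freq, psd_diff = welch(sla_a - sla_b, fs=1.0 / DX_KM, nperseg=1024)
# The 500 km ocean line has cancelled: the difference spectrum is flat.
print(f"difference PSD, high-frequency mean: {psd_diff[freq > 0.5].mean():.3e}")
```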

  11. WP5000: Impact Assessment - Metrics
  • Problem of multiple degrees of freedom:
    • Data analyses will focus on the dependencies with respect to radial velocity and off-nadir mispointing angle of the satellite (which may impact the estimates)
    • Need to separate ascending and descending tracks, which correspond to different radial velocity and off-nadir angle values at the same point location (see the sketch below)
    • We might have to consider all SAR mode areas to cover the largest possible range of values, allowing robust statistical analyses and an efficient assessment of the impact of each new product
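A minimal sketch of the ascending/descending separation, assuming per-record latitudes are available; the sinusoidal track below is a toy stand-in:

```python
# Sketch of separating ascending and descending passes: the sign of the
# latitude increment along the track gives the direction. The sinusoidal
# `lat` is a toy ground track; real products carry per-record latitudes.
import numpy as np

t = np.linspace(0, 4 * np.pi, 1000)
lat = 66.0 * np.sin(t)                 # toy ground track latitude (deg)

ascending = np.gradient(lat) > 0       # northward motion
print(f"ascending samples : {ascending.sum()}")
print(f"descending samples: {(~ascending).sum()}")
```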

  12. May 2012 (figure slide)

  13. August 2012 (figure slide)

  14. WP5000: Impact Assessment – Zones & Period
  • Equatorial Pacific
    • Acquisition since 7 May 2012 (and a reduced area since 01/10)
    • Brown-like zone (few rain/bloom events, SWH close to 2 m, low oceanic variability, stable in time)
  • Med Sea
    • Well-known region
    • But with calm seas (bloom events)
    • Coastal region
  • North Atlantic
    • Seasonal variation (with bloom events in summer time)
    • High waves in winter time
  • Agulhas Current
    • Zone studied in PISTACH => J2 high-resolution data sets available (20 Hz) with optimised processing
    • High waves
  (figure panels: April 2012 and August 2012)

  15. WP5000: Impact Assessment – Zones & Period
  • What is the recommended geographical coverage? Global
    • to separate the errors linked to radial velocity, mispointing, SWH
    • to cover most of the range of radial velocity, mispointing, SWH values
  (figure panels: February 2012 and August 2012)

  16. WP5000: Impact Assessment – Zones & Period
  • What is the recommended time period?
    • At least a 2-month time series (1 cycle in summer and 1 cycle in winter) to provide significant variation of SWH and mispointing

  17. WP5000: Impact Assessment - Zones & Period
  • To assess the quality of the sea level spectrum at all spatial scales, the spectral analysis should be able:
    • To detect the noise level @ high frequency (see the sketch below)
    • To identify correlated errors for scales between 10 and 80 km
    • To check consistency of the oceanic signal @ high wavelength
  (figure: SLA spectra over the SAR Pacific zone for 1 day, 1 cycle and 3 cycles; curves: RDSAR CPP CS2, LRM J2, SAR CPP CS2; annotated "first valuable result")
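As a hedged illustration of the first point: for white noise the one-sided spatial PSD plateaus at 2·sigma²·dx, so the high-frequency plateau of the spectrum gives the per-sample noise level. Synthetic series; all numbers are illustrative:

```python
# Sketch of estimating the high-frequency noise floor from the SLA
# spectrum: for white noise the one-sided PSD plateaus at 2 * sigma^2 * dx.
import numpy as np
from scipy.signal import welch

DX_KM = 0.35
rng = np.random.default_rng(5)
sigma = 0.02                                 # 2 cm white noise (truth)
sla = sigma * rng.standard_normal(16384)

freq, psd = welch(sla, fs=1.0 / DX_KM, nperseg=2048)
plateau = psd[freq > 0.8].mean()             # high-wavenumber plateau
sigma_est = np.sqrt(plateau / (2.0 * DX_KM))
print(f"estimated noise level: {100 * sigma_est:.2f} cm (truth {100 * sigma:.0f} cm)")
```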

  18. WP5000: Impact Assessment - Synthesis
  • The different metrics listed will help to answer the 4 subthemes

  19. WP5000: Impact Assessment - Synthesis
  • Inputs
    • WP1000 user requirements synthesis
    • WP2000 synthesis
    • WP4000 validation reports
  • Deliverables
    • Validation report for each algorithm
    • Synthesis report
  • Risks
    • Not enough data to assess each algorithm (open ocean)
    • No clear conclusion coming out of this assessment

  20. WP5000: Impact Assessment – Tools
  • Comparisons will be performed in terms of:
    • Cartographies (to visualize geographically correlated mean errors; see the sketch below)
    • Histograms
    • Spectral analysis (to identify the energy/error levels at different spatial wavelengths)
    • Time series analysis
    • Dependencies analysis (correlations between parameters)
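A minimal sketch of such a cartography, assuming scattered along-track values are binned on a regular lon/lat grid; the region bounds and the synthetic error pattern are assumptions for the illustration:

```python
# Sketch of a simple cartography: grid the mean of a parameter (here an
# SLA difference) on a regular lon/lat grid to reveal geographically
# correlated mean errors. Synthetic scattered points stand in for tracks.
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(6)
lon = rng.uniform(180.0, 220.0, 50000)
lat = rng.uniform(-20.0, 20.0, 50000)
dsla = 0.01 * np.sin(np.radians(lat) * 6) + 0.03 * rng.standard_normal(lon.size)

grid_mean, lon_edges, lat_edges, _ = binned_statistic_2d(
    lon, lat, dsla, statistic="mean", bins=[40, 40])
print(f"grid shape: {grid_mean.shape}, "
      f"largest |mean error| cell: {100 * np.nanmax(np.abs(grid_mean)):.2f} cm")
```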

  21.-30. SAMPLE TEST REPORT (figure-only slides, no transcribed text)
