
Review of the Product Quality and Validation phase of V1 Stream1 Production Centres

Presentation Transcript


  1. Review of the Product Quality and Validation phase of V1 Stream1 Production Centres. Context: Quality Assurance & Review Group review of the ScVR validation outcomes, remarks

  2. X.5 Dedicated resources planned for Cal/Val
  • Coordination of the X.5 activity inside Production Units
  • Provide diagnostics for scientific assessment in coordination with QuARG reviews

  3. Context of the validation phase
  • Every PC has dedicated resources to perform scientific assessment activities for:
    • Monitoring the V1 system performance during operations
    • Assessing the quality of the V1 products disseminated to users
  • The V1 Stream1 subsystem cal/val tools passed the VARR: the calibration phase already proved that:
    • Validation activity is in place in the PC/PU
    • The V1 Stream1 system offers upgraded quality compared to V0
  • In parallel, following the Stream2 V1 system agenda:
    • The V1 S2 calibration phase has been performed
    • Based on a system similar to V1, a Reanalysis and a Reprocessed Analysis have been performed and scientifically validated (strengthening the validation synthesis for the operational system)
    • This was reviewed with success at the PC Acceptance Stream2 V1.

  4. Objective of the validation phase
  • Verify that scientific validation procedures are running operationally at PC/PU level to assess the product quality
  • Verify that these procedures are used, and that the product quality is as expected
  → Ask for a Scientific Validation Report, based on the routine validation performed since the 15th of December 2010

  5. Cal/Val activities for the second MyO year
  [Timeline figure. Stream1: V1S1 development, then V1S1 scientific assessment, then V1S1 pre-operated, then V1S1 in operation (milestones: June 2010, end of September 2010, 15/12/2010, 29/4/2011); activities: prepare ScCP, ScVP; test operational validation; operate routinely (PQ validation: regular reporting); reviews: PC Acceptance review of ScCP, ScVP, ScCR; Validation Readiness Review; start of V1 S1 operation; Production Readiness Review, reviewing the V1 S1 ScVR. Stream2: V1S2 development / REA, then V1S2 & REA scientific assessment, then V1S2 in operation (March 2011, June 2011, September 2011); calibration then validation; reviews: PC Acceptance review of ScCP, ScVP, ScCR, ScVR-Rea; VARR; start of V1S2 operation; PRR of the V1S2 ScVR.]
  • Validation requirements:
    • Compliance of the routine validation activity with the ScVP
    • Identify and report successes/defaults of V1
    • Pay attention to V1 performance versus user requirements (if any)
    • Consider dependencies (i.e., quality/impact of the input data for the period)

  6. Status of the Calibration and Validation
  • PC Acceptance Stream1 V1, Roma, September 2010
  • VARR Stream1 V1, Brussels, December 2010
  • PC Acceptance Stream2 V1, Copenhagen, March 2011
  • User Forum, Stockholm, April 2011
  • PRR Stream1 V1, Roma, April 2011
  • VARR Stream2 V1, June 2011
  • PRR Stream2 V1, September 2011

  7. Flavour of V1 Product Quality assessment, from reporting
  • Every Production Centre is fulfilling the scientific validation; there were some delays in delivery, and not all project management procedures are fully efficient yet:
    • GLO MFC: 3 PUs, detailed and valuable validation
    • ARC MFC: detailed and valuable validation
    • MED MFC: detailed and valuable validation
    • SL TAC: detailed and valuable validation
    • OC TAC: detailed and valuable validation
    • SST TAC: ScVP expected; detailed and valuable validation
    • SIW TAC: 7 documents for sea ice; no document for wind (no delivery, justified by the QuikSCAT failure in 11/2009); assessment of the other products is scheduled for V2
    • INS TAC: no ScCR, but RTQC is performed regularly (a minimal sketch of a typical RTQC check follows below); particular attention is also paid to the reprocessed dataset
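To make "RTQC" concrete: real-time quality control applies automated screening tests to each incoming in situ observation before distribution. Below is a minimal sketch of one such test, a global range check using Argo-style flags (1 = good, 4 = bad); the function name and default thresholds are illustrative assumptions, not the INS TAC's actual tooling.

```python
import numpy as np

def rtqc_range_test(values, valid_min=-2.5, valid_max=40.0):
    """Minimal real-time quality-control (RTQC) range test.

    Flags each value with the usual Argo-style convention:
    1 = good, 4 = bad. The default bounds are a classic global
    range for sea temperature in deg C (illustrative only).
    """
    values = np.asarray(values, dtype=float)
    return np.where((values >= valid_min) & (values <= valid_max), 1, 4)

# Hypothetical temperature values; 45.0 deg C is flagged bad.
print(rtqc_range_test([12.3, 13.1, 45.0, 12.9]))   # -> [1 1 4 1]
```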

  8. Flavour of the V1 Product Quality assessment: SST TAC [figure: SST13]

  9. Expected outcome from calibration: accuracy levels per product (e.g. SST TAC summary) [slide of the VARR: comparison to the match-up database; a sketch of such match-up statistics follows below]
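As a hedged illustration of how per-product accuracy levels are typically derived from a match-up database, the sketch below computes the usual bias and RMSD between co-located satellite and in situ values. All names and numbers are illustrative assumptions, not the SST TAC's actual code or results.

```python
import numpy as np

def matchup_stats(satellite, in_situ):
    """Accuracy metrics for a satellite-vs-in situ match-up database:
    each pair is a satellite value co-located with an independent
    in situ measurement (e.g. drifting-buoy SST in deg C)."""
    diff = np.asarray(satellite, float) - np.asarray(in_situ, float)
    return {
        "n": diff.size,                       # number of match-ups
        "bias": diff.mean(),                  # mean error (satellite minus in situ)
        "rmsd": np.sqrt((diff ** 2).mean()),  # root-mean-square difference
        "std": diff.std(ddof=1),              # scatter around the bias
    }

# Hypothetical SST match-ups (deg C), not taken from the SST TAC report:
print(matchup_stats([15.2, 14.8, 16.1], [15.0, 15.1, 15.9]))
```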

  10. ScCR results: MED products example [figure: OC12, slide of the VARR; location of the MED in situ-satellite match-ups]

  11. Flavour of the V1 Product Quality assessment: OC TAC [figure: OC12]

  12. Flavour of Product Quality assessment, from reporting: proposing a new approach: OC TAC and MED MFC validating Chl-a [figure: MED 9, from the MED ScVR]

  13. Flavour of the V1 Product Quality assessment: ARC MFC [figure: ARC 5, from the ARC ScVR-REA]

  14. Flavour of the V1 Product Quality assessment: GLO MFC
  In this section we display examples of CLASS1 routine consistency checks of oceanic large-scale signals and mesoscale structures. Several dates were chosen in order to illustrate the major good points and biases that have been diagnosed or monitored since the operational launch of V1. The plots presented here follow the CLASS1 metrics philosophy, as they compare the model outputs and the observations on the same grid. The MyOcean products themselves are interpolated on standard grids and in GODAE regions, and constitute the "official" CLASS1 files for intercomparisons of models. We can see in Figure 1 a very good agreement between both the Global and HR-Zoom systems and the SLA observations. Nevertheless, strong discrepancies are visible in the English Channel, the North Sea and the Baltic Sea. This illustrates one of the current limitations of the system, which produces a spicy bias in the North Sea and the Baltic. Spicy biases are also observed in the Celtic Sea. [figure: GLO 4] A sketch of such a CLASS1-style comparison follows below.
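A minimal sketch of the CLASS1-style comparison described above, assuming a regular latitude/longitude model grid: the model SLA field is sampled at the observation locations by bilinear interpolation, and the model-minus-observation misfit is summarised. Function names, grids and values are illustrative, not the GLO MFC implementation.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def class1_misfit(lon, lat, model_sla, obs_lon, obs_lat, obs_sla):
    """Sample a gridded model SLA field at observation locations
    (bilinear interpolation) and return bias and RMS of the
    model-minus-observation misfit, in the spirit of a CLASS1
    model/observation comparison on a common grid."""
    interp = RegularGridInterpolator((lat, lon), model_sla,
                                     bounds_error=False, fill_value=np.nan)
    model_at_obs = interp(np.column_stack([obs_lat, obs_lon]))
    misfit = model_at_obs - np.asarray(obs_sla, float)
    valid = ~np.isnan(misfit)                # drop points outside the grid
    return misfit[valid].mean(), np.sqrt((misfit[valid] ** 2).mean())

# Toy grid and observations (hypothetical numbers):
lon = np.linspace(-10.0, 10.0, 21)
lat = np.linspace(40.0, 60.0, 21)
model = np.zeros((lat.size, lon.size))       # flat 0 m SLA field
bias, rms = class1_misfit(lon, lat, model,
                          [0.5, 2.5], [45.0, 50.0], [0.05, -0.02])
print(f"bias={bias:.3f} m, rms={rms:.3f} m")
```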

  15. Flavour of the V1 Product Quality assessment: GLO MFC [figure: GLO 4]

  16. Flavour of the V1 Product Quality assessment: GLO MFC (backup) [figure: GLO 4]

  17. Flavour of V1 Product Quality assessment, GLO PHYS PU [figure: GLO 4, synthesis comparing the combined product with the WOA09 climatology]

  18. Product Quality Assessment: V1 Stream1 Scientific Validation Review
  • Reporting has been performed; the reporting mechanism had some minor deficiencies
  • The scientific validation procedures, and the corresponding reporting, are:
    • Scientifically valuable
    • Reliable, and proven to be performed at any time (secured by many types of diagnostics)
    • Fair: good and bad aspects are addressed and reported
    • Consistent with plans and project management
    • Offering continuity with system and products assessment (refer to the calibration phase and to V0; compliant with the MERSEA heritage)
    • Clear, providing a useful synthesis for users
  • The scientific validation of V1 Stream1 is effective and meets QuARG expectations

  19. Remarks, considerations, and ongoing work
  • A validation effort of this scale has never been undertaken elsewhere in the ocean community
  • Scientific validation of reanalysis products is promising: a strong effort is being made to provide common validation strategies (global reanalysis) that might be positively received by the international community
  • The validation reporting is not yet fully organised for V1 and V2, but the interest of maintaining regular reporting is obvious, and it will prepare the MYO-2 operational validation

  20. Link with users
  • Reporting is established and could support regular information to users
  • Following the request from the User Forum in Stockholm, and thanks to the reporting effort, actions can be committed for:
    • Estimated accuracy levels (September 2011)
    • QUID (October 2011)
