NetCDF4 Reformatting Toolkit (N4RT): BUFR and GRIB2 Tailoring Test Readiness Review (TRR) for Phase 1 SDR Products, October 25, 2011. Prepared By: Tom King1, Yi Song1, Kexin Zhang1, Larisa Koval1, and Walter Wolf2; 1Riverside, 2STAR
Purpose of TRR/CTR • Review the status of all the open issues/risks • Review Project Requirements • Describe the Software Architecture • Describe the tests for the software units and show the test results • Establish the contents of the Delivered Algorithm Package • Identify any new issues/risks
Review Outline • Introduction • CDR review report/actions • Requirements • Quality Assurance • Software architecture • Unit Tests • Delivered Algorithm Package • Risks and Actions Summary • Summary and Conclusions
Section 1 – Introduction Presented by Walter Wolf, NOAA/NESDIS/STAR
Introduction • Project Background • IJPS • NPP/JPSS • NDE • Project Objectives • Integrated Product Team • Project Plan • Entry and Exit Criteria
JTA and IJPS • JTA – Joint Transition Activities • JTA is a replacement of the Initial Joint Polar-Orbiting Operational Satellite System (IJPS) agreement and is designed to cover only the NPP mission. • IJPS started with NOAA-N and covers the MetOp series. JTA and IJPS are cooperative efforts between NOAA and EUMETSAT to provide and improve the operational meteorological and environmental forecasting and global climate monitoring services worldwide. • The JPSS J1 and J3 data availability will be covered by the Joint Polar-Orbiting Operational Satellite System (JPS) agreement.
NPP/JPSS • NPP and JPSS, a joint NOAA/NASA effort, are the next series of polar-orbiting satellites dedicated to, among other things, operational meteorology. The objective of the JPSS mission is to ensure the continuity, improvement, and availability of operational observations from an afternoon polar orbit (13:30 local time). • Meteorological/Climatological Instrument packages on NPP/JPSS: • CrIS, ATMS, VIIRS, OMPS, CERES • NPP is the first of 3 missions, with a launch date of October 28, 2011.
Project Background – NDE • Disseminate NPP/JPSS Data Records to customers. • Generate and disseminate tailored NPP/JPSS Data Records (versions of NPP/JPSS Data Records in previously agreed alternative formats and views). • Generate and disseminate NOAA-unique products (augmented environmental products constructed from NPP/JPSS Data Records). • Deliver NOAA-unique products, product processing elements, and associated metadata to CLASS for long-term archiving. • Provide services to customers, including NDE product training, product enhancement, and implementation support across NOAA. • Provide software for NPP/JPSS Data Record format translation and other data manipulations.
N4RT Project Objectives • To build a software package that will tailor JPSS and NDE products from NetCDF4 into BUFR and GRIB2 formats in support of NDE’s overall tailoring efforts. • The NetCDF4 Reformatting Toolkit (N4RT) must be designed so it can easily be modified/expanded to incorporate the tailoring of new products. • Flexible • Extendable • The software must be able to run within the NDE system architecture and operate within the NDE functional guidelines. • Output product formats and content must meet the needs of NOAA customers.
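To illustrate the flexible/extendable design goal, the sketch below shows one possible shape for a thin dispatch layer between a generic NetCDF4 reader and a product-specific BUFR encoder. This is a minimal sketch only, not the N4RT code: it assumes the Unidata netCDF4 and ECMWF ecCodes Python packages purely as stand-ins for the toolkit's reader and encoder layers, and the variable name, BUFR descriptor, and file paths are hypothetical.

```python
# Minimal sketch of a reader/encoder dispatch layer -- NOT the N4RT implementation.
# Assumes the netCDF4 and eccodes Python packages; the variable name, BUFR
# descriptor, and file paths are hypothetical placeholders.
from netCDF4 import Dataset
import eccodes

def read_variable(nc_path, var_name):
    """Generic NetCDF4 reader: return one variable from a granule."""
    with Dataset(nc_path) as nc:
        return nc.variables[var_name][:]

def encode_bufr_scalar(value, out_path):
    """Minimal BUFR writer: encode a single brightness-temperature value."""
    ibufr = eccodes.codes_bufr_new_from_samples("BUFR4")
    eccodes.codes_set(ibufr, "numberOfSubsets", 1)
    # 0-12-163 (brightness temperature) stands in for the full descriptor
    # sequence a real product template would carry.
    eccodes.codes_set(ibufr, "unexpandedDescriptors", 12163)
    eccodes.codes_set(ibufr, "brightnessTemperature", float(value))
    eccodes.codes_set(ibufr, "pack", 1)  # encode the data section
    with open(out_path, "wb") as f:
        eccodes.codes_write(ibufr, f)
    eccodes.codes_release(ibufr)

# Extendability: supporting a new product means registering another
# (input variable, encoder) pair rather than rewriting existing code.
PRODUCTS = {
    "atms_bt": ("BrightnessTemperature", encode_bufr_scalar),
}

def tailor(product, nc_path, out_path):
    var_name, encoder = PRODUCTS[product]
    data = read_variable(nc_path, var_name)
    encoder(data.ravel()[0], out_path)
```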
N4RT Project Objectives – Products by Phase • Phase 1 SDR • CrIS Radiances (BUFR) • ATMS Radiances (BUFR) • VIIRS Radiances (BUFR) • Phase 1 EDR • Sea Surface Temperature (BUFR) • Aerosol Optical Thickness (BUFR) • Nadir Profile and Total Column Ozone (BUFR) • Phase 2 EDR • Polar Winds (BUFR) • Green Vegetation Fraction (GRIB2) • Phase 3 EDR • OMPS Limb Profiles (BUFR) • AVHRR, GOES, VIIRS Cloud Products (GRIB2)
Integrated Product Team • IPT Lead: Walter Wolf (STAR) • IPT Backup Lead: AK Sharma (OSPO) • NESDIS team: • STAR: Walter Wolf, Jaime Daniels, Yi Song, Thomas King, Kexin Zhang, Larisa Koval • NDE: Jim Silva, Geof Goodrum, Richard Sikorski, Kevin Berberich • OSPO: Dave Benner, AK Sharma, Ricky Irving • OSD: Tom Schott • User team • Lead: Jim Heil (NWS); Stephen Lord (NWS/NCEP/EMC), John Derber (NWS/NCEP/EMC), Jeff Ator (NWS/NCEP/NCO), Lars Peter Riishojgaard and Jim Yoe (JCSDA), Simon Elliott (EUMETSAT), Tony McNally (ECMWF), Fiona Hilton (UK Met Office) • Others: International NWP users, NWP FOs, Climate Users • Product Oversight Panel: ZPOP, EPOP, ICAPOP, CAL/NAVPOP
Project Stakeholders • NOAA National Weather Service • Weather Forecast Offices • National Centers for Environmental Prediction • Department of Defense • NRL • FNMOC • AFWA • Global NWP • EUMETSAT • ECMWF • UK Met Office • Météo-France • CMC • JMA • BOM
Plan of Operations – Phase 1 • Year 1 – Design and Development • Evaluate the requirements; work with NDE • Discuss with the current developers of similar translators to determine what is required in their output files • Design the NetCDF4 reformatting toolkit; distribute to OSPO and NDE for review • Conduct PDR • Create generic NetCDF4 readers and writers • Develop BUFR tables and GRIB formats with the product teams for Phase 1 products • Work with NDE to determine the interface between the SDR and EDR NPP products and the reformatter • Conduct CDR
Plan of Operations – Phase 1 • Year 2 – Transition to Pre-Operations of Phase 1 Products • Set up infrastructure to implement the readers and writers for the data formats; work with NDE to determine the interface to the data handling system; make available to OSPO for review • Implement BUFR tables and GRIB formats for the Phase 1 products on the NDE hardware; work with NDE and OSPO to evaluate the implementation • Conduct Test Readiness Review for Phase 1 products • Transition Phase 1 product reformatters to pre-operational system on the NDE hardware • Test system within the NDE environment • Prepare Documentation • Conduct Code Review for Phase 1 products • Year 3 – Transition to Operations of Phase 1 Products • Evaluate with NDE and OSPO the implementation of the Reformatter within the NDE data handling system • Conduct Algorithm Readiness Reviews (separate reviews for Phase 1 SDR and EDR products) • Transition pre-operational Phase 1 product reformatting system to operations
Plan of Operations – Phase 2 • Year 1 – Design and Development • Evaluate the requirements; work with NDE • Discuss with the current developers of similar translators to determine what is required in their output files • Develop BUFR tables and GRIB formats with the product teams for Phase 2 products • Conduct CDR • Implement BUFR tables and GRIB formats for the Phase 2 products on the NDE hardware; work with NDE and OSPO to evaluate the implementation
Plan of Operations – Phase 2 • Year 2 – Transition to Pre-Operations of Phase 2 Products • Conduct Test Readiness Review for Phase 2 products • Transition Phase 2 product reformatters to pre-operational system on the NDE hardware • Test system within the NDE environment • Prepare Documentation • Conduct Code Review for Phase 2 products • Year 2 – Transition to Operations of Phase 2 Products • Evaluate with NDE and OSPO the implementation of the Reformatter within the NDE data handling system • Conduct Algorithm Readiness Review for Phase 2 products • Transition pre-operational Phase 2 product reformatting system to operations
Plan of Operations – Phase 3 • Year 1 – Design and Development • Evaluate the requirements; work with OSPO • Discuss with the current developers of similar translators to determine what is required in their output files • Develop BUFR tables and GRIB formats with the product teams for Phase 3 products • Year 2 – Transition to Pre-Operations of Phase 3 Products • Implement BUFR tables and GRIB formats for the Phase 3 products on the OSPO hardware; work with OSPO to evaluate the implementation • Conduct Test Readiness Review for Phase 3 products • Transition Phase 3 product reformatters to pre-operational system on the OSPO hardware • Test system within the OSPO environment • Prepare Documentation • Conduct Code Review for Phase 3 products • Evaluate with OSPO the implementation of the Reformatter within the OSPO data handling system • Conduct Algorithm Readiness Review for Phase 3 products • Transition pre-operational Phase 3 product reformatting system to operations
Project Timeline (1) PDR 04/28/2009 CDR 09/29/2009
Project Timeline (2) TRR 10/25/2011 ARR 11/28/2011 DAP 1 Delivery 11/30/2011
Project Timeline (4) TRR 01/23/2012 SCR 03/5/2012 ARR 05/30/2012 DAP 2 Delivery 06/20/2012
Project Timeline (6) TRR 07/18/2013 SCR 08/26/2013 ARR 03/10/2014 DAP 3 Delivery 03/31/2014
Project Plan – Schedule • Schedule (Milestones) • Project begins - 7/1/08 • PDR - 4/14/09 • CDR - 9/14/09 • Phase 1 SDR TRR/CTR - 10/25/11 • Phase 1 SDR ARR - 11/15/11 • Phase 1 SDR DAP Delivery - 11/30/11 • Phase 1 & 2 EDR TRR/CTR - 1/23/12 • Phase 1 & 2 EDR SCR - 3/5/12 • Phase 1 & 2 EDR ARR - 5/30/12 • Phase 1 & 2 EDR DAP Delivery - 6/4/12 • Phase 3 TRR/CTR - 7/18/13 • Phase 3 SCR - 8/26/13 • Phase 3 ARR - 3/10/14 • Phase 3 EDR DAP Delivery - 3/31/14
N4RT TRR Entry Criteria • CDR Report (Review Item Disposition) http://www.star.nesdis.noaa.gov/smcd/spb/iosspdt/qadocs/N4RT/NetCDF4_Reformatting_Toolkit_CDR_Review_Item_Disposition_20110818.xlsx • PDR Risks and Actions • CDR Risks and Actions • CDR Actions and Comments • Updated CDR Presentation http://www.star.nesdis.noaa.gov/smcd/spb/iosspdt/qadocs/N4RT/N4RT_CDR_20090914.ppt • Updated Requirements Allocation Document http://www.star.nesdis.noaa.gov/smcd/spb/iosspdt/qadocs/N4RT/N4RT_RAD_v1.2.docx • Review of N4RT http://www.star.nesdis.noaa.gov/smcd/spb/iosspdt/qadocs/N4RT/N4RT_TRR_Phase1_SDR_Products.pptx • Requirement Allocation • Quality Assurance • Software Architecture • Unit Tests and Results • Delivered Algorithm Package
N4RT TRR Exit Criteria • Test Readiness Review Report • The TRR Report (TRRR), a standard artifact of the STAR Enterprise Product Lifecycle (EPL), will be compiled after the TRR • The report will contain: • Review Item Disposition containing all risks, actions and comments • Updated TRR presentation
Review Objectives • Review the CDR Review Report (CDRR) • Focus on actions • Review the Requirements Allocation • Review the software system architecture • External interfaces and data flows • Dependency Tree Document • Review the software tests and results of each software unit • Identify risks and actions
Review Outline • Introduction • CDR review report/actions • Requirements • Quality Assurance • Software architecture • Unit Tests • Delivered Algorithm Package • Risks and Actions Summary • Summary and Conclusions
Section 2 – CDR Review Report and Actions Presented by Thomas King, Riverside
CDR Reports and Actions • Open PDR Risks and Actions at CDR. • CDR Risks and Actions. • Later in the review, new risks, actions, and comments originating from the CDR, and the period since, will be presented.
CDR Review Report • A CDR Report (CDRR) is produced following a project’s Critical Design Review (CDR). It is a required project artifact of the STAR Enterprise Product Lifecycle (EPL). Specifically, it is a designated artifact for the Test Readiness Review (TRR), which is a standard Technical Review in the STAR EPL process. The intended target audiences are program management, the product development team, and the TRR review team. • A CDR Review Report is available for review in the project repository at: • http://www.star.nesdis.noaa.gov/smcd/spb/iosspdt/qadocs/N4RT/NetCDF4_Reformatting_Toolkit_CDR_Review_Item_Disposition_20110818.xlsx
PDR Risk • Risk 1: JPSS product formats and content are still changing, especially for VIIRS • Assessment: Low • Impact: May need to revise software several times during development to adjust to new formats, names, and types. • Likelihood: High • Mitigation: • Work through the Data Format Working Group to obtain information on format and algorithm updates. Monitor the latest copies of the Common Data Format Control Books (CDFCB) for any updates. Maintain contact with customers to inform them of any upstream product changes. Make the code design flexible so that changes in the upstream products translate into the minimum amount of code revision. • Status: Closed. We recommend closing this as CDFCB formats are now frozen and the reformatter is able to read all necessary data from the P72 datasets to which NDE subscribes.
PDR Risk • Risk 2: The roles and responsibilities regarding who shall generate the set of required SPSRB documents for NDE have not yet been decided. • Assessment: Low • Impact: Difficult to budget the time needed for the team to generate documentation. • Likelihood: Moderate • Mitigation: This issue, and that of document content, is being worked by Maurice McHugh and STAR, NDE, OSD, and OSPO personnel. Reformatting Toolkit developers intend to participate in these meetings and discussions. • Status: Closed. We recommend closing this. The updated SPSRB documentation has defined roles and responsibilities for writing these documents on the STAR side. At TRR, a new project-level risk was opened to address NDE needing to modify its contract to complete the delivered documentation in time for transition to OSPO.
PDR Risk • Risk 3: There are small variations in the types of platforms and the versions of the compilers • Assessment: Low • Impact: May obtain different results using different compilers. • Likelihood: Low • Mitigation: Reformatting Toolkit developers will work with NDE during system tests in the integration and production phase to ensure that those results match the results from the unit tests. The NDE Build Content Reviews and delivery of DAP prototypes should help to resolve these issues early in development. • Status: Open
PDR Risk • Risk 4: Data format translation may involve some unit conversion and possible reduction of precision (significant digits) • Assessment: Low • Impact: Any modification of the data from its original form may not be apparent to the user. • Likelihood: Low • Mitigation: Document data manipulations in the NDE Delivered Algorithm Package (in place of ATBD). This will be done in the System Maintenance Manual (SMM). • Status: Open
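To make the precision point concrete: BUFR stores each element as a scaled, offset integer of fixed bit width, so digits finer than the table-defined scale are lost in the round trip. The arithmetic below is a small illustrative example; the scale, reference value, and bit width shown are assumptions typical of a brightness-temperature descriptor, not values taken from the N4RT BUFR tables.

```python
# Illustrative BUFR-style packing arithmetic (assumed parameters, not the N4RT tables):
# scale = 2 (0.01 K steps), reference value = 0, data width = 16 bits.
scale, reference, width = 2, 0, 16

def pack(value):
    """Integer actually stored in the BUFR data section for a physical value."""
    return round(value * 10**scale) - reference

def unpack(packed):
    """Physical value a BUFR decoder recovers from the stored integer."""
    return (packed + reference) / 10**scale

bt = 287.123456            # brightness temperature read from the NetCDF4 granule (K)
stored = pack(bt)          # 28712; must fit in 16 bits (0..65535)
assert 0 <= stored < 2**width
print(unpack(stored))      # 287.12 -- digits beyond 0.01 K are gone
```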
PDR Risk • Risk 6: Risk on NDVI and snow mask. Have not yet demonstrated the ability to encode GRIB2 files. • Assessment: Moderate • Impact: Failure to meet the requirement to have demonstrated GRIB2 tailoring capability. • Likelihood: Low • Mitigation: NDVI and snow mask will not be produced. However, GVF will need to be in GRIB2 format, a capability the toolkit will not have until the Phase 1 EDR DAP. This capability will be demonstrated at the Phase 1 EDR & Phase 2 EDR products TRR scheduled for 1/23/2012. • Status: Open.
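For orientation only, the sketch below shows what encoding a gridded field such as GVF into GRIB2 could look like. It uses the ECMWF ecCodes Python bindings as a stand-in; the toolkit's actual GRIB2 encoder is not specified here, and the sample template, parameter short name, and output file name are hypothetical.

```python
# Illustrative GRIB2 encoding sketch -- not the N4RT GRIB2 writer.
# Assumes the numpy and eccodes Python packages; the parameter and file name are placeholders.
import numpy as np
import eccodes

def encode_grib2(field_value, out_path):
    """Encode a constant field onto the sample's default regular lat/lon grid."""
    gid = eccodes.codes_grib_new_from_samples("regular_ll_sfc_grib2")
    npoints = eccodes.codes_get(gid, "numberOfValues")
    eccodes.codes_set(gid, "shortName", "2t")  # stand-in parameter; GVF would need its own GRIB2 entry
    eccodes.codes_set_values(gid, np.full(npoints, field_value))
    with open(out_path, "wb") as f:
        eccodes.codes_write(gid, f)
    eccodes.codes_release(gid)

encode_grib2(0.42, "gvf_sketch.grib2")
```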
CDR Risk • Risk 14: STAR and NDE CM need to collaborate to discuss formats, defining fields, and the process for delivering DAPs with an official naming convention. • Assessment: Low • Impact: Confusion and delays caused by not having common naming conventions and not having a standard DAP delivery process defined. • Likelihood: Low • Mitigation: STAR and NDE CM need to collaborate to discuss formats, defining fields, and the process for delivering DAPs with an official naming convention. • Status: Closed. We recommend closing as all of this is covered in the NDE document entitled “Algorithm Delivery Standards, Integration, and Test Version 1.3”.
CDR Risk • Risk 15: Need to identify an independent code review • Assessment: Low • Impact: Reduction in the quality of the delivered software. • Likelihood: Low • Mitigation: Document the plan for an independent code review. Understand that the NESDIS QA process is being defined. • Status: Closed. We recommend closing this as the project schedule has TRR/CTRs for each phase and has added software reviews (code walk-throughs) for all future phases. OSPO and NDE are invited to participate in these reviews, and OSPO will be expected to participate in the software reviews. The question is how much effort groups outside STAR can afford to invest given current budgets. If the reviewers agree this is sufficient, this risk can be closed.
CDR Risk • Risk 16: EMC is concerned that the conversion of HDF5 to netCDF4 may add significant time to the delivery of the products. • Assessment: Low • Impact: Increase in product latency and failure to meet customer needs. • Likelihood: Low • Mitigation: Need to conduct tests to verify that the conversion times meet latency requirements. • Status: Closed. We recommend closing this given that the tests shown here demonstrate that the conversion times are sufficient to maintain latency.
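The mitigation above calls for timing tests; the sketch below is one assumed way such a check could be scripted, using the h5py and netCDF4 Python packages with hypothetical file and dataset names. It is not the NDE conversion tool.

```python
# Hypothetical latency check for one HDF5 -> netCDF4 variable copy.
# Assumes the h5py and netCDF4 Python packages; file and dataset names are placeholders.
import time
import h5py
from netCDF4 import Dataset

def time_conversion(h5_path, h5_dataset, nc_path):
    start = time.perf_counter()
    with h5py.File(h5_path, "r") as h5:
        data = h5[h5_dataset][:]                      # read the HDF5 dataset
    with Dataset(nc_path, "w", format="NETCDF4") as nc:
        for i, n in enumerate(data.shape):            # mirror the array shape
            nc.createDimension(f"dim{i}", n)
        var = nc.createVariable("radiance", data.dtype,
                                tuple(f"dim{i}" for i in range(data.ndim)))
        var[:] = data                                 # write into netCDF4
    return time.perf_counter() - start

elapsed = time_conversion("granule.h5", "/All_Data/Radiance", "granule.nc")
print(f"conversion took {elapsed:.2f} s")             # compare against the latency budget
```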
CDR Risk • Risk 17: Encourage NWS (EMC, AWIPS) personnel to take part in future reviews and major meetings, as well as working-level meetings. JCSDA and EMC were lightly represented at the CDR. • Assessment: Low • Impact: Changing user needs may not be regularly communicated to the developers. • Likelihood: Low • Mitigation: May need to establish a working group that facilitates this communication. • Status: Closed. We recommend closing this as regular monthly EMC/NDE/STAR meetings are held (Kevin Berberich). In addition, EMC and JCSDA representatives are always invited to this project’s reviews. Tom Schott’s VIIRS TIM helped resolve many remaining VIIRS SDR BUFR issues for Phase 1.
CDR Risk • Risk 18: ESPC may not have the BUFR expertise to maintain the toolkit. • Assessment: Low • Impact: OSPO may not be able to maintain the toolkit in the future. • Likelihood: Low • Mitigation: OSPO is getting FY12 funding to support the transition from NDE and the maintenance on the OSPO side. • Status: Open. At TRR it was decided that this is a project-level risk that should remain open. The N4RT project plan anticipates that STAR will provide upgrades and maintenance through 2014.
CDR Risk • Risk 19: Should OMPS NP and TC be in the same BUFR file? • Assessment: Low • Impact: The flexible FOV for OMPS NP would pose a problem for the converter if the two were combined. • Likelihood: Low • Mitigation: These will be in separate files. • Status: Closed. We recommend closing this.
Summary • There are 11 risks: • 5 from PDR • 6 from CDR • We recommend closing: • 2 of the PDR risks. • 5 of the CDR risks. • 4 risks will remain open.
Review Outline • Introduction • CDR review report/actions • Requirements • Quality Assurance • Software architecture • Unit Tests • Delivered Algorithm Package • Risks and Actions Summary • Summary and Conclusions
Section 3 – Requirements Presented by Thomas King, Riverside
Requirements Overview • SPSRB requirements were presented to the developers in a document entitled “Level 1 Requirements for a NetCDF4 Reformatting Tool” (Version 1.5). • Product requirements have been added to those from the SPSRB and are presented here as well. These additional requirements were obtained in a series of meetings between the developers, the customers (EMC/JCSDA and EUMETSAT), and the heritage product teams. • Using all of this information, a Requirements Allocation Document (RAD) has been generated for the Reformatting Toolkit project. • All current Phase 1 and Phase 2 requirements are listed, but we will focus mainly on those for Phase 1 SDR. • Text highlighting: • Yellow indicates basic requirements • Orange indicates new, modified, or removed requirements since CDR.