
  1. Upgrades to the Advanced Dvorak Technique (ADT) product
Critical Design Review (CDR)
August 23, 2017
Prepared by: Liqun Ma¹, Chris Velden², Tim Olander², Jeff Key³, Michael Turk¹
¹NOAA/NESDIS/OSPO  ²UWisc-CIMSS  ³NOAA/NESDIS/STAR

  2. Review Outline • Introduction • Requirements • Software Architecture • Quality Assurance • Risks and Actions • Summary and Conclusions

  3. Section 1 – Introduction
Presented by Chris Velden (UW-CIMSS) and Liqun Ma (NOAA/NESDIS/OSPO)

  4. ADT Background • The ADT is a computer-based technique, developed at the University of Wisconsin-Madison/Cooperative Institute for Meteorological Satellite Studies (UW/CIMSS), to objectively determine tropical cyclone (TC) intensity using operational geostationary satellite infrared imagery. The ADT operates globally and can classify TC intensity from storm formation, through development and maturation, to dissipation. • The ADT is patterned after the Subjective Dvorak Enhanced Infrared ‘EIR’ Technique (SDT) (Dvorak, 1975, 1984), which makes use of various pattern identification schemes and rules to determine TC intensity. The ADT was originally developed to closely mimic the SDT methodology in terms of intensity determination protocol and the incorporation of various rules and analysis methods. Some of the original SDT rules have subsequently been modified in the ADT as determined by in-depth statistical analysis of ADT performance during application. In addition, selected intensity determination modules within the ADT have adopted regression-based methods, further advancing the ADT beyond the scope of the original SDT procedures and capabilities, as well as previous versions of the ADT: the Objective Dvorak Technique (ODT) (Velden et al., 1998) and the Advanced Objective Dvorak Technique (AODT) (Olander et al., 2004).

  5. ADT Background • Occasionally-upgraded versions of the ADT have been run in real-time experimental mode at UW-CIMSS since 2005. • The ADT was implemented into NESDIS operations in 2007 and has run there since, with occasional upgrades provided by UW-CIMSS. • Major ADT upgrade (V8.1.4) implemented into NESDIS operations in 2013. • Major ADT upgrade (V8.2.1) implemented into NESDIS operations in 2016. • Next expected major upgrade: 2018-19.

  6. ADT Project Objectives Next ADT upgrade — summary: • Upgrade ADTv8.2.1 to ADTv9.0 • Will incorporate new version of ARCHER (automated, multi-satellite-based objective TC center fix algorithm) -- Will operate with GPM and GCOM MW inputs • Capability to process full resolution Himawari-8 and GOES-R data • Improved analysis of Sub-Tropical (ST) designated cyclones • Improved analysis of Extratropical Transition (ET) events • Will include estimates of TC surface wind radii (max, R34/50/64 nmi.)

  7. Project Stakeholders • IPT Lead: Jeff Key (STAR/ASPB) • IPT Backup Lead: Liqun Ma (OSPO) • NESDIS Team: • STAR: Jeff Key • OSPO: Liqun Ma, Wayne Mackenzie, Limin Zhao • JPSS: Arron Layns • GOES-R: Jamese Sims, Steve Goodman • NCEI: N/A • Others: Chris Velden (UWisc.-CIMSS), Tim Olander (UWisc.-CIMSS), Yufeng Zhu (GST) • User Team • Lead: John Beven (NHC) • Others: Brian Strahl (JTWC), Michael Turk (OSPO/SAB) • Oversight Panel (OP) lead: ICAPOP

  8. Plan of Operations
Jul 16: Development Phase begins; IPT Lead informed to begin product development
Jul 16: Develop GMI, AMSR2 data stream at CIMSS
Aug 16: Start running the prototype ADT V9.0 internally at CIMSS
Sep 16: Quality Monitoring Concept defined
Sep 16: Long-term Maintenance Concept defined
Feb 17: Run prototype ADT V9.0 at CIMSS and post results to the web; coordinate with JTWC/NHC/CPHC and SAB
Mar 17: Development processing system defined
Jun 17: Revise code accordingly
Aug 17: Preliminary Design Review/Critical Design Review
Aug 17: Run ADT V9.0 code in real-time demonstration mode at CIMSS
Dec 17: Get prototype ADT V9.0 evaluation/feedback from NHC/JTWC/CPHC
Feb 18: Code is prepared for implementation at NSOF
Mar 18: Operational and backup processing defined
May 18: Algorithm Readiness Review
Jun 18: Pre-operational Phase begins
Jun 18: Software Code Review
Jul 18: Run ADT V9.0 algorithm at NSOF on test cases
Aug 18: Implement monitoring code; integrate monitoring with web page design
Sep 18: Operational and backup processing capabilities in place
Nov 18: Final IT Security Concept defined
Jan 19: Run ADT V9.0 at NSOF on real-time cases in experimental mode; evaluate vs. CIMSS output
Mar 19: Pre-operational product output evaluated and tested
Apr 19: Operational Readiness Review
May 19: Brief SPSRB that algorithm is ready to go operational
Jun 19: Operational Phase begins; SPSRB declares product operational; SPSRB secretaries/manager update the SPSRB product metrics web pages

  9. CDR Entry Criteria • Requirements Allocation Document: waived • CDR Presentation: ftp://satepsanone.nesdis.noaa.gov/tropics/

  10. CDR Exit Criteria • Critical Design Review Report • The CDR Report (CDRR), a standard artifact of the STAR Enterprise Process Lifecycle (EPL), will be compiled after the CDR • The report will be a critical artifact for the Algorithm Readiness Review • The report will contain: • Review Item Disposition containing all risks, actions and comments • Updated CDR presentation

  11. Review Outline • Introduction • Requirements • Software Architecture • Quality Assurance • Risks and Actions • Summary and Conclusions

  12. Section 2 – Requirements
Presented by Liqun Ma (OSPO)

  13. Requirement (1) • 1412-0015, “Continue operational transition and upgrade support of the ADT,” submitted by Jack Beven, NHC. NOAA/NWS/NCEP/NHC and CPHC, and the Joint Typhoon Warning Center (JTWC), jointly request the continued support by NESDIS/SAB of the Advanced Dvorak Technique (ADT), to help meet the needs of the nation’s tropical cyclone warning program. The ADT (a computer-based objective algorithm to estimate tropical cyclone (TC) intensity) is an operational algorithm that provides crucial objective intensity estimates to augment the subjective classifications performed by satellite analysts. The NHC/CPHC/JTWC strongly endorse the continued development of the ADT, and request NESDIS to continue the operational transition and follow-up upgrade implementations in conjunction with the ADT developers at UW-CIMSS. • JPSS L1RDS-2260: The JPSS shall support modifications to ESPC blended products • JERD 2335: The NESDIS ESPC shall incorporate JPSS and GCOM-W data into the blended satellite products identified in Appendix D.

  14. Requirement (3) • Monitoring/Distribution/Archive/Maintenance remain the same (no change)

  15. Review Outline • Introduction • Requirements • Software Architecture • Quality Assurance • Risks and Actions • Summary and Conclusions

  16. Section 3 – Software Architecture
Presented by Tim Olander (UW-CIMSS)

  17. ADT Architecture • Hardware Environment • Data Files • Input Files • Static/Ancillary Files • Output Files • Run Files • Software Description • External Interfaces • System Level • Unit Level

  18. Research and Operational System Environments

  19. ADT System Requirements

  20. ADT Upgrade Development Hardware • Unit tests and input data ingesting will be conducted on the ADT development machine at CIMSS.

  21. ADT Upgrade Test Product Distribution • CIMSS webserver - ADT test products will be available on a CIMSS webserver: http://tropic.ssec.wisc.edu/misc/adt

  22. ADT Upgrade Integration and Production Hardware • NDE Development server – The new ADT system testing and integration will be conducted on the NDE development server.

  23. ADT Upgrade Integration and Production Hardware • NDE Test/Production Hardware - After successful system tests, our understanding is that NDE plans to check the code into configuration management and then promote it to the Test/Production machine. This machine will be located at NSOF.

  24. ADT Data • The tables on the following slides show all the ADT input and output. • All output will be in ASCII, GIF, and ATCF format.

  25. ADT Input Data Files

  26. ADT Ancillary/Static Data Files (continued)

  27. ADT Output Data Files

  28. ADT Output Data Files (continued)

  29. ADT Output Data Files: Routine processing output

  30. ADT Output Data Files: PMW Eye Score processing output

  31. ADT Output Data Files: ARCHER processing output

  32. ADT Output Data Files: ADT 2D Surface Wind Field processing output

  33. ADT Output Data Files: Users

  34. ADT Resource and Log Files

  35. ADT Upgrade Software: External Interfaces • The ADT system will be run via the execution of two driver scripts that will be invoked, monitored, and managed by the NDE DHS Product Generation Manager. The two driver scripts are ADT and ARCHER • Execution of the scripts will be at specific intervals: ADT runs every 30 minutes; ARCHER runs every 60 minutes • The driver scripts will require PCF files containing parameters such as: • The input and output file names • Other information about how to run ADT under the NDE environment (to be discussed with the STAR ASSISTT and NDE integration teams).
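
As a rough illustration of the PCF-driven pattern described in this slide, a driver script might read its parameters like the minimal sketch below. The key names (SAT_IMAGE, OUT_DIR) and paths are hypothetical illustrations, not the actual NDE PCF schema.

```shell
#!/bin/sh
# Sketch of PCF parsing inside a driver script. The key names and
# file contents below are illustrative assumptions only.

PCF=adt_sample.pcf

# Write a tiny sample PCF so the sketch is self-contained.
cat > "$PCF" <<'EOF'
SAT_IMAGE=/data/goes16_ir.nc
OUT_DIR=/tmp/adt_out
EOF

# Read one "key=value" entry from the PCF.
pcf_get() {
    grep "^$1=" "$PCF" | head -n1 | cut -d= -f2-
}

SAT_IMAGE=$(pcf_get SAT_IMAGE)
OUT_DIR=$(pcf_get OUT_DIR)
echo "image: $SAT_IMAGE"
echo "output dir: $OUT_DIR"
```

A real PCF delivered by the Product Generation Manager would carry many more parameters, but the parse-then-run shape would be the same.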

  36. ADT Upgrade Software: External Interfaces • The driver scripts will run, parse the PCF content, run the compiled ADT algorithm code, handle program output and errors, direct required error codes to the DHS via an output log (and through the driver script's return code), and generate a PSF. • If there are errors, our understanding is that NDE plans to save the contents of the run in a forensics repository. • Our understanding is that NDE plans to manage and direct error status to the operators from the DHS system. • Our understanding is that NDE plans to manage all distribution through the PDA.
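
The run/log/return-code pattern described above could be sketched as follows. The step shown ("adt_main placeholder step") and the product name in the PSF are placeholders, not the real ADT executables or outputs.

```shell
#!/bin/sh
# Sketch of a driver's error-handling pattern: every step appends to a
# single log, and a nonzero status is propagated to the DHS through the
# script's return code. "adt_output.txt" is a placeholder product name.

LOG=adt_run.log
PSF=adt_run.psf
: > "$LOG"

run_step() {
    # Run one step, capture stdout/stderr in the log, and abort the
    # whole run (with the step's status) on failure.
    "$@" >> "$LOG" 2>&1 || {
        status=$?
        echo "ERROR: step '$1' exited with status $status" >> "$LOG"
        exit $status
    }
}

run_step echo "adt_main placeholder step"

# On success, record the generated products in a PSF for the DHS.
echo "adt_output.txt" > "$PSF"
```

Returning the failing step's status as the script's own exit code is what lets the DHS Product Generation Manager detect the failure and route the run to a forensics repository.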

  37. ADT Upgrade: External Interfaces

  38. ADT Software: System Level • The main ADT driver script is a single ksh script that acts as a wrapper for the compiled ADT code. • All of the basic directory pathnames are defined with the export command in a separate file (adtenv) sourced by the script • All standard output and standard error messages are directed to a single log file that can be read to obtain any error or warning messages. • Certain storm-specific output is directed to a separate and distinct output file (such as the PMW score log file)
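
The adtenv/export pattern might look like this minimal sketch; the variable names and directories are illustrative, not the operational ones.

```shell
#!/bin/sh
# Sketch of the adtenv/export pattern: a separate environment file
# holds all base pathnames, and the driver sources it so every child
# process inherits them. Names and paths here are assumptions.

cat > adtenv <<'EOF'
export ADT_HOME=/tmp/adt
export ADT_LOG_DIR=/tmp/adt/logs
EOF

# Source adtenv in the driver's own shell.
. ./adtenv

mkdir -p "$ADT_LOG_DIR"

# All stdout/stderr from the run goes to one log file.
LOG="$ADT_LOG_DIR/adt.log"
echo "ADT run started" > "$LOG"
```

Keeping the pathnames in one sourced file means a move to a new host (e.g. from CIMSS to NSOF) only requires editing adtenv, not the driver script.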

  39. ADT Software: System Level • The list of current, active tropical cyclones to process is located in an external system file generated completely separately from ADT processing (and not part of the ADT processing package) • The cron file will execute the following scripts: • Run ARCHER processing script ARCHERmain.scr • Run in cron every hour • Will write all output to a time-stamped output file for error debugging, if necessary • Will process three geostationary satellite channels (IR, SWIR, Visible), two MW channels (85-92 GHz and 37 GHz), and ASCAT scatterometer data (optional) • Will output consolidated analysis output to a date/time-stamped file for display on the ARCHER processing page. Graphical displays of each analysis are also created. • Will determine the best storm position from the choices above and append it to the storm log file • A final ‘best position’ map is created versus the current Best Track

  40. ADT Software: System Level • Run main ADT processing script runADT.scr • Will run every 30 minutes • Will write all output to a time-stamped output file for error debugging, if necessary • A script is executed to determine which current storms are classified as Subtropical • Will create a separate system file containing the storms ADT will process (Subtropical and Tropical systems) • ARCHER does not run on Subtropical systems; the extrapolated Best Track is used as the first guess for the storm center • Obtain external FSU Phase Space files and run ET determination scripts • Download latest PMW files and, if new, determine whether the current storm is within the swath(s) • Run ADT derivation script • Latest PMW and ARCHER information (and ET/ST designations) is passed into ADT via the command line interface • 2D surface wind analysis is performed and output products generated • Additional ADT diagnostic output generated (History file listing and storm history plot) • ATCF file generated for current ADT analysis and distributed to users via FTP • Derive PMW graphical analysis products and format webpage • Update storm webpage with current analysis and all diagnostic products (including ARCHER, PMW, and 2D surface wind displays)
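
The two cron schedules described on these slides (hourly ARCHER, 30-minute ADT) could be written as crontab entries along these lines; the installation and log paths are assumptions, not the operational ones.

```
# ARCHERmain.scr: every hour, on the hour
0 * * * * /opt/adt/scripts/ARCHERmain.scr >> /opt/adt/logs/archer_cron.log 2>&1

# runADT.scr: every 30 minutes
*/30 * * * * /opt/adt/scripts/runADT.scr >> /opt/adt/logs/adt_cron.log 2>&1
```

Redirecting each job's output to its own log keeps the cron-level record separate from the per-run time-stamped files the scripts themselves write.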

  41. ADT Processing Overview
[Flowchart: get the ATCF Best Track file and define the active TCs (active storms list); determine any active subtropical (ST) systems; for each storm, run the ARCHER analysis unless the current storm is ST (producing ARCHER log files, a storm listing and graph, and ARCHER displays); get the ATCF forecast file; run the PMW analysis (PMW analysis output and products) and the ET analysis (ET log file); run ADT with the 2D surface wind analysis; repeat while another storm remains, then end ADT processing.]

  42. ADT Software: Unit Level • ADT software components • ADT ARCHER run • ADT PMW run • ADT ST/ET run • ADT Intensity run

  43. ADT ARCHER Processing Overview
[Flowchart: define the active TCs from the active storms list; read the ATCF forecast file and create the storm track file; run ARCHER on geostationary imagery, PMW data, and scatterometer data, adjusting the PMW/scatterometer data to the current time; determine the best position, append it to the ARCHER log file, create the position listing and analysis displays, then end ARCHER processing.]

  44. ADT PMW Processing Overview
[Flowchart: define the active TCs from the active storms list; read the ATCF forecast file; for each satellite, download its MW data and determine the latest overpass from all satellites; if the TC is within the PMW swath, calculate the eye score and create the PMW analysis graphics; loop over any remaining satellites and storms, then end PMW eye score processing.]

  45. ADT ET Processing Overview
[Flowchart: define the active TCs from the active storms list; for each storm, download the FSU Phase Space files, determine the current ET status, and update the current ET status file; loop over any remaining storms, then end ET processing.]

  46. ADT Algorithm Overview
[Flowchart: read the command-line inputs; if only a history-file listing or graph is requested, produce the list or graph file and end; otherwise read the topography file, the IR satellite image, and the ATCF forecast file; if ARCHER is used, read the ARCHER log file and use the extrapolated ARCHER position when the alpha score exceeds its threshold, otherwise interpolate the forecast position; if PMW is used, read the PMW score; determine the scene type, determine the intensity, and output the storm history file; if requested, output the 2D surface wind analysis; end ADT processing.]

  47. ADT Scene Type Overview
[Flowchart: if a history file is utilized, read the ADT history file and initialize variables based on previous intensity estimates; otherwise initialize variables with default values; allocate memory; perform a Fourier Transform analysis on the eye and cloud region Tb fields; derive the eye and cloud region “scene score” values; calculate cloud region convective symmetry and eye region standard deviation values; calculate the CDO size and check for an embedded center, irregular CDO, or pinhole eye; measure cloud curvature with a 10° log spiral (curved band) and the distance to shear (shear scenes); for ST storms, adjust the minimum gray scale level; classify the scene type; for eye scenes calculate the eye size, for non-eye scenes determine various eye and cloud region difference values; end.]

  48. ADT Intensity Overview
[Flowchart: calculate the initial Raw T# value based on the eye/cloud region scene types (utilizing the history file when available); if the PMW eye score adjustment is utilized, get the latest PMW eye score information and modify the current/previous Raw T# values (also modifying the current Adjusted and Final T# values); calculate the Adjusted Raw T# value (applying Dvorak Rule 8), the Final T# value (3-hour average), and the Current Intensity CI# value (applying Dvorak Rule 9); for ET storms, apply the ET adjustment; end.]

  49. ADT Software: Error Handling • All steps in ADT processing are written to various output files • Files are date/time stamped for easy examination if a processing error is encountered; this is true for both the ADT and ARCHER analyses (PMW analysis processing is contained within the ADT output files). • Storm-specific processing output is also written for examination, but is not currently date/time stamped (only the latest processing output file is retained).
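
The date/time-stamped naming convention might be implemented along these lines; the file-name prefix is an illustrative assumption, not the operational pattern.

```shell
#!/bin/sh
# Sketch of per-run, date/time-stamped output files, so a failed cycle
# can be examined later without being overwritten by the next cycle.
# The "adt_run_" prefix is an illustrative name only.

STAMP=$(date -u +%Y%m%d%H%M)
LOG="adt_run_${STAMP}.log"

echo "processing started" > "$LOG"
echo "processing finished" >> "$LOG"
```

A storm-specific file without the stamp (as the slide notes) would instead reuse one fixed name and so retain only the latest run.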
