
STAR’s Involvement with CrIMSS EDR Algorithm Optimization, Evaluation and Validation


Presentation Transcript


  1. STAR’s Involvement with CrIMSS EDR Algorithm Optimization, Evaluation and Validation
  By Murty Divakarla*, Mike Wilson*, Xiaozhen Xiong*, Changyi Tan*, Eric Maddy#, Antonia Gambacorta@, Nick Nalli+, and Flavio Iturbide* (IM Systems Group, Inc. at NOAA/STAR)
  Xu Liu and Susan Kizer (Langley Research Center)
  Degui Gu, Denise Hagan, and Xia L Ma (Northrop Grumman Aerospace Systems)
  Chris Barnet& and Tony Reale%
  % NOAA/STAR contact
  & Formerly with NOAA/STAR and currently with STC, Columbia, MD
  * Core group for CrIMSS EDR algorithm implementation, evaluation, improvements through Discrepancy Reports to JPSS, user support, and data source for Focus-Day(s) correlative data sets
  # STAR in-house Aqua-AIRS retrieval key consultant and data sets
  @ Initial help on CrIS/ATMS bias-tuning data set
  + Coordinator for AEROSE and ARM/CART dedicated RAOB campaigns
  Points of Contact: Murty.Divakarla@noaa.gov, Tony.Reale@noaa.gov

  2. STAR’s Involvement in Association with LaRC and NGAS
  • Focal point for the coordinated activities from the pre-launch era to post-launch Provisional Maturity for CrIMSS EDR algorithm improvements.
  • Provided focus-day correlative data sets to evaluate CrIMSS EDR products from the “pre-launch proxy” to the “post-launch” era.
  • Evaluated CrIMSS EDR products against ‘truth’ data sets and ‘heritage’ algorithm products, leading to Beta Maturity and Provisional Maturity.
  • Pathway for IDPS operational implementations in coordination with NGAS through DRs and CCRs to realize MX7.1 and future builds.
  • Approach
    • Leverage existing utilities and the experience gained through the AIRS and IASI validation systems.
    • Analyze ‘focus day’ correlative data sets from time to time (11/11/2011, 2/24/2012, 05/15/2012, 09/20/2012, 02/03/2013), identify issues with CrIMSS EDRs, and suggest/test/improve the CrIMSS EDR algorithm.
    • Risk reduction: consistency among the CrIMSS EDR codes (science code, IDPS operations, and ported off-line) and ADL.

  3. Utility of STAR-Supplied Focus-Day Data Sets Towards Optimization and Characterization of CrIMSS EDRs (Related Excerpts from AMS-2013 Presentations)
  • Based on the results achieved so far:
    • What updates and improvements can be made to the CrIMSS EDR algorithm?
    • Are the current data sets adequate to optimize and characterize the CrIMSS EDR algorithm?
    • What more do we need?

  4. MX7.1 vs. AIRS V6JJ, V6KK, and NUCAPS Physical-Only
  • Except for CrIMSS, all other systems use statistical regression as the first guess. A similar plot can be seen in the Provisional Maturity brief by Chris (slide 41).
  • Global ALL yield: CrIMSS 50%; NUCAPS 53%; NUCAPS-PR 57%; NUCAPS-FG 57%; AIRS V6JJ 86%.
  [Figure: T(p) RMS (K) and q(p) RMS (%) profiles comparing CrIMSS IR+MW MX7.1 (solid red), NUCAPS physical-only, NUCAPS-PR and NUCAPS-FG (solid), and AIRS V6JJ retrieval and first guess (dashed)]
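The yield and RMS statistics quoted above can be reproduced for any of these retrieval systems with a small utility. A minimal sketch, assuming (hypothetical) arrays of retrieved and reference profiles (e.g. ECMWF or RAOB truth) plus a boolean QC acceptance mask; the function name and shapes are illustrative, not from the operational code:

```python
import numpy as np

def profile_stats(t_ret, t_truth, q_ret, q_truth, accepted):
    """Per-level RMS statistics and acceptance yield.

    t_ret, t_truth : (n_profiles, n_levels) temperature [K]
    q_ret, q_truth : (n_profiles, n_levels) water vapor mixing ratio
    accepted       : (n_profiles,) boolean QC acceptance mask
    Returns (yield %, T(p) RMS in K, q(p) RMS in % of truth).
    """
    yield_pct = 100.0 * accepted.sum() / accepted.size
    # Temperature RMS per pressure level, accepted cases only
    dt = t_ret[accepted] - t_truth[accepted]
    t_rms = np.sqrt(np.mean(dt ** 2, axis=0))
    # Water vapor RMS expressed as a percentage of the reference value
    dq = (q_ret[accepted] - q_truth[accepted]) / q_truth[accepted]
    q_rms = 100.0 * np.sqrt(np.mean(dq ** 2, axis=0))
    return yield_pct, t_rms, q_rms
```

Plotting `t_rms` and `q_rms` against the pressure grid for each system yields profile comparisons of the kind shown on this slide.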

  5. CrIMSS vs. ECMWF and AIRS V6 vs. ECMWF: Matched EDRs, Global Ocean, Cloud-Cleared and Clear Cases (Murty, AMS-2013)
  • CrIMSS vs. heritage algorithm (AIRS); AIRS V6 retrievals use Pbest (assimilation).
  • Ocean cloud-cleared: N = 116,000; yield: AIRS 43% (dashed), CrIMSS 58% (solid).
  • Clear: N = 5,538; yield: AIRS 5% (dashed); the CrIMSS clear cases correspond to the AIRS clear cases.
  [Figure: T(p) RMS (K) and q(p) RMS (%) profiles; solid lines are CrIMSS IR+MW cloud-cleared and clear, dashed lines are AIRS V6 cloud-cleared and clear]

  6. MX7 T(p), q(p) RMS by Scene Classification, Global, 05/15/2012
  • Scene cloudiness QC(9) flag: 0 = cloud free (0% clouds); 1 = some cloud in the scene (between 0% and 100%, and no more than three layers).
  • Global ALL (MX7): N = 318,000; yield: MX7 ALL 50%; cloudy scenes 50%; clear scenes 45% (10% of the total sample, compared to 3% of the AIRS retrievals).
  • CrIMSS cloud-free cases (10%) are slightly more frequent than expected; improving the delineation of cloud-free cases in scene classification can improve overall retrieval statistics.
  • Cloud-clearing is very sensitive to the MW first guess. Improvements to the first guess and new methodologies will improve overall retrievals.
  [Figure: T(p) and q(p) RMS profiles for the CrIMSS IR+MW MX7 global, cloudy-scene, and cloud-free subsets]
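The per-subset yields on this slide come from binning retrievals by the scene cloudiness QC flag. A minimal sketch of that binning, with assumed field names (the flag semantics 0 = cloud free, 1 = some cloud follow the definition above):

```python
import numpy as np

def cloudiness_subsets(qc_flag, accepted):
    """Acceptance yield (%) for each scene-cloudiness subset.

    qc_flag  : (n,) integer scene-cloudiness QC flags (0 or 1)
    accepted : (n,) boolean retrieval-acceptance mask
    """
    subsets = {
        "all": np.ones_like(accepted, dtype=bool),
        "cloud_free": qc_flag == 0,   # 0% clouds in the scene
        "some_cloud": qc_flag == 1,   # partial cloud, <= 3 layers
    }
    return {name: 100.0 * accepted[mask].mean() if mask.any() else np.nan
            for name, mask in subsets.items()}
```

The same masks can be passed to a statistics routine to produce the separate global, cloudy, and cloud-free RMS curves.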

  7. “CrIMSS EDRs Are Resilient to Dust”: Heritage Algorithm vs. CrIMSS EDRs (Murty et al., AMS-2013)
  • Aqua-AIRS dust flag, 05/15/2012 (dust flag algorithm: S. DeSouza).
  • Further verification of “CrIMSS EDRs are resilient to dust”:
    • Discussion of the CrIMSS EDRs’ resilience to dust in Murty et al., AMS-2013, and the 11/05/2012 telecon (Murty).
    • Can be tested with the CrIS/AIRS matched data set from the 3 focus days.
    • Filter out artifacts using the VIIRS dust flag (STAR Aerosol group).
    • Trying to get digital data to filter artifacts of the dust flag in the AIRS retrieval.
    • Getting GOCART model forecasts (Sarah Liu) for verification.
  • Figure courtesy: Shobha Kondragunta, Pubu Ciren, STAR; VIIRS daytime dust flag, 05/15/2012.

  8. Updates and Improvements to CrIMSS EDR Characterization and Validation with Global RAOB Collocations for Focus Days
  • NOAA STAR in-house capability with reprocessing options (MGD); used for Aqua-AIRS and MetOp-IASI, leading to many publications (Murty, JGR, AIRS validation, 2006; Murty, AIRS/IASI comparisons, HISE, OSA, 2011).
  • Initiating the process for CrIMSS EDRs on a routine basis.
  • Global RAOB collocations from PrepBUFR files for all 3 focus days (05/15, 09/20, and 02/03).
    • Instrument (CrIS) to truth (e.g., RAOB and other correlative data set) matches.
    • Cross-platform instrument (e.g., CrIS to AIRS) to other correlative matches.
  • CrIMSS SDR QF PGE and NG sounder-RAOB matchup PGE; RAOB PGEs on the NSIPS web portal (Denise Hagan’s presentation on March 12).
  • NPROVS.
  • Rigorous quality checks: William G. Collins (Mesoscale Modeling Branch), “Quality Control of Significant Level Rawinsonde Temperatures and Pressures,” http://www.ncep.noaa.gov/officenotes/NOAA-NPM-NCEPON-0005/01408986.pdf
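The instrument-to-truth matchups above reduce to finding sounder footprints within a distance and time window of each RAOB launch. A minimal collocation sketch; the 150 km / 3 h windows are illustrative choices, not the matchup PGE's actual criteria:

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in km; inputs in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi = p2 - p1
    dlmb = np.radians(lon2 - lon1)
    a = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def collocate(sat_lat, sat_lon, sat_time, raob_lat, raob_lon, raob_time,
              max_km=150.0, max_hours=3.0):
    """Indices of sounder footprints matching one RAOB launch.

    sat_* : (n,) footprint latitudes/longitudes [deg] and times [h]
    raob_*: scalar RAOB latitude/longitude [deg] and launch time [h]
    Times are hours since a common epoch.
    """
    dist = great_circle_km(sat_lat, sat_lon, raob_lat, raob_lon)
    dt = np.abs(sat_time - raob_time)
    return np.where((dist <= max_km) & (dt <= max_hours))[0]
```

Looping this over the PrepBUFR RAOB records for a focus day yields the global collocation files described on this slide.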

  9. Aerospace RAOB (Kauai, Hawaii; 22.05N, 159.78W): RAOB, ECMWF, and CrIMSS Versions (MX5.3, MX6.3, MX7.0) (Excerpt from Murty’s AMS-2013 Presentation)
  • We are currently evaluating CrIMSS EDRs against matched RAOBs. Preliminary evaluation reveals that the RAOBs are consistently warm with respect to matched ECMWF. Discussions are ongoing with the Aerospace collaborators on some of the RAOB calibration issues. Attempts are also in progress to obtain PREPBUFR RAOB data sets from the surrounding islands.
  • Data: Aerospace Corporation, through Nick Nalli.
  [Figure legend: black = RAOB; red = 2nd-stage IR+MW; green = 1st-stage MW; cyan = ECMWF]

  10. Research to Operations through the DR Process
  • How did we achieve Provisional Maturity?
    • A series of CrIMSS algorithm telecons among STAR, NGAS, and LaRC from the ‘pre-launch proxy’ era to the present time; contact Murty for algorithm telecon presentations from 2010 onwards.
    • Investigations and suggested fixes by NGAS, LaRC, STAR, and other cal/val group members led to the CrIMSS MX7.1 version.
    • Synergistic use of the off-line EDR code and ADL 4.1 helped ensure that the off-line EDR algorithm emulates the exact IDPS/ADL 4.1 versions.
    • The STAR ADL group submitted CCRs for the DPE.
  • Coordinated efforts of the CrIMSS algorithm evaluation team from 2010 onwards (see AMS-2013 for many oral and poster presentations): Murty G Divakarla, M. Wilson, X. Xiong, C. Tan, and C. D. Barnet, STAR; Bigyani Das and Weizhong Chen, STAR AIT group; D. Gu, X. Ma, and D. Hagan, NGAS; X. Liu and S. Kizer, LaRC; Wael Ibrahim, Raytheon.
  • ADL interface: Mike Wilson/Murty with Bigyani/Weizhong.

  11. Updates and Improvements to CrIMSS EDR Characterization and Validation
  • New focus days (02/03/2013, 03/12/2013) and matched correlative data sets.
    • Direct evaluation of IDPS MX6.3 with off-line emulations was planned.
    • Support to IDPS operations: 03/12/2013 is another focus day, with CrIS high-resolution SDRs.
    • Global RAOB collocations from PrepBUFR files for all 4 focus days, plus PMRS RAOBs.
  • Tuning improvements using matched Aqua-AIRS clear cases for all three focus days (05/15 and 09/20 of 2012; 02/03/2013) (Changyi et al., AMS-2013; Murty et al., AMS-2013).
  • CrIMSS clear-flag improvement to allow proper binning.
  • CrIMSS precipitation-flag improvement for proper filtering of precipitation events (Ralph Ferraro and Wenze Yang: code and ADL testing).
  • FG-PC regression for CrIMSS to test the impact on the CrIMSS EDR algorithm; work initiated.
  • Dust flag for CrIMSS SDR/EDR products; discussion of the CrIMSS EDRs’ resilience to dust in Murty’s presentation (cal/val telecon, 11/05/2012, and Murty et al., AMS-2013).
  • CrIMSS EDR algorithm: interface with the AIT team for ADL emulation and DR generation.
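The FG-PC regression item above refers to generating a first guess by regressing the geophysical state against principal-component scores of the radiances. A hedged sketch of the general technique, with synthetic dimensions and names; this is not the initiated CrIMSS implementation, only an illustration of the idea:

```python
import numpy as np

def train_pc_regression(radiances, states, n_pc=20):
    """Fit a first-guess PC regression.

    radiances : (n_samples, n_channels) training radiances
    states    : (n_samples, n_state) matching geophysical states
    Projects mean-removed radiances onto the leading n_pc eigenvectors,
    then solves a linear least-squares fit of state vs. PC scores.
    """
    r_mean = radiances.mean(axis=0)
    _, _, vt = np.linalg.svd(radiances - r_mean, full_matrices=False)
    evecs = vt[:n_pc].T                           # (n_channels, n_pc)
    scores = (radiances - r_mean) @ evecs         # PC scores
    design = np.hstack([scores, np.ones((scores.shape[0], 1))])  # + intercept
    coefs, *_ = np.linalg.lstsq(design, states, rcond=None)
    return r_mean, evecs, coefs

def apply_pc_regression(radiance, r_mean, evecs, coefs):
    """First-guess state for one observed radiance vector."""
    score = (radiance - r_mean) @ evecs
    return np.hstack([score, 1.0]) @ coefs
```

Truncating to the leading PCs suppresses instrument noise in the regression, which is one motivation for a PC-based rather than channel-based first guess.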

  12. Additional Outputs / FTP Site
  • Additional parameters to output (Sung-Yung Lee and Evan Fishbein):
    • Cloud liquid water content
    • MW surface emissivity after the MW-only retrieval
    • Bidirectional IR surface reflectance (for reflected solar)
    • Topography (available in the SDRs but needs to be propagated to the EDRs)
    • And a few more …
  • STAR data on the FTP site: IDPS 5.3 (past), IDPS 6.3 (present), and IDPS 7.0 (future) for 05/15/2012 and 09/20/2012 (global evaluation), plus matched correlative data sets.
  • Data sets on the FTP site: CrIMSS Data Bank (Changyi.Tan@noaa.gov, Murty.Divakarla@noaa.gov, Xiaozhen.Xiong@noaa.gov, and Michael.Wilson@noaa.gov).

  13. Backup Slides

  14. Data and Pathways for CrIMSS Provisional Maturity (Excerpt from Murty’s AMS-2013 Presentation)

  15. Notes on the next slides:
  • DR numbers do not exist yet for most of these items.
  • All “files affected” and “LUTs affected” categories are a first guess at the work required; changes may ripple into other subroutines.
