
Quality Control for RADAR reflectivity




  1. Quality Control for RADAR reflectivity Short report by Marián Jurašek

  2. Challenges • The standard screening decision procedure rejects too many observations and destroys the reflectivity profiles • Where should the decision to reject reflectivity be made?

  3. Too strong rejection • A first possible solution is to open (relax) the screening rules for reflectivity • too dangerous on its own • A more selective approach is needed • the reflectivity profile consistency check

  4. Reflectivity profile consistency check • Algorithm: • if a single reflectivity fails the classic rule, check the values higher up in the reflectivity profile • if we find a reflectivity that passes the standard rules, is no farther than Zconstant2 from the original value, and all reflectivities in the profile between the two points are greater than 0 and no farther than Zconstant2 from the original value, we open the screening rule by a factor Zconstant1 • if the checked reflectivity passes the opened rule, it is considered good; otherwise it is definitely rejected
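The steps above can be sketched as follows. This is an illustrative Python sketch only: the function names, the form of the standard rule (a simple threshold stand-in), and the numeric values of Zconstant1 and Zconstant2 are assumptions, not the operational code or tuned limits (slide 5 notes the limits are still to be investigated).

```python
Z_CONST1 = 1.5   # relaxation factor for the screening rule (assumed value)
Z_CONST2 = 10.0  # max allowed departure within the profile, dBZ (assumed value)

def passes_standard_rule(z, threshold=5.0):
    """Stand-in for the standard screening rule (assumed form)."""
    return abs(z) <= threshold

def passes_relaxed_rule(z, threshold=5.0):
    """Standard rule opened by the factor Z_CONST1."""
    return abs(z) <= threshold * Z_CONST1

def consistency_check(profile, i):
    """Decide whether reflectivity profile[i] can be kept.
    Returns True (keep) or False (definitely reject)."""
    z0 = profile[i]
    if passes_standard_rule(z0):
        return True
    # The value failed the classic rule: look higher up in the profile
    # for a supporting reflectivity.
    for j in range(i + 1, len(profile)):
        zj = profile[j]
        if passes_standard_rule(zj) and abs(zj - z0) <= Z_CONST2:
            # All values between the two points must be positive and
            # no farther than Z_CONST2 from the original value.
            between = profile[i + 1:j]
            if all(z > 0 and abs(z - z0) <= Z_CONST2 for z in between):
                # Supporting value found: apply the opened rule.
                return passes_relaxed_rule(z0)
    return False
```

A value that narrowly fails the standard rule is rescued when a nearby, consistent value above it passes; an isolated outlier is still rejected.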

  5. Reflectivity profile consistency check (2) • Remark: this check does not affect the rejection rule based on the dynamic quality flag provided by the radar community. • Results: after applying this new check, around 5-10% more observations pass the screening decision. • Remark: all limits and constants should be the object of investigation once real assimilation is ready, considering the impact on the analysis.

  6. Where to put the rejection? • We need two screenings in one: • one for the reflectivity itself • one for the q, T profiles that enter the minimization • Bad reflectivity should not enter the Bayesian inversion (the creation of the q, T profiles)

  7. Where to put the rejection? (2) • Solution (proposal): • move REFLSIM from HOP to HRETR, before the call to the Bayesian inversion, so that HOP keeps only the q and T part • move all reflectivity checks from FGCHK and DECIS to a new subroutine REFLCHECK, called after REFLSIM and the Bayesian inversion, leaving FGCHK and DECIS for q and T only
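The proposed two-stage flow can be sketched as below. The routine names (REFLSIM, REFLCHECK, HOP, FGCHK, DECIS) come from the slides, but every Python body here is a placeholder, and the placement of the reflectivity check before the inversion is my reading of the requirement on slide 6 that bad reflectivity must not enter the Bayesian inversion.

```python
def reflsim(z_obs):
    """Stand-in for REFLSIM: simulate reflectivity from the model state
    (proposed to be called from HRETR rather than HOP)."""
    return [5.0 for _ in z_obs]  # dummy first-guess equivalent

def reflcheck(z_obs, z_sim, limit=10.0):
    """Stand-in for the new REFLCHECK subroutine gathering all
    reflectivity checks (moved out of FGCHK and DECIS)."""
    return all(abs(o - s) <= limit for o, s in zip(z_obs, z_sim))

def bayesian_inversion(z_obs):
    """Stand-in for the Bayesian inversion creating the q, T profiles."""
    q = [0.001 * z for z in z_obs]
    t = [250.0 + z for z in z_obs]
    return q, t

def fgchk_decis_qt(q, t):
    """Stand-in for FGCHK + DECIS, now screening only q and T."""
    return all(qi >= 0.0 for qi in q) and all(150.0 < ti < 330.0 for ti in t)

def screen_radar_report(z_obs):
    # Stage 1: screen the reflectivity itself.
    z_sim = reflsim(z_obs)
    if not reflcheck(z_obs, z_sim):
        return None  # bad reflectivity never reaches the inversion
    # The inversion runs only on accepted reflectivity.
    q, t = bayesian_inversion(z_obs)
    # Stage 2: screen the retrieved q, T profiles entering minimization.
    if fgchk_decis_qt(q, t):
        return q, t
    return None
```

The point of the split is that each stage can reject independently: reflectivity failures stop the report before the costly inversion, while q, T failures only discard the retrieved profiles.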

  8. Other results • First successful attempt to use more radars in the screening: • 4 radars: Arcis, Bordeaux, Falaise and Trappes • “Nice” outcomes: • a bug was found in BATOR => problem solved • memory limits were reached for this amount of data => thinning should be applied before writing the data to the ODB (11 radars are available online on cougar, with more elevations)
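As a minimal illustration of the thinning mentioned above: keeping every n-th value along a ray before writing to the ODB. The slides do not specify the thinning scheme, so this fixed-step subsampling is purely an assumption for illustration.

```python
def thin_ray(values, step=4):
    """Keep every `step`-th reflectivity value along a ray,
    reducing the data volume written to the ODB by roughly 1/step."""
    return values[::step]
```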
