
Consortium Meeting February 4, 2010


Presentation Transcript


  1. Consortium Meeting February 4, 2010

  2. Who is using our model output?

  3. Hits Canadians Utah

  4. File Transfers Utah!!!

  5. New Hardware • We have acquired 8 nodes of the new Intel Nehalem processors (last week): 64 processors in total. They have the memory bandwidth that will allow much more effective use of resources. • These will be used to make high-resolution (4-km) local data assimilation a reality for us, making use of as many of our local observations as possible. • Exhaust enclosure for the last cluster: hot air all heading outside. • Replaced an old (failing) RAID disk storage array. • System has been very, very stable.

  6. Major Changes and Improvements • Beginning with the 2009123100 run, the WRF-GFS has switched to the latest version (3.1.1). • Bug fixes in several of the parameterizations we use. • The Thompson microphysics scheme is now double-moment in rain as well as ice (number concentration calculated as well as mixing ratio). This may help with a major issue: our light rain starts too late. • Many more PBL options. • Gravity wave drag parameterization. • More stable.
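The talk does not show the model configuration itself, but the upgrades above correspond to WRF namelist switches. The sketch below (Python, writing a &physics fragment) is illustrative only: it assumes WRF 3.1-era option codes (Thompson microphysics = 8, YSU PBL = 1, gravity wave drag via gwd_opt), not the group's actual settings.

    # Illustrative only: the talk does not show the real namelist, and the option
    # values below are assumptions based on WRF 3.1-era documentation.
    physics = {
        "mp_physics":        8,  # Thompson microphysics (double-moment rain and ice in 3.1.1)
        "bl_pbl_physics":    1,  # YSU PBL; other codes exist for the newer PBL options
        "sf_sfclay_physics": 1,  # surface-layer scheme usually paired with YSU
        "gwd_opt":           1,  # gravity-wave-drag parameterization
    }

    def write_physics_fragment(path="physics_fragment.nml"):
        """Write the options above as a Fortran &physics namelist fragment."""
        with open(path, "w") as f:
            f.write("&physics\n")
            for key, value in physics.items():
                f.write(f" {key:<18} = {value},\n")
            f.write("/\n")

    if __name__ == "__main__":
        write_physics_fragment()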

  7. But is it better?

  8. High-Resolution 1.33-km Nest • Western WA only • Once a day (0000 UTC cycle) to 36 h • Uses the gap period after we finish all the real-time work on SAGE. • An attempt to answer several questions: • What is the payoff in getting the land-water boundaries and smaller-scale terrain much better (the 4-km grid hardly has Puget Sound!)? • Does ultra-high resolution improve objective verification or subjective structures? • Can it provide a better feed to the NWS for use in their GFE system (which is going to 2.5-km spacing)?
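A quick arithmetic check of the nest spacing (the domain setup itself is not shown in the talk; the 3:1 ratio is an assumption, the standard odd WRF nesting ratio):

    # Back-of-the-envelope nest spacing; parent_grid_ratio = 3 is assumed.
    parent_dx_m = 4000                       # 4-km outer domain
    parent_grid_ratio = 3                    # assumed 3:1 WRF nest
    nest_dx_m = parent_dx_m / parent_grid_ratio
    print(f"nest spacing: {nest_dx_m:.0f} m (~{nest_dx_m / 1000:.2f} km)")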

  9. [Figure: 4-km vs. 1.3-km comparison]

  10. [Figure: 6-hr forecast of 10-m wind speed and direction, 4 km vs. 1.3 km]

  11. Precip Verification

  12. Wind Direction

  13. Future Evaluation • Improving PBL and surface drag may preferentially help 1.3 km (more later) • Using 1.3 km as a testbed (some problems are more acute at higher resolution)

  14. A Major Issue Has Been Excessive Wind Speeds Over Land and Excessive Geostrophy at the Surface • Either too much mixing in the vertical or not enough drag; winds over land and water are too similar. • No magic bullet in the PBL tests. • Recently, we tried something that really looks like it has potential to help: increasing the friction velocity, u*. • It essentially adds drag, without messing other things up. • Perhaps it is realistic, mimicking the effects of hills.
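Why boosting u* acts like added drag: surface stress scales with the square of the friction velocity, so even a modest enhancement is a sizable drag increase. A minimal sketch, with a purely hypothetical enhancement factor (the talk does not quote one):

    # Surface stress scales as rho * u*^2, so enhancing u* increases drag
    # super-linearly. The enhancement factor here is hypothetical.
    rho = 1.2            # near-surface air density, kg m^-3
    ustar = 0.3          # friction velocity from the surface-layer scheme, m s^-1
    enhancement = 1.5    # hypothetical u* multiplier mimicking unresolved hills

    tau_default = rho * ustar**2
    tau_boosted = rho * (enhancement * ustar)**2
    print(f"default stress: {tau_default:.3f} N m^-2")
    print(f"boosted stress: {tau_boosted:.3f} N m^-2 ({enhancement**2:.2f}x)")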

  15. During the past few months we have continued our testing program of various PBL schemes, vertical diffusion options, etc. • One test case has been a situation in which the 4-km and 1.3-km runs created unrealistic roll circulations.

  16. http://www.atmos.washington.edu/~ovens/wrf_1.33km_striations/

  17. 1 km visible

  18. Problem • Instead of getting open cellular convection, there are these periodic cloud streets. • They look like roll circulations, but of too large a scale (if you look at satellite pictures you can see hints of them). • Sometimes apparent (but less so) in the 4-km. • Occurs only in unstable, post-frontal conditions.

  19. Threw the kitchen sink at it and consulted heavily with Dave Stauffer at Penn State • Tried a range of PBL schemes (YSU, QNSE, ACM2, MYNN, MYJ, MYJ with Stauffer mods) • Added 6th-order diffusion and played with the diffusion coefficient • Fully interactive nesting • Upper-level diffusion and gravity wave drag • Monotonic advection • Varying vertical diffusion, both more and less
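These experiments amount to a sweep over WRF namelist options. A minimal sketch of such a sweep, assuming WRF 3.1-era PBL option codes (YSU=1, MYJ=2, QNSE=4, MYNN=5, ACM2=7) and the diff_6th_opt / diff_6th_factor switches; the diffusion factors below are illustrative, and the Stauffer-modified MYJ is a local code change that no namelist setting can express.

    # Sketch of the sensitivity sweep; option codes follow WRF 3.1-era documentation
    # and the diffusion factors are illustrative, not the values actually tested.
    PBL_SCHEMES = {"YSU": 1, "MYJ": 2, "QNSE": 4, "MYNN": 5, "ACM2": 7}
    DIFF6_FACTORS = [0.06, 0.12, 0.24]   # 6th-order diffusion coefficient

    def build_experiments():
        experiments = []
        for name, code in PBL_SCHEMES.items():
            for factor in DIFF6_FACTORS:
                experiments.append({
                    "label": f"{name}_d6_{factor}",
                    "physics": {"bl_pbl_physics": code},
                    "dynamics": {"diff_6th_opt": 2,          # positive-definite 6th-order diffusion
                                 "diff_6th_factor": factor},
                })
        return experiments

    if __name__ == "__main__":
        for exp in build_experiments():
            print(exp["label"], exp["physics"], exp["dynamics"])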

  20. Results • ACM2 (Pleim PBL and LSM) was the only thing that helped reduce the rolls. • It created a stratiform cloud mass that wasn't very realistic. • We asked NCAR to help; nothing yet. • Did any of these tests suggest fixes for some of our recurring problems?

  21. Example: Vertical diffusion cut to 1/8th of the normal value

  22. Vertical Diffusion cut to 1/4

  23. Standard vs. Low Diffusion

  24. Ensemble Kalman Filter • During the fall, we tested the EnKF data assimilation system at 4-km resolution with a three-hour update cycle. • Results are promising, but it became clear we lacked the computational power to do this consistently and dependably (the loss of a single node put us behind). • We also lacked the resources to go to a one-hour cycle, which people want.
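A minimal sketch of the 3-hourly cycle structure described above; the helper functions are placeholder stubs for illustration, not the group's scripts or the DART interface.

    # Hypothetical 3-hourly EnKF cycling driver; every helper below is a stub.
    from datetime import datetime, timedelta

    CYCLE = timedelta(hours=3)   # assimilation interval discussed in the talk
    N_MEMBERS = 40               # ensemble size assumed purely for illustration

    def run_ensemble_forecast(t0, t1, members):
        """Stub: advance each WRF member from t0 to t1, return the prior ensemble."""
        return [f"member_{m:02d}_prior" for m in range(members)]

    def assimilate(prior, valid_time):
        """Stub: EnKF update of the prior with observations valid at valid_time."""
        return [state.replace("prior", "posterior") for state in prior]

    def archive_analysis(posterior, valid_time):
        """Stub: archive the 4-km analysis ensemble (or its mean)."""
        print(f"{valid_time:%Y-%m-%d %H UTC}: archived {len(posterior)} members")

    def run_cycles(start, n_cycles):
        t = start
        for _ in range(n_cycles):
            prior = run_ensemble_forecast(t, t + CYCLE, N_MEMBERS)
            posterior = assimilate(prior, t + CYCLE)
            archive_analysis(posterior, t + CYCLE)
            t += CYCLE

    if __name__ == "__main__":
        run_cycles(datetime(2010, 2, 4, 0), n_cycles=8)   # one day of 3-hourly cycling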

  25. Some Initial Results • Bottom Line: We can create a better analysis • But it doesn’t hold long…we relax back to the current 4-km skill within a few hours. • We may be able to do better than this…more later…but there is a real limitation without having much upstream mesoscale data. • The coastal radar might alter this situation.

  26. Plans During the Month • With the new hardware, go to a dependable 3-hr assimilation cycle at 4 km. • Switch to the NCAR DART system rather than our homegrown EnKF infrastructure. • DART has many options that we don't have that may improve the assimilation (vertical localization, assimilation of satellite radiances and radar data). • Experiment with 1-hr updates.
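For reference, vertical localization tapers the ensemble covariance toward zero as an observation gets farther (in the vertical) from the model point being updated. A common taper of the kind DART supports is the Gaspari-Cohn fifth-order function; the sketch below is illustrative, with an assumed half-width, and is not a copy of DART's implementation.

    # Gaspari-Cohn (1999) fifth-order localization taper; half-width is illustrative.
    def gaspari_cohn(dist, half_width):
        """Taper factor in [0, 1]; exactly zero beyond twice the half-width."""
        r = abs(dist) / half_width
        if r <= 1.0:
            return -0.25*r**5 + 0.5*r**4 + 0.625*r**3 - (5.0/3.0)*r**2 + 1.0
        if r <= 2.0:
            return (r**5/12.0 - 0.5*r**4 + 0.625*r**3 + (5.0/3.0)*r**2
                    - 5.0*r + 4.0 - 2.0/(3.0*r))
        return 0.0

    if __name__ == "__main__":
        half_width_m = 3000.0   # assumed vertical half-width in metres
        for dz in (0.0, 1500.0, 3000.0, 4500.0, 6000.0, 7500.0):
            print(f"dz = {dz:6.0f} m -> localization factor {gaspari_cohn(dz, half_width_m):.3f}")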

  27. EnKF • The target is to have a dependable operational system by June. • High-quality 3-D analyses at 4 km will be available and archived. • Short-term forecasts available for other uses.
