GFE Efforts in TAFB
NWS Collaboration and Gridded Services Meeting, October 2009
Christopher Juckins, Meteorologist/Programmer
Technical Support Branch, NHC
Outline • Importance of gridded marine products • Current Configuration (technical discussion) • Lessons Learned/Best Practices • Some Questions and Unknowns • Other Discussion
Marine Weather Portal forecast.weather.gov/mwp
TAFB Configuration • Domains set for 12.5 km resolution • Atlantic (nhcw) – 169,264 points • Pacific (nhcr) – 327,132 points • Background shapefiles updated with Ira Graffman and Shannon White • Developed a worksheet to help calculate the desired grid resolution and lat/lon anchor points
TAFB Domains • Wrote a document describing the process: • Determine the number of grid points based on the desired resolution (see the sketch below) • Configure the map projection for GFE display
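The grid-point part of that worksheet boils down to a short calculation. A minimal sketch, assuming a Mercator domain defined by hypothetical corner points and a 12.5 km target resolution; the corner coordinates, standard latitude, and rounding convention shown are assumptions, not the actual TAFB worksheet values.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def mercator_grid_dims(lat_s, lat_n, lon_w, lon_e, res_km, std_lat=20.0):
    """Estimate GFE grid dimensions for a Mercator domain at a desired
    resolution (res_km is taken as true at the standard latitude)."""
    # East-west extent measured along the standard latitude
    width_km = (math.radians(lon_e - lon_w)
                * EARTH_RADIUS_KM * math.cos(math.radians(std_lat)))

    # North-south extent in Mercator y, scaled to be true at std_lat
    def merc_y(lat):
        return math.log(math.tan(math.pi / 4 + math.radians(lat) / 2))

    height_km = ((merc_y(lat_n) - merc_y(lat_s))
                 * EARTH_RADIUS_KM * math.cos(math.radians(std_lat)))

    nx = int(round(width_km / res_km)) + 1
    ny = int(round(height_km / res_km)) + 1
    return nx, ny

# Hypothetical Atlantic-style domain -- not the actual TAFB anchor points
nx, ny = mercator_grid_dims(0.0, 32.0, -100.0, -35.0, 12.5)
print(nx, ny, nx * ny)
```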
TAFB Domains In the process, created a World Mercator Scale for D2D
TAFB Domains • Needed to tweak the domain a few times due to legacy GEMPAK graphics boundaries • Intersite Coordination (ISC) with WFOs was a significant challenge • Shannon White assisted with edits to localConfig.py at both NHC and WFO Miami to test the exchange of grids (see the sketch below) • Still a few shortcomings with ISC that need to be addressed with GSD (update default configs, bounding box lat/lon calculations) • ISC will be important before the final step where grids are sent to NDFD
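The localConfig.py edits are essentially ISC switches layered over the serverConfig defaults. A minimal sketch only, assuming the standard serverConfig ISC variables; the exact variables changed at NHC and WFO Miami, and the site/parm lists shown, are assumptions.

```python
# localConfig.py (sketch) -- site overrides layered on serverConfig defaults
import serverConfig

# Turn on ISC so grids are requested from and sent to neighboring sites
serverConfig.REQUEST_ISC = 1
serverConfig.SEND_ISC_ON_SAVE = 1
serverConfig.SEND_ISC_ON_PUBLISH = 1

# Hypothetical lists -- the actual test exchange was between NHC and MFL
serverConfig.REQUESTED_ISC_SITES = ["MFL"]
serverConfig.REQUESTED_ISC_PARMS = ["Wind", "WaveHeight", "Hazards"]
```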
Model Data Ingest • Current-generation AWIPS has challenges providing high-resolution global model datasets • TAFB and OPC AORs extend well beyond the AWIPS/SBN 40km GFS boundaries; a smartTool can stitch multiple model sections together • NHC and OPC are ingesting the 0.5 degree (~55km) global GFS dataset (3-hourly, out to 180 hrs) via FTP (download sketch below) • NCO has offered a custom 35km GFS dataset, but AWIPS1 data ingest is not easy to configure • A workaround was established for the AWIPS1 grib2 wind decoder (Jay Smith, Fairbanks)
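The FTP pull can be as simple as a scripted loop over forecast hours. A sketch only: the hostname, directory layout, and file naming below are placeholders, not the actual NCEP server paths used operationally.

```python
"""Fetch 0.5-degree GFS GRIB2 files (3-hourly, out to 180 h) for local
ingest.  Host, directory, and file names are placeholders."""
import ftplib

HOST = "ftp.example.noaa.gov"                 # placeholder server name
REMOTE_DIR = "/pub/gfs/prod/gfs.2009102600"   # placeholder cycle directory
FHOURS = range(0, 181, 3)                     # forecast hours 0-180 every 3 h

ftp = ftplib.FTP(HOST)
ftp.login()                                   # anonymous login
ftp.cwd(REMOTE_DIR)
for fh in FHOURS:
    fname = "gfs.t00z.pgrb2f%02d" % fh        # placeholder file naming
    with open(fname, "wb") as out:
        ftp.retrbinary("RETR " + fname, out.write)
ftp.quit()
```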
Send GFE Grids to NAWIPS • Local app converts and sends GFE grids to NAWIPS
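A sketch of what such a local application might look like, assuming the standard ifpnetCDF export utility; the flags, database ID, parm list, and NAWIPS destination shown are placeholders, and the NAWIPS-side conversion is assumed to be handled by a separate script.

```python
"""Export selected GFE grids to netCDF and push the file to NAWIPS.
Paths, hostnames, and the database ID are placeholders."""
import subprocess

OUTFILE = "/tmp/tafb_grids.nc"
PARMS = ["Wind", "WaveHeight"]               # hypothetical element list
DB_ID = "NHC_GRID__Fcst_00000000_0000"       # placeholder Fcst database ID
NAWIPS_DEST = "nawips-host:/data/gfe/"       # placeholder destination

cmd = ["ifpnetCDF", "-o", OUTFILE, "-h", "localhost", "-d", DB_ID]
for parm in PARMS:
    cmd += ["-p", parm]
subprocess.check_call(cmd)                               # export the grids
subprocess.check_call(["scp", OUTFILE, NAWIPS_DEST])     # hand off to NAWIPS
```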
Many Possibilities Exist • Hazard Grid for Winds (colors) and Seas (hatches) • Could be a briefing tool, web page display, etc.
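One way to build a combined wind/seas hazard grid is a numeric smart tool. A minimal sketch, assuming standard GFE smart tool conventions and the marine elements Wind and WaveHeight; the thresholds and output category values are illustrative only.

```python
# HazardCombo.py -- numeric smart tool sketch (thresholds are illustrative)
ToolType = "numeric"
WeatherElementEdited = "variableElement"

from numpy import where, zeros_like
import SmartScript

class Tool(SmartScript.SmartScript):
    def __init__(self, dbss):
        SmartScript.SmartScript.__init__(self, dbss)

    def execute(self, Wind, WaveHeight):
        """Flag points with gale-force winds and/or high seas."""
        mag, direction = Wind                  # GFE vector element: (speed, dir)
        combo = zeros_like(mag)
        combo = where(mag >= 34.0, 1.0, combo)                  # wind hazard (colors)
        combo = where(WaveHeight >= 12.0, combo + 2.0, combo)   # seas hazard (hatches)
        return combo
```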
Best Practices • Need a forecaster to act as GFE focal point (text formatters, GFE display program) • Technical staff should explain suggested programming tips (making backup copies, commenting code, etc.) • Give user “ifps” its own password (no need for root) • Provide a customized bash shell to make life easier (see the sketch below) • Demonstrate the correct way to stop/start the IFPS server • Utilize practice mode when testing/making changes • Keep python reference books on site • Join the ifps mailing list • Develop and maintain a training plan
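A small example of the kind of shell customization meant above. A sketch only, assuming a bash login for the ifps user; the directory paths are placeholders for whatever is baselined on the local system.

```bash
# ~/.bashrc additions for the ifps user (paths are placeholders)

# Prompt shows user@host so forecasters always know which box they are on
export PS1='[\u@\h \W]\$ '

# Shortcuts to commonly edited GFE configuration areas (placeholder paths)
alias cdconfig='cd /awips/GFESuite/primary/etc/SITE'
alias cdtools='cd /awips/GFESuite/primary/etc/SITE/gfe/userPython'

# Timestamped backup before editing a file, e.g. "bak localConfig.py"
bak() {
    cp -p "$1" "$1.$(date +%Y%m%d_%H%M)"
}
```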
Some Questions • How do we produce GFE grids during scheduled AWIPS upgrades? Service Backup versus an on-site spare server with a custom GFE installation? • Explore 64-bit for better performance? • Red Hat 5.2 allows Physical Address Extension (PAE) of up to 16GB of RAM. Is this a possible solution with AWIPS 2? • Cannot add lat/lon lines with progressive disclosure in the current baseline GFE display. IC4D solution? • When do we engage NDFD folks for transmitting grids, even experimentally?
Comments or Questions?
Christopher Juckins
Technical Support Branch Meteorologist
NOAA’s National Weather Service
National Hurricane Center
Miami, Florida