
CSTAR Project Presentation to the NWS



Presentation Transcript


  1. CSTAR Project Presentation to the NWS A real-time Coupled Wave/Atmospheric Regional Forecast and analysis System: CWARFS Steven Lazarus (PI), Sen Chiao (co-I), Gary Zarrillo (co-I), Michael Splitt, Kate Howard, Natalie Lamberton Florida Institute of Technology December 17, 2007

  2. Proposal Status • YEAR 1 TASKS: • Task 1: Generate regional wave and atmospheric model grids for the Melbourne WFO. • Task 2: Develop scripts to access and process real-time data streams (WWIII, NAM12, QuickSCAT, WSR-88D). • Task 3: Begin work to port the SMS PC-based interface to Linux. • Task 4: Begin ADAS assimilation work with WSR-88D and QuickSCAT winds.

  3. 31 May 2007 Meeting Summary • The approximate size of the wave model domain would be 15 km offshore and 30 km along shore. • Get Linux specs from machines that are being used in NWS offices. • Ideally, the test domain would be a hybrid of the originally proposed domains. As such it will cover a portion of both the NWS Miami and Melbourne offices' GFE coastal forecast zones (from north of Titusville south to the Dade/Broward Co. line). • The test domain would consist of two nests instead of three. The outer nest would be on the order of 1 km (D2 from the proposal). The high-res grid (D3) would be postponed until later in the project and might be a “floater”.

  4. PC Purchase & Configuration • Dell Precision 690 Mini-Tower, Quad-Core Xeon Processor X5355, 2.66 GHz, 2 x 4 MB L2 cache, 1333 MHz. • 8 GB DDR2 ECC SDRAM memory, 667 MHz, 4 x 2 GB in riser, Dell Precision 690, factory install (311-6351). • 160 GB Serial ATA, 10K RPM hard drive with 16 MB DataBurst cache, Precision 380 (341-3682). • No monitor, keyboard, or speakers were purchased. The PC purchase was delayed until the end of July in anticipation of a price reduction on the desired quad-core systems. • The Linux box was initially configured in mid-August and fully configured in early September by the FIT Information Technology staff, with the following software/compilers installed: • MPICH • PGI • NetCDF • GRADS • HDF

  5. Task 1: Generate wave/atmospheric model grids. Atmospheric Model (WRF-EMS): WRF-EMS benchmark simulation – a 24-hour simulation of an oceanic cyclogenesis event that occurred over the Northeast US (centered over Boston, MA) during January 2005. The ARW core benchmark was designed to run over a domain consisting of 5005 horizontal grid points (ARW: 77 x 65) and 45 vertical levels at 15 km grid spacing. The run time using all 8 nodes on the FIT Linux box was approximately 6 minutes faster than the best published time at the WRF-EMS site (Table 1).

  6. Benchmarks: Evaluation of the WRF-EMS ARW Core. All WRF-EMS forecasts were initialized with 12 km North American Model (NAM) tile output (http://www.emc.ncep.noaa.gov/mmb/namgrids/), run out to 3 h, and the timings then projected forward to 48 and 72 h, respectively. A reasonable target, on the order of 3 h of wall-clock time for a 48-hour WRF-EMS simulation, was achieved for a double-nest simulation with a 4.5 km outer grid and a 1.5 km inner grid.

  7. Evaluate a single wave sub-domain. [Figure: WRF nested grid with the candidate wave sub-domain and nearby buoy stations 41010, 41009, 41113, Sebastian, and 41114.]

  8. Task 2: Script Development/Real-time data streams • Extension of the high-resolution wave domain to ~25 km offshore is consistent with the NOAA WaveWatch III output, which has been available near the coast in our region at 4-minute resolution (~7 km) since September 2007. • Output from the WWIII model will be used to drive the nearshore wave model. Testing of the single wave subdomain should begin shortly. Bottom topography data for the wave subdomains have been retrieved from the National Geophysical Data Center (NGDC) (http://www.ngdc.noaa.gov/mgg/gdas/gd_designagrid.html). [Figure: WaveWatch III significant wave height (ft) for 00 UTC 30 October 2007; dots indicate WWIII resolution (~7 km).]
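The following is a minimal sketch of how the WWIII output could be subset to the nearshore domain once a file is on disk. The file name, the subdomain bounds, and the GRIB parameter name are illustrative assumptions, not part of the delivered scripts.

```python
# Hedged sketch: subset WAVEWATCH III significant wave height to the nearshore
# test subdomain from a previously downloaded GRIB file.  The file name, the
# subdomain bounds, and the GRIB parameter name are assumptions.
import pygrib

WW3_FILE = "ww3_latest.grb"          # assumed local copy of the WWIII output
LAT1, LAT2 = 26.0, 29.0              # approximate nearshore subdomain bounds
LON1, LON2 = -81.0, -79.0            # check whether the file uses 0-360 longitudes

grbs = pygrib.open(WW3_FILE)
# The exact parameter name varies between GRIB tables; inspect the file first.
msg = grbs.select(name="Significant height of combined wind waves and swell")[0]
hs, lats, lons = msg.data(lat1=LAT1, lat2=LAT2, lon1=LON1, lon2=LON2)
print("Hs (m): min %.2f  max %.2f" % (hs.min(), hs.max()))
grbs.close()
```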

  9. WSR-88D Level II data: real-time access via the Iowa State server at http://mesonet.agron.iastate.edu/data/nexrd2/raw. These raw NEXRAD data files are compatible with the Advanced Regional Prediction System (ARPS, Xue et al. 2001) Data Analysis System (ADAS, Brewster et al. 1995). Test data have been downloaded and processed to the ARPS/ADAS analysis grid (see Data Assimilation section). NWS has requested that Level III data compatibility also be included in the suite of options. NWS employee Jason Burks dumped an 8-bit radial velocity file from AWIPS, but this is not a standard format and thus was not viewable using the NCDC Java viewer (the Level II data are viewable). Discussion?? Format change issues: Technical Implementation Notice 07-95, National Weather Service Headquarters, Washington DC, 1:18 PM EST, Mon 26 Nov 2007.
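A hedged sketch of a real-time Level II pull from the Iowa State server follows; the per-radar directory layout and the time-stamped file name are assumptions that would need to be verified against the server's actual structure.

```python
# Hedged sketch of a real-time WSR-88D Level II pull.  The per-radar directory
# and the time-stamped file name below are assumptions about the server layout.
import urllib.request, urllib.error
from datetime import datetime, timezone

BASE  = "http://mesonet.agron.iastate.edu/data/nexrd2/raw"
RADAR = "KMLB"                                    # Melbourne WSR-88D

now   = datetime.now(timezone.utc)
fname = "%s_%s" % (RADAR, now.strftime("%Y%m%d_%H%M"))     # assumed naming
url   = "%s/%s/%s" % (BASE, RADAR, fname)

try:
    urllib.request.urlretrieve(url, fname)
    print("retrieved", fname)
except urllib.error.HTTPError as err:
    print("no file available for this time:", err)
```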

  10. QuickSCAT Cursory analysis of data availability (1 June through 21 August 2007) for a data assimilation region from 75 W to 83 W and 24 N to 30 N (see 6 mos. report). If QuickSCAT data are to be used for the data assimilation (DA) component of this work (for our region), then the forecast analysis cycle will need to correspond to the ascending and descending node times of the satellite (near 00 and 12 UTC). DISCUSSION: TIMING ISSUE? Percent coverage within the domain varies, at these times, from near complete coverage to near zero on some days. Overall, data coverage is quite good (often greater than 50%). Data are pulled from http://www.opc.ncep.noaa.gov/grids/wdata/latest/grib1. Scripts have been completed and implemented, and these data are now flowing in real time to our server.
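As a sketch of the coverage bookkeeping described above, the fraction of the assimilation box (75-83 W, 24-30 N) covered by valid retrievals for one pass could be computed as follows; the input array names and the 0.25-degree binning are assumptions for illustration.

```python
# Sketch of the percent-coverage bookkeeping for the assimilation box.
# `lats`, `lons`, and `wind_speed` stand in for one QuickSCAT pass already read
# from its data file; the names and the 0.25-degree binning are assumptions.
import numpy as np

def percent_coverage(lats, lons, wind_speed,
                     box=(24.0, 30.0, -83.0, -75.0), grid_res=0.25):
    """Percent of grid_res cells inside the box containing a valid retrieval."""
    lat_min, lat_max, lon_min, lon_max = box
    valid = ((lats >= lat_min) & (lats <= lat_max) &
             (lons >= lon_min) & (lons <= lon_max) &
             np.isfinite(wind_speed))
    nlat = int(round((lat_max - lat_min) / grid_res))
    nlon = int(round((lon_max - lon_min) / grid_res))
    hist, _, _ = np.histogram2d(lats[valid], lons[valid], bins=[nlat, nlon],
                                range=[[lat_min, lat_max], [lon_min, lon_max]])
    return 100.0 * np.count_nonzero(hist) / hist.size
```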

  11. NAM 218 Tiles Retrieved from http://www.emc.ncep.noaa.gov/mmb/namgrids/. This process is automated for the WRF-EMS and will thus be shared with the data assimilation component. Tiles for both the WRF-EMS configuration (see Task 1 above) and for the ADAS assimilation (see Data Assimilation section) have been processed successfully for both systems. The NOMADS server (CONUS) will be accessed for research-mode work (case studies).
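A minimal sketch of the automated tile pull, assuming a simple cycle/tile/forecast-hour file-naming pattern on the EMC server (the pattern itself is an assumption to verify), is shown below.

```python
# Hedged sketch of the automated NAM-218 tile pull shared by WRF-EMS and ADAS.
# The tile numbers (16, 17) come from the text; the URL pattern and the file
# names are assumptions to be checked against the EMC server.
import urllib.request, urllib.error

BASE  = "http://www.emc.ncep.noaa.gov/mmb/namgrids"
CYCLE = "00"                  # 00/06/12/18 UTC NAM cycles
TILES = (16, 17)
FHRS  = range(0, 49, 3)       # 3-hourly output through 48 h

for tile in TILES:
    for fhr in FHRS:
        fname = "nam.t%sz.awip218.tile%02d.f%02d.grb" % (CYCLE, tile, fhr)  # assumed
        try:
            urllib.request.urlretrieve("%s/%s" % (BASE, fname), fname)
        except urllib.error.HTTPError:
            print("missing:", fname)
```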

  12. Task 3: Conversion of the PC wave model interface to Linux • The PC-based model consists of two components: • 1.) a spectral wave model and • 2.) a two-dimensional circulation model • These models are part of the Coastal Modeling System (CMS) developed by the Coastal and Hydraulics Laboratory (CHL) at the U.S. Army Engineer Research and Development Center (ERDC). • The following sub-tasks have been completed: • Conversion of the circulation model from PC to Linux FORTRAN • Conversion of the STWAVE model code from PC to Linux FORTRAN • Partial conversion of the wave model code to Linux FORTRAN • Acquisition of the pre-processing C++ and FORTRAN codes that generate boundary input files for the wave and circulation models.

  13. Remaining Wave Model Sub-tasks (Task 3 Cont’d) • In order to complete the conversion and test the wave/circulation model, the following sub-tasks remain: • Compilation of the spectral wave model code to Linux FORTRAN • Scripting of the codes to generate model input in a Linux-compatible format. • These scripts will process data streams from the WWIII model to set statistical wave parameters at the model boundaries, along with a spectral spread of wave energy across 35 frequency bins around the frequency carrying the peak wave energy (a sketch of one such spectral spread follows below).
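The sketch below illustrates one plausible way to spread bulk wave parameters (Hs, Tp) across 35 frequency bins using a JONSWAP-type shape; the bin placement, spectral shape, and example values are assumptions, and the actual CMS/STWAVE boundary-file format is not reproduced here.

```python
# Hedged sketch: spread bulk WWIII parameters (Hs, Tp) across 35 frequency bins
# around the peak using a JONSWAP-type shape.  The bin placement, the spectral
# shape, and the example values are assumptions.
import numpy as np

def spectral_bins(hs, tp, nbins=35, gamma=3.3):
    fp = 1.0 / tp                                    # peak frequency (Hz)
    freqs = np.linspace(0.5 * fp, 3.0 * fp, nbins)   # bins bracketing the peak
    sigma = np.where(freqs <= fp, 0.07, 0.09)
    peak = gamma ** np.exp(-(freqs - fp) ** 2 / (2.0 * sigma ** 2 * fp ** 2))
    shape = freqs ** -5 * np.exp(-1.25 * (freqs / fp) ** -4) * peak
    # Scale so the zeroth moment matches m0 = Hs^2 / 16 (preserves Hs).
    spec = shape * (hs ** 2 / 16.0) / np.trapz(shape, freqs)
    return freqs, spec

freqs, spec = spectral_bins(hs=2.0, tp=8.0)          # example boundary values
```

Scaling the shape so that its zeroth moment equals Hs^2/16 keeps the significant wave height supplied by WWIII unchanged at the boundary.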

  14. Task 4: Assimilation: WSR-88D and QuickSCAT winds • Successfully compiled the relevant components of the ARPS/ADAS package. • Installed and configured WRF-EMS to run with distributed-memory parallel processing on the high-end Linux machine. • Benchmarked WRF-EMS (see Task 1: WRF-EMS). • Match¹ the assimilation grid with the WRF-EMS domain. The assimilation grid is the same size/dimensions as the WRF outer grid (4.5 km horizontal resolution). This configuration is more than sufficient to take advantage of the QuickSCAT winds. • Have modified the WRF-EMS scripts to share the NAM panels (16, 17) for the DA component. ¹Changed since the 6-month report.

  15. Task 4 Cont’d… • Created code to convert QuickSCAT data from NetCDF format to the ADAS-friendly LSO format (a sketch of this step follows below). The code has been tested and the resulting data file (QuickSCAT 10 m winds) ingested into ADAS successfully. • Successfully mapped WSR-88D radial winds to the ADAS domain. • Created a QuickSCAT error table for ADAS. This error table will likely be adjusted once we begin the data assimilation experiments. • We are currently working on mapping the ARPS DA grid to the WRF-EMS grid. The ARPS/ADAS package has ARPS-to-WRF conversion software (ARPS2WRF), which has been compiled and is being tested at the time of this report. Once this step is completed, the atmospheric component will be ready for preliminary testing.
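A hedged sketch of the NetCDF-to-LSO conversion step is given below; the NetCDF variable names and the simplified record layout are assumptions, and the operational code follows the full ADAS LSO specification.

```python
# Hedged sketch of the NetCDF-to-LSO conversion: read QuickSCAT 10 m wind
# retrievals and write one surface "observation" per valid cell.  The NetCDF
# variable names and the simplified record layout are assumptions.
from netCDF4 import Dataset
import numpy as np

nc   = Dataset("qscat_winds.nc")                                     # assumed input file
lat  = np.ma.filled(nc.variables["lat"][:], np.nan).ravel()
lon  = np.ma.filled(nc.variables["lon"][:], np.nan).ravel()
spd  = np.ma.filled(nc.variables["wind_speed"][:], np.nan).ravel()   # m/s
wdir = np.ma.filled(nc.variables["wind_dir"][:], np.nan).ravel()     # degrees

with open("qscat.lso", "w") as out:
    for y, x, d, s in zip(lat, lon, wdir, spd):
        if np.isfinite(s):
            # station-id   lat       lon      dir    speed   (simplified record)
            out.write("QSCAT %8.3f %9.3f %6.1f %6.1f\n" % (y, x, d, s))
nc.close()
```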

  16. Issues: ADAS will use the same terrain as the WRF-EMS.

  17. WRF Input • ADAS/ARPS 5.2.8 is compatible with WRF V2.2, not V2.1. • Which winds are the correct winds: grid-relative or earth-relative? [Figure: 10 m winds from the ARPS2WRF wrfinput_d01 and the WRF wrfinput_d01; NAM grid-relative (GREL) vs. NAM earth-relative (EREL).]
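Since the GREL/EREL question above hinges on rotating grid-relative NAM winds to earth-relative, a sketch of that rotation for a Lambert conformal grid is shown below. The reference longitude, true latitude, and sign convention are assumptions to verify against the grid 218 GRIB header; the formula is one commonly used form, not necessarily the one in the project code.

```python
# Hedged sketch of rotating grid-relative (GREL) Lambert conformal winds to
# earth-relative (EREL).  The reference longitude and true latitude are the
# commonly cited NAM grid 218 values, and the sign convention is one common
# form; both should be verified against the GRIB header before use.
import numpy as np

def grid_to_earth(u_grid, v_grid, lon, ref_lon=-95.0, true_lat=25.0):
    cone  = np.sin(np.radians(true_lat))          # cone constant, tangent LC grid
    alpha = np.radians(cone * (lon - ref_lon))    # local grid rotation angle
    u_earth =  u_grid * np.cos(alpha) + v_grid * np.sin(alpha)
    v_earth = -u_grid * np.sin(alpha) + v_grid * np.cos(alpha)
    return u_earth, v_earth
```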

  18. Interpolation: perturbation potential temperature (θ); 32 levels (NAM) → 45 levels (WRF); σ = 0.7845; NAM → ADAS → WRF. Originally wanted coarse resolution for the ADAS vertical grid (~19 levels!)
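A minimal sketch of the vertical remapping implied above (a 32-level NAM profile onto 45 WRF levels) using linear interpolation in sigma is shown below; the level placements and the use of sigma as the interpolation coordinate are illustrative assumptions.

```python
# Hedged sketch of the vertical remapping: interpolate a 32-level NAM profile of
# perturbation potential temperature onto 45 WRF levels, linearly in sigma.
# The level placements and the interpolation coordinate are assumptions.
import numpy as np

def interp_profile(theta_pert_nam, sigma_nam, sigma_wrf):
    order = np.argsort(sigma_nam)                 # np.interp needs increasing x
    return np.interp(sigma_wrf, sigma_nam[order], theta_pert_nam[order])

sigma_nam = np.linspace(1.0, 0.05, 32)            # illustrative NAM levels
sigma_wrf = np.linspace(1.0, 0.05, 45)            # illustrative WRF levels
theta_nam = np.random.randn(32)                   # placeholder profile
theta_wrf = interp_profile(theta_nam, sigma_nam, sigma_wrf)
```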

  19. [Diagram: real-time system flow. NAM-218 tiles, WSR-88D Level II data, and QuickSCAT winds feed ADAS; the ADAS analysis passes through ARPS2WRF into WRF-EMS (forecast hours 0, 3, 6, ..., 48); WWIII output drives the wave model.]
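Purely as an illustration of how the steps in this diagram might be sequenced in one real-time cycle, a hedged driver sketch follows; every script name is a placeholder, not part of the delivered system.

```python
# Hedged sketch of sequencing one real-time cycle implied by the diagram.
# Every script name below is a placeholder for illustration only.
import subprocess

STEPS = [
    ["./get_nam218_tiles.sh"],    # NAM-218 tiles (panels 16, 17)
    ["./get_wsr88d_level2.sh"],   # WSR-88D Level II files
    ["./get_qscat_winds.sh"],     # QuickSCAT 10 m winds
    ["./run_adas.sh"],            # ADAS analysis on the 4.5 km grid
    ["./run_arps2wrf.sh"],        # convert the ADAS analysis to wrfinput_d01
    ["./run_wrf_ems.sh", "48"],   # 48 h WRF-EMS forecast
    ["./run_wave_model.sh"],      # nearshore wave model driven by WWIII
]

for cmd in STEPS:
    subprocess.run(cmd, check=True)   # abort the cycle if any step fails
```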
