Fleet Numerical Meteorology & Oceanography Center
Command Overview – Presented to Committee Operational Processing Centers, 13 May 2009
CAPT John Kusters, Commanding Officer
Mike Clancy, Technical Director
Fleet Numerical… Supercomputing Excellence for Fleet Safety and Warfighter Decision Superiority…
Outline
• Introduction
• Overview of Operations
• Overview of POPS and A2
• Cross Domain Solutions
• NOGAPS Way Ahead
• MILCON Project
• Summary
Operational Excellence
• The United States Navy is committed to excellence in operational meteorology and oceanography (METOC): Fleet Safety, Decision Superiority, Information Assurance
• We are stakeholders and partners in National capability: a key part of the Nation's ocean & weather infrastructure, and we expect to be a major player in NUOPC
• Near-term goals (CNMOC): tighter link between forecasts and decisions; Predictive Oceanography
• We are the Navy's Operational Science Community
FNMOC Mission
• We produce and deliver weather, ocean, and climate information for Fleet Safety, Warfighting Effectiveness, and National Defense.
• Numerical Weather Prediction (NWP) is the core of our business.
  • Global and regional operational models
  • Assimilate meteorological and oceanographic data worldwide (~6 million observations daily)
  • Scheduled and on-demand products
  • 0 to 240 hour forecasts, updated at 6-hour intervals (see the cycle sketch below)
  • Specific to Fleet and Joint operations
• 24x7 Operational Reachback Center
  • Supporting National, Navy, DoD/Joint, and Coalition missions
• Direct support for global submarine weather
• Data fusion for planning and operations
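To make the product cadence above concrete, here is a minimal bookkeeping sketch for one 0–240 hour forecast cycle. The 6-hourly output step and the function name are illustrative assumptions, not a description of the actual Ops Run.

```python
# Illustrative sketch: valid times for one 0-240 hour forecast cycle.
# The 6-hourly tau step is an assumption made for this example only.
from datetime import datetime, timedelta

def cycle_valid_times(base_time, max_tau=240, step=6):
    """Return (tau, valid_time) pairs for a single model cycle."""
    return [(tau, base_time + timedelta(hours=tau))
            for tau in range(0, max_tau + 1, step)]

taus = cycle_valid_times(datetime(2009, 5, 13, 0))   # hypothetical 00Z 13 May 2009 cycle
print(len(taus))     # 41 output times per cycle
print(taus[-1])      # (240, datetime.datetime(2009, 5, 23, 0, 0)) -- ten days out
```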
Focus on Strategy
• Mission: We provide and deliver weather, ocean, and climate information for Fleet Safety, Warfighting Effectiveness, and National Defense.
• Vision: To be the First Choice for environmental prediction and assessment in direct support of National Defense.
• Values: People, Teamwork, Communications, Innovation, Technical Excellence
• Basic Strategy: Best-Cost Provider
Recent Accomplishments
• Commenced FNMOC models Ops Run on the Linux Cluster (operational since October 2008), and retired the legacy SGI supercomputers
• Commenced operational support for the North American Ensemble Forecast System (NAEFS), with NOGAPS Ensemble fields delivered daily to NCEP
• Upgraded the NOGAPS Ensemble with Ensemble Transform (ET) initialization
• Implemented the coupled air-sea version of the GFDN tropical cyclone model
• Commenced the Submarine Enroute Weather Forecast (SUBWEAX) mission at SECRET and TS/SCI levels
• Hosted and upgraded the Naval Oceanography Portal (NOP), the single unclassified and classified operational Web presence for all CNMOC activities
• Delivered the first spiral of the Automated Optimum Track Ship Routing System (AOTSR)
• Completed the A76 source selection process for IT Services, resulting in selection of the government's Most Efficient Organization (MEO) bid as the winning proposal; implemented the IT Services Department (ITSD) on February 1, 2009
CNMOC Oceanographic Operations Watch (COOW)
• Global and 24x7 operational watch presence, with administrative reporting
• Coordinates and provides support for all time-sensitive METOC RFIs
• Fleet Ops Reachback: Oceanography Portal; Fleet products (AREPS, TAWS, staff briefs, exercise support); tailored support (STRATCOM, NOPF Whidbey, NSW, MSC); supporting all forward-deployed METOC assets
• ISR Reachback: TAM, JFCC ISR
• Submarine Weather Direct Support
Models
Fleet Numerical operates a highly integrated and cohesive suite of global, regional, and local state-of-the-art weather and ocean models:
• Navy Operational Global Atmospheric Prediction System (NOGAPS)
• Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS)
• Navy Atmospheric Variational Data Assimilation System (NAVDAS)
• Navy Aerosol Analysis and Prediction System (NAAPS)
• GFDN Tropical Cyclone Model
• WaveWatch 3 (WW3) Ocean Wave Model
• Navy Coupled Ocean Data Assimilation (NCODA) System
• Ensemble Forecast System (EFS)
(Figures: surface pressure and clouds predicted by NOGAPS; cross section of temperature and wind speed from COAMPS showing mountain waves over the Sierras and Owens Valley.)
Satellite Products
• SATFOCUS
• SSMI and SSMI/S
• Scatterometer
• Tropical Cyclone Web Page
• Target Area METOC (TAM)
• Tactically Enhanced Satellite Imagery (TESI)
(Example SATFOCUS products shown: SSM/I Water Vapor, SSM/I Wind Speed, SATFOCUS Dust Enhancement Product.)
Models and Applications
NWP (NOGAPS, COAMPS) and the ensemble models feed a broad range of downstream applications:
• Ocean acoustic forecasting
• Aircraft routing
• Automated high seas / wind warnings
• Visibility/dust forecasts
• Aerosol modeling
• Optimum track ship routing
• Electro-optical forecasts
• Search and rescue
• Target weapon systems
• Ice forecasts
• CEEMS
• Tropical cyclone forecasts
• Long-range planning
• WRIP
• Ballistic wind computations
Primary Oceanographic Prediction System (POPS)
• Program of Record (ACAT IAD program in sustainment)
• Provides HPC platforms to run models and applications at the UNCLAS, SECRET, and TS/SCI levels
• Has traditionally been composed of two subsystems:
  • Analysis and Modeling Subsystem (AMS)
    • Models (NOGAPS, COAMPS, WW3, etc.)
    • Data assimilation (NAVDAS, NAVDAS-AR)
  • Applications, Transactions, and Observations Subsystem (ATOS)
    • Model pre- and post-processing (e.g., observational data prep/QC, model output formatting, model visualization and verification)
    • Full range of satellite processing (e.g., SATFOCUS products, TC Web Page, DMSP, TAM)
    • Full range of METOC applications and services (e.g., OPARS, WebSAR, CAGIPS, AREPS, TAWS, APS, ATCF, AOTSR)
    • Hosting of the NOP Portal (single Web presence for Naval Oceanography)
New POPS Architecture Strategy
Drivers:
• Knowledge-centric CONOPS, requiring highly efficient reach-back operations, including low-latency on-demand modeling
• Growing HPC dominance of Linux clusters based on commodity processors and commodity interconnects
• Resource ($) efficiencies
Solution:
• Combine AMS and ATOS functionality into a single system
• Make the system vendor neutral
• Use Linux-cluster technology based on commodity hardware
Advantages:
• Efficiency for reach-back operations and on-demand modeling
• Capability to surge capacity back and forth among the AMS (BonD Tier 1) and ATOS (BonD Tiers 0, 2, 3) applications as needed (see the toy sketch below)
• Cost effectiveness of Linux and commodity hardware vice proprietary operating systems and hardware
• Cost savings by converging to a single operating system (Linux)
We call the combined AMS and ATOS functionality A2 (AMS + ATOS = A2).
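To make the "surge capacity" advantage concrete, here is a toy bookkeeping sketch of a single combined node pool being reallocated between the modeling (Tier 1) and applications (Tiers 0/2/3) workloads. The class, tier labels, and node counts are purely illustrative assumptions and do not represent FNMOC's actual resource manager.

```python
# Toy illustration of surging nodes between AMS-style and ATOS-style workloads
# within one combined pool; not FNMOC's actual scheduler or configuration.

class NodePool:
    def __init__(self, total_nodes):
        self.free = total_nodes
        self.allocated = {"AMS (BonD Tier 1)": 0, "ATOS (BonD Tiers 0/2/3)": 0}

    def surge(self, workload, nodes):
        """Move free nodes to a workload, e.g. for an on-demand model run."""
        if nodes > self.free:
            raise ValueError("not enough free nodes to surge")
        self.free -= nodes
        self.allocated[workload] += nodes

    def release(self, workload, nodes):
        """Return nodes to the free pool when a workload winds down."""
        self.allocated[workload] -= nodes
        self.free += nodes

pool = NodePool(total_nodes=232)              # e.g. an A2-0 sized pool (illustrative)
pool.surge("AMS (BonD Tier 1)", 160)          # scheduled Ops Run
pool.surge("ATOS (BonD Tiers 0/2/3)", 40)     # reach-back applications
pool.release("AMS (BonD Tier 1)", 60)         # Ops Run finishes its heavy phase
pool.surge("ATOS (BonD Tiers 0/2/3)", 80)     # surge toward on-demand work
print(pool.free, pool.allocated)
```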
A2 Hardware Specifications
A2-0 (Opal) Linux Cluster:
• 232 nodes with 2 Intel Xeon 5100-Series "Woodcrest" dual-core processors per node (928 processor cores total)
• 1.86 TB memory (8 GB per node)
• 10 TFLOPS peak speed
• 35 TB disk space with 2 GB per second throughput
• Double Data Rate InfiniBand interconnect at 20 Gb per second
• 40 Gb per second connection to external Cisco network
• 4 ClearSpeed floating-point accelerator cards
A2-1 (Ruby) Linux Cluster:
• 146 nodes with 2 Intel Xeon 5300-Series "Clovertown" quad-core processors per node (1168 processor cores total)
• 1.17 TB memory (8 GB per node)
• 12 TFLOPS peak speed
• 40 TB disk space with 2 GB per second throughput
• Double Data Rate InfiniBand interconnect at 20 Gb per second
• 40 Gb per second connection to external Cisco network
• 2 GRU floating-point accelerator cards
(The core and memory totals follow directly from the node counts; see the check below.)
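As a quick sanity check on the totals quoted above, the core counts and aggregate memory follow directly from the per-node figures. This is a minimal sketch; the helper function name is illustrative, and the numbers are taken from the bullets above.

```python
# Sanity-check the A2 cluster totals quoted above (illustrative only).

def cluster_totals(nodes, sockets_per_node, cores_per_socket, mem_gb_per_node):
    """Return (total cores, total memory in TB) for a homogeneous cluster."""
    cores = nodes * sockets_per_node * cores_per_socket
    mem_tb = nodes * mem_gb_per_node / 1000.0   # vendor-style decimal TB
    return cores, mem_tb

# A2-0 (Opal): 232 nodes, 2 dual-core "Woodcrest" Xeons per node, 8 GB per node
print(cluster_totals(232, 2, 2, 8))   # -> (928, 1.856)  ~ "928 cores, 1.86 TB"

# A2-1 (Ruby): 146 nodes, 2 quad-core "Clovertown" Xeons per node, 8 GB per node
print(cluster_totals(146, 2, 4, 8))   # -> (1168, 1.168) ~ "1168 cores, 1.17 TB"
```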
A2 Middleware
Beyond the hardware components, A2 depends on a complex infrastructure of supporting systems software (i.e., "middleware"):
• SUSE Linux Enterprise Server operating system, soon to transition to Red Hat Linux V5
• GPFS: IBM's General Parallel File System, commercial software that provides a single file system across all of the nodes of the system
• ISIS: Fleet Numerical's in-house database designed to rapidly ingest and assimilate NWP model output files and external observations
• PBSPro: job scheduling software (see the submission sketch below)
• ClusterWorx: system management software
• Various MPI debugging tools, libraries, performance profiling tools, and utilities
• TotalView: FORTRAN/MPI debugging tool
• PGI's Cluster Development Kit, which includes parallel FORTRAN, C, and C++ compilers
• Intel FORTRAN compiler
• VMware ESX Server
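As a concrete illustration of the job-scheduling layer, the sketch below assembles and submits a hypothetical PBSPro batch job for a small MPI run. The job name, queue, resource request, and executable path are assumptions made for the example; they do not reflect FNMOC's actual Ops Run configuration.

```python
# Hypothetical PBSPro submission sketch. Queue name, resources, and the
# executable are illustrative assumptions, not FNMOC's configuration.
import subprocess
import tempfile

PBS_SCRIPT = """#!/bin/bash
#PBS -N nwp_post_test
#PBS -q batch
#PBS -l select=4:ncpus=8:mpiprocs=8
#PBS -l walltime=01:00:00
cd $PBS_O_WORKDIR
mpirun -np 32 ./model_post.exe
"""

def submit(script_text):
    """Write the script to a temporary file, hand it to qsub, return the job id."""
    with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
        f.write(script_text)
        path = f.name
    result = subprocess.run(["qsub", path], capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print("Submitted:", submit(PBS_SCRIPT))
```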
FNMOC HPC Systems
Systems are linked directly to ~230 TB of disk space and ~160 TB of tape archive space.
Legend for the system diagram:
• FS = File Server / Legacy Cross Domain Solution
• ATOS = Applications, Transactions, and Observations Subsystem
• CAAPS = Centralized Atmospheric Analysis and Prediction System
• A2 = Combined AMS / ATOS System
Cross Domain Solutions
• In order to perform its mission, FNMOC fundamentally requires a robust, high-bandwidth Cross Domain Solution (CDS) system to allow two-way flow of data between the UNCLAS and SECRET enclaves within its HPC environment
• FNMOC's existing CDS system is nearing the end of its lifecycle
  • SGI computer hardware (FS1, FS2)
  • Trusted IRIX (TRIX) multi-level secure operating system
• The DoD Unified Cross Domain Management Office (UCDMO) has recently indicated that all legacy CDS systems must be replaced by one meeting their approval
• Existing UCDMO solutions do not meet FNMOC's bandwidth requirements (currently ~1 GB/sec, increasing to ~64 GB/sec by 2013)
CDS Way Ahead
• FNMOC is currently partnered with the National Security Agency (NSA) to achieve an acceptable CDS that meets high-bandwidth requirements
• Seeking an interim solution based on adaptation of one of the 14 existing UCDMO-approved solutions, allowing phased migration to the long-term solution
• Seeking a long-term solution based on the NSA High Assurance Platform (HAP) project
• Expect that the resulting solution will also meet NAVO's requirements
• A joint NSA/FNMOC presentation at the Oct 2008 DISN Security Accreditation Working Group (DSAWG) resulted in approval of the proposed CDS way ahead described above
UKMO Engagement
• CNMOC/FNMOC and NRL explored the possibility of a partnership agreement with UKMO to obtain the UKMO Unified Model (UM) as a new baseline for development of the Navy's operational global NWP capability
• Exploratory discussions took place between CNMOC (RDML Titley, Steve Payne), NRL (Simon Chang), and UKMO senior leadership in Exeter, Sept 2009
• NRL obtained a Research License for the UM code and performed initial testing on FNMOC HPC platforms in late 2008/early 2009
• Issues:
  • UM throughput performance is very poor on the FNMOC Linux Cluster architecture
  • Navy IA policies created significant impediments for working with the UM code
  • Business and licensing issues are a concern
• NRL will test the UM on the IBM architecture at NAVO/MSRC
• Remote operation of a UM-based ensemble at NAVO/MSRC may be a possibility
Current Thinking on Navy Global NWP
• Fully support NUOPC
• Continue operational transition of the most promising near-term NOGAPS upgrades:
  • 4DVAR (NAVDAS-AR); see the cost-function note below
  • Semi-Lagrangian
• Continue exploring the possibility of running an operational UKMO/UM-based ensemble (NAVO/MSRC)
• Seek partnerships for development of a next-generation global NWP model for operational implementation at FNMOC ~2016-2018
• Link to growing Navy interest in: (1) the Arctic, (2) climate change, and (3) energy conservation
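For context on the 4DVAR item above: 4DVAR schemes select the initial state that minimizes a cost function of the standard form below over an assimilation window. This is the generic textbook formulation, included here only as background; it is not a statement of NAVDAS-AR's specific (observation-space, accelerated-representer) solution algorithm.

$$
J(\mathbf{x}_0) = \tfrac{1}{2}\,(\mathbf{x}_0-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\sum_{i=0}^{N} \bigl(H_i(\mathbf{x}_i)-\mathbf{y}_i\bigr)^{\mathrm{T}}\mathbf{R}_i^{-1}\bigl(H_i(\mathbf{x}_i)-\mathbf{y}_i\bigr),
\qquad \mathbf{x}_i = M_{0\to i}(\mathbf{x}_0),
$$

where $\mathbf{x}_b$ is the background state, $\mathbf{B}$ and $\mathbf{R}_i$ are the background and observation error covariances, $\mathbf{y}_i$ and $H_i$ are the observations and observation operators at time slot $i$ of the window, and $M_{0\to i}$ is the forecast model propagating the initial state to that time.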
Status and Milestones for 4DVAR (NAVDAS-AR)
• NOGAPS/NAVDAS-AR model top raised from 1.0 hPa to 0.04 hPa with the NOGAPS T239L42 implementation
• Began assimilating AIRS, IASI, and SSMI/S radiances
• Produced the following improvements relative to the operational NOGAPS/NAVDAS (3DVAR) implementation:
  • ~6 hours improvement in 500 mb height anomaly correlation at TAU120 (see the scoring sketch below)
  • ~50 nm improvement in TC track performance at TAU120 (based on Aug/Sep 2008 cases)
• Final NAVDAS-AR beta testing began 16 Mar 2009 and will complete in Jul 2009
• Expect operational implementation in Aug 2009
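Since the 500 mb height anomaly correlation is the headline verification metric in the comparison above, here is a minimal sketch of how such a score is computed. This is the standard centered definition; the fields are randomly generated, unweighted placeholders rather than real NOGAPS output.

```python
# Standard centered anomaly correlation between a forecast and verifying
# analysis, with anomalies taken relative to climatology (illustrative sketch).
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Centered anomaly correlation coefficient over a grid (e.g. 500 mb height)."""
    f = np.asarray(forecast) - np.asarray(climatology)    # forecast anomaly
    a = np.asarray(analysis) - np.asarray(climatology)    # analysis anomaly
    f = f - f.mean()                                      # remove mean anomaly
    a = a - a.mean()
    return float(np.sum(f * a) / np.sqrt(np.sum(f**2) * np.sum(a**2)))

# Toy example on a random grid (real scoring would use area-weighted fields):
rng = np.random.default_rng(0)
clim = rng.normal(5500.0, 100.0, size=(73, 144))    # placeholder 500 mb heights (m)
truth = clim + rng.normal(0.0, 50.0, size=clim.shape)
fcst = truth + rng.normal(0.0, 20.0, size=clim.shape)
print(round(anomaly_correlation(fcst, truth, clim), 3))
```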
Impact of NAVDAS-AR on NOGAPS TC Track Skill
(Chart: TC track error in nm by forecast lead time for 01 August 2008 – 17 September 2008, with the number of verifying forecasts at each lead time ranging from 150 down to 23.)
MILCON Project
• $9.3M, 14,351 sq ft expansion/renovation of FNMOC Building 700 (Computer Center Building)
  • Adds 2,500 sq ft to the Computer Center for NPOESS IDPS
  • Expands the Ops Floor from 1,550 to 3,200 sq ft
  • Expands the SCIF Ops Floor from 1,947 to 3,012 sq ft
  • Expands the SCIF Computer Center from 875 to 2,240 sq ft
  • Expands the Auditorium from 47 to 100 seats
• Schedule:
  • Design/Build contract awarded Sep 2008
  • Groundbreaking May 27, 2009
  • Completion approximately Oct 2010
FNMOC MILCON Project
(Image: FNMOC Building 700 Computer Center.)
Summary
• Several of our recent accomplishments are of COPC interest:
  • Achieved IOC for the A2 Linux Cluster, the new host for the FNMOC models Ops Run
  • Commenced operational support for NAEFS, with NOGAPS Ensemble fields delivered daily to NCEP
  • Upgraded the NOGAPS Ensemble with ET initialization
  • Implemented the coupled air-sea version of the GFDN TC model
• We have embraced Linux-based HPC as we continue to build out our A2 cluster to accommodate the full range of our operational workload
• We are partnered with NSA to achieve a DoD-approved Cross Domain Solution capable of meeting our operational throughput requirements
• Along with ONR, NRL, CNMOC, and OPNAV N84, we are exploring the way ahead for Navy Numerical Weather Prediction
• We will break ground on the $9.3M MILCON project in June to expand and renovate the Building 700 Computer Center
Top Five FY09 Objectives
1. Implement the A76 Most Efficient Organization (MEO) as the Information Technology Services Department (ITSD).
2. Achieve Full Operational Capability (FOC) for the A2-0 (Opal), A2-1 (Ruby), and A2-2 (Topaz) Linux Cluster systems.
3. Achieve and maintain required Information Assurance (IA) accreditations, and implement a DoD-approved Cross Domain Solution (CDS) for transferring data between classification levels.
4. Design and install satellite processing systems in preparation for the launch of NPOESS and NPP.
5. Increase the skill of METOC model products through implementation of new models, upgrades to existing models, and the assimilation of additional observational data from new sources and sensors.
Additional FY09 Objectives
• Execute the Building 700 MILCON Project while maintaining at least 98.5% uptime for operations.
• Maintain excellence in Submarine Enroute Weather Forecast (SUBWEAX) support.
• Achieve IOC for the Automated Optimum Track Ship Routing (AOTSR) system.
• Field and support the next-generation version of the Naval Oceanography Portal (NOP).
• Meet all requirements for the Weather Reentry-Body Interaction Planner (WRIP) project.
• Meet all requirements for the North American Ensemble Forecast System (NAEFS) project in preparation for full engagement in the National Unified Operational Prediction Capability (NUOPC) initiative.
• Complete all Cost of War (COW) projects.
• Maintain excellence in climatology support.
• Implement the Resource Protection component of the new Weather Delivery Model.
• Develop a robust Continuity of Operations (COOP) plan for key capabilities.
• Complete all Primary Oceanographic Prediction System (POPS) Program Decision Board (PDB) actions.
• Achieve a Project Management culture aligned to the POPS program.