
Implementing Marine XML for NOAA Observing Data


Presentation Transcript


  1. Implementing Marine XML for NOAA Observing Data Nazila Merati and Eugene Burger NOAA/Pacific Marine Environmental Laboratory Seattle, WA

  2. Ocean observation systems (OOS) and data transformations • PMEL and its partners carry out several large-scale ocean monitoring programs (TAO, FOCI, Tsunami, Argo) with hundreds of instruments and thousands of observations.

  3. Data transformations and data sharing • Scientists spend a great majority of their time formatting data, transferring data and developing transfer protocols before starting data analysis and sharing.

  4. History of the project • In 2003, we went to the ESRI User Conference and heard that XML would play an important role in the new ArcGIS architecture. • Also in 2003, NOAA IT and web developers began to discuss the need for standards when using XML. • We found that others had already begun defining XML standards for marine observational data – XBTs and meteorological observations. • We identified a project at PMEL and NODC that we could use to test XML – Argo Profiling Floats.

  5. Test case – Argo Profiling Float Data • The Argo program is a broad-scale global ocean observing system for temperature and salinity • Floats are designed to drift at a fixed pressure for a fixed period and then move to a profiling pressure. As they rise, they collect profiles of pressure, temperature and salinity on their way to the surface. • Each float can produce up to 150 profiles during its lifetime
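
The profile structure described above can be sketched as a small XML document. This is an illustrative encoding only; the element names (`profile`, `level`, `pressure`, and so on) are assumptions for the sketch, not the actual MarineXML or Argo schema:

```python
import xml.etree.ElementTree as ET

def build_profile_xml(float_id, cycle, levels):
    """Serialize one Argo-style profile (pressure/temperature/salinity
    triples) into an illustrative XML document (hypothetical tag names)."""
    root = ET.Element("profile", attrib={"floatId": float_id, "cycle": str(cycle)})
    for pres, temp, psal in levels:
        level = ET.SubElement(root, "level")
        ET.SubElement(level, "pressure", units="dbar").text = str(pres)
        ET.SubElement(level, "temperature", units="degC").text = str(temp)
        ET.SubElement(level, "salinity", units="psu").text = str(psal)
    return ET.tostring(root, encoding="unicode")

# Two levels of a made-up profile for float "5900123", cycle 42
xml_doc = build_profile_xml("5900123", 42,
                            [(5.0, 28.1, 35.2), (100.0, 22.4, 35.6)])
```

A real schema would also carry position, date and quality information per level; the point here is only that a profile maps naturally onto nested elements with units as attributes.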

  6. Current Argo Profiling Float deployments

  7. Why use Argo Profiling Floats? • Argo Profiling Floats were selected because they are a good example of a profiling data type, have only a few attributes associated with each profile and are integral to the ocean observing system architecture • NODC’s data manager is interested in using XML as a method of data sharing between Argo Data Centers • Integration of Argo Profile data into GIS will introduce the data to a new group of users who are more GIS-savvy

  8. Project objectives • To work with NODC to identify key data sets and define which file formats to work with • Using data specifications from example libraries, to define optimal parameters for a MarineXML dictionary • To build schema that works the best with Floats • To test the schema, assess bandwidth versus file size issues, and test feasibility of web transfers • To load data into ArcGIS 9.x and personal geodatabase • To test the metadata creator in ArcGIS for compatibility with MarineXML schema

  9. Why use XML? • Platform-independent tool that makes data exchange and communication between organizations easier – it’s not just for science • Extensible • Data management – allows preservation of data, metadata, quality flags and edited data • It can serve as the “basis of a data management framework”
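
One of those management points, keeping quality flags alongside the values, can be sketched with XML attributes. The tag names here are illustrative; the flag convention ("1" = good, "4" = bad) follows the Argo real-time QC flag scale:

```python
import xml.etree.ElementTree as ET

# Illustrative record: each value carries its own quality-control flag,
# so edits and flags travel with the data instead of in a side file.
DOC = """
<profile floatId="5900123">
  <temperature qc="1" units="degC">28.1</temperature>
  <temperature qc="4" units="degC">-99.9</temperature>
  <temperature qc="1" units="degC">22.4</temperature>
</profile>
"""

root = ET.fromstring(DOC)
# Filter to good data without discarding the flagged original values
good = [float(t.text) for t in root.iter("temperature") if t.get("qc") == "1"]
```

The bad value (-99.9, flagged "4") stays in the document for provenance; consumers simply select on the flag.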

  10. Different types of XML • Marine XML – different flavors – strong community component, in use since 2001 • ESML – Earth Science Markup Language – works with netCDF and HDF, good community backing • Borehole XML – useful because it has a z-value, but may be better as a descriptor of the actual location rather than of the data – boreholes are also shallower than the float data, and it is unclear how borehole data are stored (mostly text files) • NcML – good open-source standard, data are already stored as netCDF (the marine standard for data storage), but most translators are for gridded data • SensorML – still being written within the OGC specs, again more of a descriptor, not necessarily for data transfer

  11. How does this fit in with GIS? • PMEL already has effective ways of getting data into GIS, but they still require data formatting and the use of several different packages. • The output is a shapefile to be used by ESRI products.

  12. ESRI and the geodatabase • Changes are happening in the way GIS data are managed. We are moving from shapefiles and coverages to geodatabases for data management and data storage. • Geodatabases can be useful if you are building data models for a specific industry.

  13. Geodatabases and data transformations • ArcGIS 9 has the ability to take XML-encapsulated data and transfer the data into the geodatabase. • The Geodatabase XML format allows you to import and export items and data to/from the geodatabase. Transfers include domains, rules, topologies and behaviors.
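
Before such an import, profile XML typically has to be flattened into per-level feature records (one point row per measurement). A minimal sketch of that flattening step, using hypothetical element names rather than the actual Geodatabase XML workspace format:

```python
import xml.etree.ElementTree as ET

# Illustrative profile document; attribute and tag names are assumptions
DOC = """
<profile floatId="5900123" lat="-5.0" lon="165.0">
  <level><pressure>5.0</pressure><temperature>28.1</temperature></level>
  <level><pressure>100.0</pressure><temperature>22.4</temperature></level>
</profile>
"""

root = ET.fromstring(DOC)
# One flat row per level: position repeated, depth carried as an attribute,
# ready to load as point features with a z-value field
rows = [
    {
        "float_id": root.get("floatId"),
        "lat": float(root.get("lat")),
        "lon": float(root.get("lon")),
        "pressure_dbar": float(lvl.findtext("pressure")),
        "temp_degc": float(lvl.findtext("temperature")),
    }
    for lvl in root.iter("level")
]
```

A real workflow would then wrap these rows in the Esri workspace document schema (or write them to a table for geoprocessing); the flattening logic is the part that stays the same.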

  14. Proposed solution

  15. Potential problems • netCDF to ncML or to MarineXML – no need to do a double jump • Bandwidth and compression, XML may supersize data • What if the geodatabase goes away? • Is MarineXML better than ESML? • Can it be applied to legacy data?
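
The "supersize" concern above can be made concrete. The sketch below encodes one 150-level profile both as packed binary doubles and as verbose XML (illustrative tag names), then gzips the XML; the tag overhead inflates the payload several-fold, and compression claws much of it back:

```python
import gzip
import struct
import xml.etree.ElementTree as ET

# 150 (pressure, temperature, salinity) levels: roughly one float profile
levels = [(float(i), 28.0 - i * 0.02, 35.0 + i * 0.001) for i in range(150)]

# Binary baseline: three 8-byte doubles per level
binary = b"".join(struct.pack("<3d", *lvl) for lvl in levels)

# XML encoding of the same values (hypothetical tag names)
root = ET.Element("profile")
for p, t, s in levels:
    lvl = ET.SubElement(root, "level")
    ET.SubElement(lvl, "p").text = repr(p)
    ET.SubElement(lvl, "t").text = repr(t)
    ET.SubElement(lvl, "s").text = repr(s)
xml_bytes = ET.tostring(root)

sizes = {
    "binary": len(binary),                       # 150 * 24 = 3600 bytes
    "xml": len(xml_bytes),                       # markup-inflated
    "xml_gzip": len(gzip.compress(xml_bytes)),   # compressed for transfer
}
```

This is the trade at the heart of the bandwidth question: raw XML costs more bytes on the wire, but compressing at the transport layer narrows the gap considerably.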

  16. Next steps • Look at MarineXML specifications and meet with AODC folks next week. Identify key data sets to test and talk to ESRI about netCDF translators. • Determine if the existing EPIC in-situ XML DTD and schemas can be modified to work with Argo Profiling Floats and can handle data. • Test the new XSLT tools from ESRI and test transfer times – is this just as clunky as before?

  17. Questions and suggestions?
