
HDF, EOSDIS, NASA ESE Data Standards

Explore the latest updates on ESDIS status, HDF maintenance, HDF-EOS integration, lessons learned, and the 2004 EOSDIS Satisfaction Survey results.

Presentation Transcript


  1. HDF, EOSDIS, NASA ESE Data Standards Richard Ullman

  2. Agenda • ESDIS status with respect to HDF • EOSDIS 2004 American Customer Satisfaction Index survey • NASA Earth Science standards endorsement process

  3. ESDIS Status • Launch of Aura (July 25) marks the end of the development phase of the EOSDIS Core System (ECS). • The system is now in maintenance; capability refinements are made under the “Synergy” program. • Data centers are now running the “Synergy 3” release and will be transitioning to “Synergy 4” over the next six months. • Maintenance of HDF for EOS includes two components: • Support of NCSA’s HDF group through a cooperative agreement. • Support of HDF-EOS through the ECS maintenance contract. • Other ESDIS-project-sponsored HDF-related work will be phased out near the end of calendar year 2004: • http://hdfeos.gsfc.nasa.gov website updates • “SESDA” HDF data usability task • Coordination, outreach, and test-bed development for HDF integration through the CEOS, OGC, and ISO organizations.

  4. HDF-EOS • A profile, convention, and convenience API for NASA’s Earth Observing System standard data products (see the API sketch below). • Defines structures for Point, Swath, and Grid (plus Atmospheric Profile and Zonal Table). • Defines a specific location for product metadata. • ODL-encoded metadata compliant with FGDC content standards. • Maintained by L-3 Communications under subcontract to Raytheon’s ECS maintenance and development contract. • Next release expected Dec. 2004: • HDF5 1.6.3 • SZIP 1.2 • New inquiry functions • CEA (Cylindrical Equal-Area) grid projection • Improved performance in read/write functions
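
To make the “convenience API” concrete, here is a minimal sketch of defining a Swath object with the HDF-EOS2 C API. The file, swath, dimension, and field names are illustrative only (they are not taken from the presentation), and the exact header names can vary by installation.

```c
/* Minimal sketch (not from the presentation): defining an HDF-EOS2 Swath.
 * Header names can vary by installation. */
#include "hdf.h"          /* HDF4 base definitions (int32, DFACC_*, DFNT_*) */
#include "HdfEosDef.h"    /* HDF-EOS2 Swath/Grid/Point API prototypes */

int main(void)
{
    /* Create an HDF-EOS file and attach a new Swath object to it. */
    int32 fid  = SWopen("example_swath.hdf", DFACC_CREATE);  /* hypothetical file */
    int32 swid = SWcreate(fid, "ExampleSwath");              /* hypothetical name */

    /* Define the dimensions shared by geolocation and data fields. */
    SWdefdim(swid, "Track", 100);
    SWdefdim(swid, "Xtrack", 200);

    /* Geolocation fields: per-pixel latitude and longitude. */
    SWdefgeofield(swid, "Latitude",  "Track,Xtrack", DFNT_FLOAT32, HDFE_NOMERGE);
    SWdefgeofield(swid, "Longitude", "Track,Xtrack", DFNT_FLOAT32, HDFE_NOMERGE);

    /* A science data field defined on the same shared dimensions. */
    SWdefdatafield(swid, "Radiance", "Track,Xtrack", DFNT_FLOAT32, HDFE_NOMERGE);

    /* A real product would now fill the fields with SWwritefield(). */
    SWdetach(swid);
    SWclose(fid);
    return 0;
}
```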

  5. HDF in NASA Earth Remote Sensing • HDF-EOS is the format for EOS Standard Products • Landsat 7 (ETM+) • Terra (CERES, MISR, MODIS, ASTER, MOPITT) • Meteor-3M (SAGE III) • Aqua (AIRS, AMSU-A, AMSR-E, CERES, MODIS) • Aura (MLS, TES, HIRDLS, OMI) • HDF is used by other EOS missions • OrbView-2 (SeaWiFS) • TRMM (CERES, VIRS, TMI, PR) • QuikSCAT (SeaWinds) • EO-1 (Hyperion, ALI) • ICESat (GLAS) • CALIPSO • Over 3 petabytes of EOSDIS archived data

  6. HDF-EOS Lessons • Definition of a set of data structures as a profile is not sufficient to guarantee interoperability. • We also need definition of content, especially metadata; this becomes increasingly difficult the wider the disciplines covered. • See the Aura DSWG standards and netCDF CF as examples (a small illustration follows below). • We also need conformance measures; no spec is so clear that it cannot be misinterpreted. • Even during the life of a mission, there must be allowance for technology refresh. • Technology advances affect user expectations. • This is a well-understood concept for hardware, but traditionally less recognized for science software and data products. • See OAIS (the Open Archival Information System reference model).
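
As an illustration of what a content convention adds beyond structure, the sketch below writes netCDF CF-style attributes (units, standard_name) onto an HDF5 dataset, using HDF5 1.8+ C API names. The file and dataset names are hypothetical; this shows the idea of shared content conventions, not a certified CF writer.

```c
#include <string.h>
#include "hdf5.h"

/* Attach a CF-style string attribute (e.g. units, standard_name). */
static void set_text_attr(hid_t dset, const char *name, const char *value)
{
    hid_t atype = H5Tcopy(H5T_C_S1);
    H5Tset_size(atype, strlen(value) + 1);
    hid_t aspace = H5Screate(H5S_SCALAR);
    hid_t attr = H5Acreate2(dset, name, atype, aspace, H5P_DEFAULT, H5P_DEFAULT);
    H5Awrite(attr, atype, value);
    H5Aclose(attr); H5Sclose(aspace); H5Tclose(atype);
}

int main(void)
{
    /* Hypothetical file and dataset, for illustration only. */
    hid_t file = H5Fcreate("cf_example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hsize_t dims[1] = {10};
    hid_t space = H5Screate_simple(1, dims, NULL);
    hid_t dset = H5Dcreate2(file, "air_temperature", H5T_NATIVE_FLOAT, space,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* Content conventions: names and units every reader agrees on,
     * independent of the file structure itself. */
    set_text_attr(dset, "units", "K");
    set_text_attr(dset, "standard_name", "air_temperature");
    set_text_attr(dset, "long_name", "near-surface air temperature");

    H5Dclose(dset); H5Sclose(space); H5Fclose(file);
    return 0;
}
```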

  7. Discussion Topics Today • Ask the experts • A growing number of software products depend upon the HDF libraries. Are there suggestions for how to better coordinate HDF library releases? • Questions from participants. • HDF-GEO? • At the last workshop, a strong opinion was expressed that there should be some kind of bridge among HDF geographic and geophysical profiles. • Can we develop a better sense of what such an “HDF-GEO” might be? • Is this the list: HDF-EOS, NetCDF API, HDF-NPOESS? • What are reasonable expectations for this effort?

  8. From the ESDSWG meeting last week: Why Use a Standard? • Good documentation • Other projects have reviewed it and found it useful • Reusable software is sometimes available • Potential users can see that the standard and software work • Not management pressure or peer pressure – just more practical

  9. 2004 EOSDIS Satisfaction Survey

  10. 2004 EOSDIS Satisfaction Survey • A measure of customer satisfaction • ESISS and ESSAAC have recommended that NASA focus on measuring the “impact” of our systems and services rather than just the “output” • In 2004, NASA used a comprehensive survey to determine the American Customer Satisfaction Index (ACSI) for EOSDIS products and services. • ACSI provides a normalized measure of customer satisfaction that allows benchmarking against similar companies and industries. • 2004 survey results show that customer satisfaction with EOSDIS compares very favorably with both industry and other government agencies.

  11. Snapshot of the American Customer Satisfaction Index (ACSI) • The #1 national indicator of customer satisfaction today • Compiled by the National Quality Research Center at the University of Michigan using methodology licensed from the Claes Fornell International (CFI) Group • Measures 40 industries and 200 organizations covering 75% of the U.S. economy • Over 70 U.S. Federal Government agencies have used the ACSI to measure more than 120 programs/services • CFI’s methodology quantifiably measures and links satisfaction levels to performance, and prioritizes actions for improvement

  12. Survey Background • The EOSDIS survey was performed by the CFI Group through a contract with the Federal Consulting Group (Department of the Treasury). • Survey questions developed by the DAAC User Services Working Group were tailored to fit the CFI methodology. • ESDIS provided the CFI Group with 33,251 email addresses of users who had used NASA/EOSDIS products. • CFI sent invitations to participate in an online survey to 9,999 randomly selected users. • 1,056 responses were completed. • 1,016 surveys were used in the analysis (250 responses were needed for a statistically meaningful result).

  13. EOSDIS Results • The Customer Satisfaction Index for NASA EOSDIS (aggregate segment) is 75.* • The Customer Satisfaction Index score is derived from customer responses to three questions in the survey: • How satisfied are you overall with the products and services provided by the Data Center? (79) • To what extent have the data, products, and services provided by the Data Center fallen short of or exceeded your expectations? (73) • How well does the Data Center compare with an ideal provider of scientific data, products, and services? (71) • This score is four points higher than the 2003 American Customer Satisfaction Index for the Federal Government overall (71). * The confidence interval for the ACSI is +/-1.1 for the aggregate at the 95% confidence level.

  14. Score Comparison by Current Location (USA, n=478 / outside the USA, n=577; see the aggregation sketch below) • ACSI: 74 / 76 • Customer Support: 88 / 82 • Delivery: 85 / 83 • Product Selection and Order: 72 / 73 • Product Search: 69 / 71 • Product Quality: 67 / 69 • Complaints: 34% / 31%
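
A quick arithmetic check, not part of the original slides: an unweighted mean of the three question scores from slide 13, and the per-segment scores from slide 14 weighted by sample size, both land close to the reported aggregate of 75. The exact ACSI weighting comes from CFI’s structural equation model and is not given in the presentation.

```c
#include <stdio.h>

int main(void)
{
    /* Slide 13: the three CSI inputs (0-100 scale). The real ACSI weights
     * come from a structural equation model and are not published here,
     * so an unweighted mean is shown only to make the aggregation concrete. */
    double q[3] = {79.0, 73.0, 71.0};
    double mean = (q[0] + q[1] + q[2]) / 3.0;
    printf("unweighted mean of the three questions: %.1f (reported: 75)\n", mean);

    /* Slide 14: the per-segment scores, weighted by sample size,
     * should reproduce the aggregate score. */
    double usa = 74.0, outside = 76.0;
    int n_usa = 478, n_out = 577;
    double agg = (usa * n_usa + outside * n_out) / (n_usa + n_out);
    printf("sample-weighted aggregate: %.1f (reported: 75)\n", agg);
    return 0;
}
```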

  15. Customer Support – Score: 84, Impact: 1.0 • CFI considers EOSDIS to be “World Class” in the area of customer support. • Professionalism: 87 • Technical knowledge: 85 • Accuracy of information provided: 85 • Helpfulness in selecting/finding data or products: 84 • Helpfulness in correcting a problem: 83 • Timeliness of response: 82

  16. Product Quality – Score: 68, Impact: 0.9 • In what format were data or products provided? HDF-EOS 49%, HDF 39%, NetCDF 5%, Binary 14%, ASCII 12%, GeoTIFF 19%, Other 7% • Ease of using the data product in the delivered format: 69 • Clarity of data product documentation: 67 • Thoroughness of data product documentation: 68 • Was documentation… delivered with the data: 44%; pointed to (on a website): 41%; not available: 15%

  17. Analysis of Results • Product quality is the lowest-scoring component (68) and has a relatively high impact (0.9). • All attributes in this area received similar ratings. • At 84, customer support scores well and also has a high impact (1.0). • There is a significant difference in customer support ratings given by customers within the U.S. (88) compared to those outside the U.S. (82). • The product search and product selection and order components are highly correlated. • Recent customers are more satisfied, but are also reporting more problems. • The percentage of customer complaints is fairly high (32%) when compared to the federal government overall (12%). • Customers may not be calling to complain about a problem, but rather to seek assistance in solving it. • 90% of respondents who answered the customer complaint questions gave user services’ complaint handling a rating of “6” or above.

  18. CFI’s Recommendations for Improving ACSI • Focus on Product Quality: • Review the type of data product documentation available with each product. Work to improve the clarity and thoroughness of the documentation. • Assess the various data formats and work to improve the usability of each. • Offer a wider variety of data formats. • Review the Product Search and Product Selection and Order scores to determine how best to help customers find the data they need: • Due to high correlation, improvements in one area will likely result in improvements in the other. • Simplify the search process; make data products more apparent. • Improve data product descriptions.

  19. Product Format Ease of Use Comparison

  20. NASA’s Earth Science Data Systems Standards Process

  21. Insights • Interoperability does not require homogeneous systems, but rather coordination at the interfaces. • Management can judge success based upon program goals rather than dictate solutions. • Example: a required degree of interoperability rather than use of a particular data format. • Communities of practice have solutions. • Published practices that demonstrate benefit can grow from… • successful practice in a specific community, to • broader community adoption, to • community-recognized “standards”

  22. The ESDSWG Standards Process • Modeled on the Internet Engineering Task Force “RFC” process and tailored to meet NASA’s circumstances. The standards process: • Registers community practice for NASA • NASA Earth science data management can rely on standards to achieve the highest-priority interoperability. • Encourages consensus within communities • Science investigators are assured that standards contribute to science success in their discipline. • Grows use of common practices among related activities • Discipline communities benefit from the expertise gained by others. • Documents data systems practices for use by external communities • Lowers barriers to entry and use of NASA data.

  23. Standards Process Group Strategy • Adopt standards at the interfaces, appropriate to the given science and drawn from successful practice. • Find specifications with potentially wide appeal • Bring them to the attention of a much broader audience • Monitor use; promote what works well • Result: accelerated evolution and adoption • The preferred source of an RFC is community nomination. • It is also possible to direct creation of an RFC in response to identified needs. • Consequence of endorsement • Future NASA data systems component proposals will be judged partly on how well they interoperate using community-identified practices, or else must justify why departure from community practice has greater benefit.

  24. Three-Step Standards Process • Step 1, Initial Screening: initial review of the RFC; provide RFC submission support; form a TWG and set the schedule. • Step 2, Review of Implementation: community review and input; evaluation and recommendation. • Step 3, Review of Operation: community review and input; evaluation and recommendation. • Maturity advances from RFC to Proposed STD to Draft STD to STD, with community and core review at each stage.

  25. SPG Review and Recommendation • Stakeholders evaluate implementations. • The TWG evaluates implementations and the community response. • The SPG reviews and issues its recommendation.

  26. What’s in the Works • DAP 2 standard – used by many in the oceanographic community and the basis for the DODS and OPeNDAP servers – submitted in June as a “Community Standard” (a client-access sketch follows below). • A “Request for Comments” on implementation experience was distributed October 1; comments are due November 12. • Precipitation community – discussing potential science content standards being used to define Level 2 and Level 3 data. • A self-identified group of precipitation scientists has identified the need and is proposing a draft, now under discussion at IPWG in Monterey. • “The community is establishing de facto standards in this area and that is the best way to deal with this.” • FGDC Vegetation Index standard – in discussion with potential community members.
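
For readers unfamiliar with DAP 2: a DAP-enabled client can read a remote dataset through an ordinary file-access API. The sketch below uses the netCDF C library (when built with DAP support); the server URL and variable name are placeholders, not a real endpoint from the presentation.

```c
/* Minimal sketch (not from the presentation): reading from a DAP 2 server
 * through the netCDF C API. Requires a netCDF library built with DAP
 * support; the URL and variable name below are placeholders. */
#include <stdio.h>
#include <netcdf.h>

int main(void)
{
    int ncid, varid, status;
    size_t start[1] = {0}, count[1] = {10};
    float sst[10];

    /* With DAP support, an OPeNDAP URL opens like a local file name. */
    status = nc_open("http://example.org/opendap/sst.nc", NC_NOWRITE, &ncid);
    if (status != NC_NOERR) {
        fprintf(stderr, "nc_open: %s\n", nc_strerror(status));
        return 1;
    }

    /* From here on, the remote dataset behaves like any netCDF file:
     * read the first 10 values of a (hypothetical) 1-D variable. */
    nc_inq_varid(ncid, "sea_surface_temperature", &varid);
    nc_get_vara_float(ncid, varid, start, count, sst);
    printf("first value: %f\n", sst[0]);

    nc_close(ncid);
    return 0;
}
```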

  27. Ideas from the last ESDSWG • GCMD DIF • GeoTIFF • NetCDF CF • OGC suite

  28. Community Leadership • Strong proposals will have: • Leadership to support and use the standard • Potential for impact • Potential for approval • A simple standard is better • Potential for spillover to other communities • Successful RFCs will have: • At least two implementers • Demonstrated operational benefit • Leadership in generating the RFC • A community willing and able to review

  29. SPG Contacts • Earth Science Data Systems Standards Process Group • http://spg.gsfc.nasa.gov/spg • SPG Chairs: • Richard Ullman richard.ullman@nasa.gov • Ming-Hsiang Tsou mtsou@mail.sdsu.edu
