Discussion on an HDF-GEO concept HDF Workshop X 30 November 2006
Abstract • At the past several HDF & HDF-EOS Workshops, there has been some informal discussion of building on the success of HDF-EOS to design a new profile, tentatively called HDF-GEO. • This profile would incorporate lessons learned from Earth science, Earth applications, and Earth model data systems. • It would encompass all types of data, data descriptions, and other metadata. It might support 1-, 2-, and 3-D spatial data as well as time series; and it would include raw, calibrated, and analyzed data sets. • It would support data exchange by building its needed complexity on top of minimal specialized features; and by providing clear mechanisms and requirements for all types of appropriate metadata. • The organizers propose to host a discussion among the workshop participants on the need, scope, and direction for HDF-GEO.
Which buzzwords would fit? (word cloud around HDF-GEO) • geo ref • profile • data model • naming rules • best practices • …ilities • self-documentation • metadata content • data levels • atomic & compound types • tools • test suites • markup & schema • platform support
Questions for discussion by Earth science practitioners - Bottom-up analysis: • What are the successful features of existing community data formats and conventions? (HDF5, HDF-EOS, netCDF, CDF, GRIB, BUFR, COARDS, CF-1, NITF, FITS, FGDC RS extensions, ISO, GeoTIFF, ...) • Progress being made -- John Caron's Common Data Model • Specific needs for geo- and time-referenced data conventions • Specific needs to support observed (raw), calibrated, and analyzed data sets
Questions for discussion by Earth science practitioners - Top-down analysis: (1 of 2) • What is a profile? Consider the specifics of how a standard or group of standards is implemented for a related set of uses and applications. • How does a profile relate to a format or other elements of a standard? • What constitutes overkill? How much profile would be beneficial, and how much would be difficult to implement and of limited utility?
Questions for discussion by Earth science practitioners - Top-down analysis: (2 of 2) • Why is it useful? • Establishes specific meanings for complicated terms or relationships • Establishes common preferred terms for attributes which can be described in multiple ways • Establishes practices which are consistent with portability across operating systems, hardware, or archives • Establishes common expectations and obligations for data stewardship • Clarifies community (and sponsor) long-term expectations, beyond short-term necessity • other ...
Wrap-up • Send your list of provisional HDF-GEO requirements, goals to be achieved, and how HDF-GEO would help, to me at alan@decisioninfo.com
Motivation: • In many instances, application-specific 'profiles', 'conventions', or best practices have shown their utility for users. In particular, profiles have encouraged data exchange within communities of interest. HDF provides minimal guidance for applications. HDF-EOS was a mission-specific profile; it produced both successes and lessons learned. HDF5 for NPOESS is another approach. Is it time for another attempt, benefiting from all the lessons, and targeted at a broader audience?
HDF Lessons from NPOESS & Future Opportunities (excerpt) Alan M. Goldberg <agoldber@mitre.org> HDF Workshop IX, December 2005
Requirements for data products • Deal with complexity • Large data granules (order of Gb) • Complex intrinsic data structure • Advanced sensors produce new challenges • Multi-platform, multi-sensor, long-duration data production • Many data processing levels and product types • Satisfy operational, archival, and field terminal users • Multiple users with heritage traditions
[Diagram: NPOESS end-to-end data flow, space segment to ground. Environmental source components are observed by sensors and other subsystems, which feed A/D conversion, detection, and flux calibration; the space segment then performs filtration, compression, packetization, CCSDS framing (mux, code, frame) & encryption, and transmission. On the ground, the comm receiver delivers raw data to the IDPS, whose data processing and store produce RDR-, TDR-, SDR-, and EDR-level products. Caption: NPOESS products delivered at multiple levels.]
Sensor product types • Swath-oriented multispectral imagery: VIIRS (cross-track whiskbroom), CMIS (conical scan) • Resampled on a uniform grid: imagery EDRs • Slit spectra: OMPS SDRs (cross-track spectra, limb spectra) • Image-array Fourier spectra: CrIS SDR • Directional spectra: SESS energetic particle sensor SDR • Point lists: active fires • 3-D swath-oriented grid: vertical profile EDRs • 2-D map grid: seasonal land products • Abstract byte structures: RDRs • Abstract bit structures: encapsulated ancillary data • Bit planes: quality flags • Associated arrays (w/ stride?): geolocation
NPOESS product design development (Design Process - Experience - Trades & Analyses) • Requirements: - Multi-platform, multi-sensor, long duration data production - Many data processing levels and product types - Satisfy operational, archival, and field terminal users • Intentions: - Use simple, robust standards - Use best practices and experience from previous operational and EOS missions - Provide robust metadata - Maximize commonality among products - Forward-looking, not backward-looking standardization • Constraints: - Processing architecture and optimization - Heritage designs - Contractor style and practices - Budget and schedule • Resources: - HDF5 - FGDC - C&F conventions - Expectation of tools by others • Requirements, intentions, constraints, and resources combine to yield the result.
Observations from development to date • Avoid the temptation to use heritage approaches without reconsideration, but … • Novel concepts need to be tested • Data concepts, profiles, templates, or best practices should be defined before coding begins • Use broad, basic standards to the greatest possible extent • FGDC has flexible definitions, if carefully thought through • Define terms in context; clarity and precision as appropriate • Attempts to predefine data organizations in the past (e.g., HDF-EOS ‘swath’ or HDF4 ‘palette’) have offered limited flexibility. Keep to simple standards which can be built upon and described well. Lesson: be humble • It is a great service to future programs if we capture lessons and evolve the standards • How do we get true estimates of the life-cycle savings for good design?
Thoughts on future features for Earth remote sensing products • Need to more fully integrate product components with HDF features • Formalize the organization of metadata items which establish the data structure • Need mechanism to associate arrays by their independent variables • Formalize the organization of metadata items which establish the data meaning • XML is a potential mechanism – can it be well integrated? • Work needed to understand the advantages and disadvantages • Climate and Forecast (CF) sets a benchmark • Need a mechanism to encapsulate files in native format • Case in which HDF is only used to provide consistent access • Need more investment in testing before committing to a design
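As a concrete illustration of the XML idea above, a "data meaning" metadata record can be built with nothing but the standard library. This is only a sketch: the element and attribute names (`data_meaning`, `variable`, `units`, `long_name`) are invented for the example and are not taken from any HDF-GEO, CF, or FGDC specification.

```python
# Sketch only: one possible shape for XML "data meaning" metadata.
# All element/attribute names here are hypothetical.
import xml.etree.ElementTree as ET

def meaning_record(name, units, long_name):
    """Build one <variable> record describing what a dataset means."""
    rec = ET.Element("variable", attrib={"name": name})
    ET.SubElement(rec, "units").text = units
    ET.SubElement(rec, "long_name").text = long_name
    return rec

root = ET.Element("data_meaning")
root.append(meaning_record("brightness_temp", "K", "Brightness temperature"))
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

A record like this could be stored as a string attribute on the corresponding HDF5 dataset, which is one way the "can it be well integrated?" question might be explored.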
[Diagram: Primary and Associated Arrays. Index Attributes along each of n dimensions organize an n-dimensional dependent variable (entity) array: a Primary Array (e.g., flux, brightness, counts, NDVI) plus Associated Array(s) (e.g., QC, error bars).]
[Diagram: 1-Dimensional Attribute Variables. An Index Attribute organizes a primary independent variable (e.g., UTC time or angle) and associated/additional independent variables (e.g., IET time, angle, or pressure height).]
[Diagram: Multi-Dimensional Attribute Variables. 2-dimensional independent variable array(s), e.g., lat/lon, XYZ, sun alt/az, sat alt/az, or a land mask.] Key concept: Index Attributes organize the primary dependent variables, or entities. The same Index Attributes may be used to organize associated independent variables. Associated independent variables may be used singly (almost always), in pairs (frequently), or in larger combinations.
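The Index Attribute concept in the three figures above can be sketched as a small Python model. All names here (`Entity`, `attach_associated`, `attach_independent`, the example variables) are illustrative only; this is not an API from HDF-EOS or any proposed HDF-GEO library.

```python
# Conceptual model, not a real API: a primary dependent array and its
# associated arrays share a shape; independent variables are organized
# by the same index attributes, singly or in combination.
from dataclasses import dataclass, field

@dataclass
class Entity:
    """An n-dimensional dependent variable with its associations."""
    name: str            # e.g. "brightness"
    shape: tuple         # one extent per index attribute
    index_attrs: tuple   # e.g. ("scan_line", "pixel")
    associated: dict = field(default_factory=dict)   # QC, error bars, ...
    independent: dict = field(default_factory=dict)  # lat/lon, time, ...

    def attach_associated(self, name, shape):
        # Associated arrays (QC, error bars) match the primary array's shape.
        assert shape == self.shape, "associated array must match primary shape"
        self.associated[name] = shape

    def attach_independent(self, name, attrs):
        # Independent variables use a subset of the same index attributes:
        # singly (utc_time over scan_line) or in pairs (lat over both axes).
        assert set(attrs) <= set(self.index_attrs)
        self.independent[name] = attrs

e = Entity("brightness", (1000, 2048), ("scan_line", "pixel"))
e.attach_associated("qc_flags", (1000, 2048))
e.attach_independent("latitude", ("scan_line", "pixel"))
e.attach_independent("utc_time", ("scan_line",))
```

In HDF5 terms, the index attributes would plausibly map onto dataset dimensions, with the associated and independent arrays stored as sibling datasets linked by shared metadata.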
Issues going forward - style • Issues with assuring access and understanding • How will applications know which metadata is present? • Need to define a core set with a default approach • Issues with users • How to make providers and users comfortable with this or any standard • How to communicate the value of: best practices; careful & flexible design; consistency; beauty of simplicity • Ease of use as well as ease of creation • Issues with policy • Helping to meet the letter and intent of the Information Quality Act • Capturing data product design best practices • Flexibility vs. consistency vs. ease-of-use for a purpose
Issues going forward - features • Issues with tools • Tools are needed to create, validate, and exploit the data sets • Understand structure and semantics • Issues with collections • How to implement file and collection metadata, with appropriate pointers forward and backward • How to implement quasi-static collection metadata • Issues with HDF • Processing efficiency (I/O) of compression and compaction • Repeated metadata items with the same <tag> (a fixed but not predetermined number of repeats) are not handled • Archival format
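For the repeated-&lt;tag&gt; limitation noted above, one common workaround is to collapse the repeats into a single array-valued attribute, since an HDF5 attribute name must be unique within an object. A plain-dictionary sketch of that idea (the function name and tag values are mine, not from any spec):

```python
# Workaround sketch: HDF5 attribute names must be unique per object, so
# repeated metadata items with the same tag are merged into one
# array-valued attribute before writing.
def collapse_repeated(items):
    """items: list of (tag, value) pairs; tags may repeat."""
    attrs = {}
    for tag, value in items:
        attrs.setdefault(tag, []).append(value)
    # Single occurrences stay scalar; repeats become arrays.
    return {t: v[0] if len(v) == 1 else v for t, v in attrs.items()}

attrs = collapse_repeated([("InputPointer", "f1.h5"),
                           ("InputPointer", "f2.h5"),
                           ("Version", "1.0")])
# attrs -> {"InputPointer": ["f1.h5", "f2.h5"], "Version": "1.0"}
```

The cost of this workaround is exactly the problem the slide raises: readers must know whether a given tag is scalar or array-valued, which a profile could standardize.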
Possible routes: Should there be an HDF-GEO? • Specify a profile for the use of HDF in Earth science applications: • Generalized point (list), swath (sensor coordinates), grid (georeferenced), abstract (raw), and encapsulated (native) profiles. • Generalized approach to associating georeferencing information with observed information. • Generalized approach to incorporating associated variables with the mission data • Generalized approach to ‘stride’ • Preferred core metadata to assure human and machine readability • Identification metadata in UserBlock • Map appropriate metadata items from HDF native features (e.g., array rank and axis sizes) • Preferred approach to data object associations: arrays-of-structs or structs-of-arrays? • Design guidelines or strict standardization?
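The "identification metadata in UserBlock" item above relies on a real HDF5 feature: the user block is a caller-owned region at the start of the file (size zero or a power of two of at least 512 bytes), placed before the HDF5 data so that non-HDF readers can see it. The stdlib-only sketch below just shows packing and unpacking an identification string into a fixed 512-byte block; the ID string format is invented for the example.

```python
# Sketch of identification metadata packed into an HDF5-style user block.
# The HDF5 user block itself must be 0 or a power of two >= 512 bytes;
# the ID string format below is hypothetical.
BLOCK = 512

def pack_userblock(ident: str) -> bytes:
    """Encode the ID string and pad it with NULs to the block size."""
    data = ident.encode("utf-8")
    assert len(data) < BLOCK, "identification text must fit the block"
    return data.ljust(BLOCK, b"\x00")

def unpack_userblock(block: bytes) -> str:
    """Strip the NUL padding and decode the ID string."""
    return block[:BLOCK].rstrip(b"\x00").decode("utf-8")

blk = pack_userblock("HDF-GEO profile=0.1 producer=EXAMPLE")
assert len(blk) == BLOCK
print(unpack_userblock(blk))
```

A profile could standardize such a block so that field terminals and archives can identify a product without an HDF5 library at all.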