NASA’s Earth Science Data Systems Standards Process
Earth Science Data Systems Working Group Meetings, 23 October 2007, Philadelphia, PA
SPG, ES-DSWG, Philadelphia
EOSDIS Evolution 2015 Vision Tenets
[Figure: technology diffusion S-curve — adoption rising from 0% to 100% over time]
Standards Diffusion • Chasm between: • early adopters - who take up technology for strategic advantage • early majority - who have a pragmatic focus • One path across the chasm is “community-led”: • successful practice in a specific community • broader community adoption • community-recognized “standards” • Community-led diffusion relies on: • trusted endorsements • strong leadership • a “whole product”
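The S-curve behind the diffusion charts in this deck is commonly modeled as a logistic function. A minimal sketch (the `midpoint` and `rate` parameters are illustrative, not from the slides):

```python
from math import exp

def adoption(t: float, midpoint: float = 5.0, rate: float = 1.0) -> float:
    """Fraction of the community (0..1) that has adopted by time t.

    Logistic curve: near 0 early, 50% at the midpoint, saturating late.
    """
    return 1.0 / (1.0 + exp(-rate * (t - midpoint)))

# Sample the curve at the start, midpoint, and end of the adoption window.
early, mid, late = adoption(0.0), adoption(5.0), adoption(10.0)
```

The “chasm” in Moore’s model sits just before the steep middle of this curve, between the early adopters and the early majority.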
The Request For Comment Process • Modeled after the example of the Internet “IETF RFC”. Proposed standards are documented as specifications submitted by practitioners within the NASA community. • Technical Working Group (TWG) evaluation: • What is an “implementation” of this specification in NASA? • What is “operational” in NASA? • The community is invited via email to comment on the specification, and particularly to address questions formulated by the TWG. Key stakeholders are also solicited. • The TWG reports to the SPG, and the SPG makes recommendations to NASA on the final status of the RFC.
The Three Step Standards Process • Step 1 - Initial Screening (RFC): initial review of the RFC; provide RFC submission support; form TWG; set schedule. • Step 2 - Proposed Standard: community core review of implementation; stakeholders evaluate implementations; TWG evaluates implementations and community response; SPG recommendation. • Step 3 - Draft Standard: community core review of operation; stakeholders evaluate implementations; TWG evaluates implementations and community response; SPG recommendation. • Outcome: Standard, adopted by the community core.
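The three-step track above can be viewed as a simple state machine, with the SPG recommendation gating each transition. A minimal sketch (the stage names come from the slide; the transition logic is an illustrative assumption):

```python
# Stages an RFC passes through on the standards track, in order.
STAGES = ["RFC", "Proposed Standard", "Draft Standard", "Standard"]

def advance(stage: str, spg_recommends: bool) -> str:
    """Move an RFC one step forward if the SPG recommends it; otherwise stay put."""
    i = STAGES.index(stage)
    if spg_recommends and i < len(STAGES) - 1:
        return STAGES[i + 1]
    return stage  # without a recommendation, the RFC does not progress

# An RFC that clears screening and both review phases becomes a Standard.
stage = "RFC"
for _ in range(3):  # initial screening, implementation review, operation review
    stage = advance(stage, spg_recommends=True)
```

Note that in the real process an unfavorable recommendation can also redirect or reject an RFC, not merely stall it; this sketch models only the forward path.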
History: Data Access Protocol (DAP) • The OPeNDAP community is very cohesive. The community leader, the OPeNDAP Group (Peter Cornillon, James Gallagher, Dan Holloway, ...), provided very strong leadership and was successful in “getting out the vote”.
History: OGC Web Map Server • Reviewers wondered why we were asking for a spec review when it was already an international standard. • “Operational” proved difficult to judge by objective criteria: some respondents said they were serving hundreds of users each day, with thousands of accesses to images, but did not consider themselves operational.
History: Hierarchical Data Format • It was difficult to get many technical spec reviews: with only one implementation, few people had experience implementing HDF5 from the spec. • Some returned reviews were about the usability of HDF5, not about the technical spec. • Since users are exposed to HDF5 whenever they get data, should we ask users for a usability or usefulness review?
Process Revisions • Operational readiness vs. significant operational experience • Evidence of implementation • Three types of reviews, all of which can be done simultaneously; some reviewers perform multiple reviews: • Technical specification review • Usefulness-for-purpose review • Operational readiness review
Process Revisions (continued) • All proposed candidate standards will have usefulness-for-purpose and operational readiness reviews. • If the proposed standard has already been adopted by another standards organization, a technical specification review is not needed. • Acknowledge that technical spec reviews for a spec with a single implementation can be very sparse. • Encourage a strong community leader - essential to the process.
The Revised Standards Process • Initial Screening (SPG): initial review of the RFC; provide RFC submission support; form TWG; set schedule. Outcomes: proceed as Proposed Standard, redirect as Technical Note, or Reject. • Community Review: stakeholders evaluate the proposed standard; the TWG evaluates the proposed standard, implementations, and community response, using review questions on the technical specification, operational readiness, and suitability for use. • SPG Recommendation. Outcomes: Recommended Standard, Technical Note, or Reject.
The Endorsement Process • The SPG will send recommended standards to the NASA HQ Program Executive for Data Systems with the following: • Strengths / Weaknesses • Applicability / Limitations • The endorsement is briefed to the HQ Earth Science Steering Committee. • HQ will disseminate the endorsement through NASA CIOs and in a general announcement email to the community. • Required standards will be mandatory for NASA Earth Science programs, projects, and awards. • Recommended standards will be applied at the discretion of program or study managers.
Standards Diffusion • What about other parts of the adoption curve? • Should the standards process address established practices? • Use by the majority. Examples: DIF, HDF4. • Can a standards recommendation be to abandon a practice? • i.e., retire a standard
[Figure: technology diffusion S-curve, repeated — adoption rising from 0% to 100% over time]
Eight RFCs to date
Process Considerations • There may be a certain class of standards for which our Standards Track may be too unwieldy. • This was noted at the July 2007 meetings in Madison. • A Process Tiger Team was formed to discuss and make recommendations.
Standards Track • The purpose of the Standards Track is to produce a list of standards that should be considered for new projects. • Endorsed Standards have been vetted by a review process which included some or all of: • Technical Review - is the standard well documented and implementable? • Usability Review - is the standard easy to use? • Operational Suitability Review - is the standard robust in a NASA setting?
Mature and/or Formal Standards • Mature standards have been in use for a long time. • Formal standards have been produced by a formal process (OGC, ISO, ...). • These may not require a two-phase review process: • Reviews can be difficult to collect. • Potential reviewers don’t understand why the review is needed. • Reviews may require highly specialized knowledge or experience. • The available reviewer pool can be limited (reviewer fatigue).
Potential rubrics
Discussion • Other criteria that might help distinguish among the types? • Relationship of Standards/Established Practice/Technical Note to life-cycle concepts? • Can widespread use be quantified? • What about widespread use outside of NASA?
A Standards Framework • “We all use mental equations to make decisions about technologies that we use. These equations are complex and include multiple terms. I am interested in exploring the equations we use to make decisions about standards.” - Ted Habermann, NOAA National Data Centers • Reference: Knowledge Age Standards: A brief introduction to their dimensions, Yesha Y. Sivan, Tel Aviv University and The Knowledge Infrastructure Laboratory, Ltd.
Sivan’s Framework • Level - users and producers • Purpose - aims, both intended and actual • Effect - pros and cons, benefits and problems, payoffs and tradeoffs • Sponsor - origin of the standard • Stage - the process of making the standard
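When comparing candidate standards, Sivan’s five dimensions could be recorded as a simple data structure. A minimal sketch - the field names mirror the slide, but the example values for HDF5 are illustrative assumptions, not statements from the deck:

```python
from dataclasses import dataclass

@dataclass
class StandardProfile:
    """One candidate standard described along Sivan's five dimensions."""
    name: str
    level: str    # users and producers
    purpose: str  # aims, both intended and actual
    effect: str   # payoffs and tradeoffs
    sponsor: str  # origin of the standard
    stage: str    # the process of making the standard

# Hypothetical profile for illustration only.
hdf5 = StandardProfile(
    name="HDF5",
    level="data producers and tool builders",
    purpose="self-describing storage of large scientific datasets",
    effect="rich data model, but only one reference implementation",
    sponsor="HDF development group / NASA community",
    stage="in use; RFC under SPG review",
)
```

Filling in such profiles for each RFC would make the “mental equations” in Habermann’s framing explicit and comparable across candidates.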
Modified Framework
Meeting Discussion • What does the “stamp” mean? Is there a real difference between Endorsed and Established Practice? • Same result, different path - but can you achieve this? • The “vitality” of a spec is an issue for end users (cf. SourceForge activity for software packages). • ASCII is still a very useful spec, but it has not changed for years. • “Established” means that you can buy into some aspects of what has been done already: • You are importing the prior work into the process. • There is a class of things that are part of NASA’s current standard way of doing things, i.e., the original “Near Term Standards” list. • NASA has been using some things for years; some things are mandated.
Meeting Discussion • Think about whether something raises or lowers the risk of successfully building a system. • Coefficients in the framework may correlate with riskiness. • Understanding the cost model is also important. Consider DIF: switching from DIF to ISO may not cost much, whereas switching from HDF4 to HDF5 could be very costly. Having a tool to automate the switch changes the cost. • Risk, cost, and criteria can differ from user to user, so perhaps the best we can do is make some recommendations but provide the raw data.
Meeting Discussion • Assuming we want to allow for a third class of spec, how do we go about doing it? • We need to make sure we post the S.W.A.L. (Strengths, Weaknesses, Applicability, Limitations) analysis to the web site. • Reservations about the spec can be listed in the Limitations section. • We can develop a set of rubrics, borrowing from Ted Habermann’s talk, to help. (But this could take a long time.) • The SPG simply defines what kinds of reviews are needed, and whether they can be internal or must be external, before establishing the TWG.