PDS4 1B Assessment Results
Reta Beebe, Dan Crichton
February 2011
Introduction • The PDS4 Build 1b assessment was completed on January 15, 2011, by an internal PDS assessment team • Its purpose is to improve the design and implementation of PDS4 and to ensure that PDS has a process for involving its stakeholders and incorporating their comments early • The review was performed on the emerging PDS4 standard, specifically • Concepts Document • Standards Reference • Data Dictionary • Data Providers Handbook • Glossary
1b Reviewers • Atmos – Nancy Chanover • Engineering – Emily Law • Imaging – Patty Garcia; Chris Isbell • Geo – Susie Slavney • Mgmt – Ed Bell; Mike Martin • NAIF – Chuck Acton • PPI – Todd King • Rings – Mitch Gordon • RS – Dick Simpson • SBN – Stephanie McLaughlin
Overall Results • Key PDS4 documents and a few PDS4 examples were reviewed • Reviewers provided • Answers to five key questions • Markups of documents • General comments • Suggestions for improvement • Key issues involved duplication, incompleteness, complexity, inconsistencies/conflicts, and ambiguity
Assessment Questions 1. Do the documents provide sufficient background for the review? If not, how could they be improved? 2. Assess the four fundamental structures. Are they useful? Will they support your needs? Do you have products that you believe will not fit into the structures? 3. Assess the PDS4 core product types. Do they provide an adequate set of baseline templates for constructing new templates and new PDS4 products? What is missing? 4. Assess the structure and layout of the PDS4 product examples. How can they be improved? 5. What overall recommendations do you have for the team? Do you have any suggestions for improvement? Reviewers were also asked for general comments and suggestions.
Question 1 1. Do the documents provide sufficient background for the review? If not, how could they be improved? Results: Many felt that a substantial amount of work and material was provided. Some felt that the organization could be improved. All recognized that there were “gaps” and “holes” that need to be addressed (as cautioned in the exercise). Many suggestions were provided, including suggestions on how to better navigate the document set.
Question 2 2. Assess the four fundamental structures. Are they useful? Will they support your needs? Do you have products that you believe will not fit into the structures? Results: Most (~10 responses) felt that the structures encompassed their data products and would support their needs. It was reemphasized that these are fundamental structures for the capture and preservation of data, not necessarily for data analysis. Numerous suggestions were made and are captured in the detailed results.
Question 3 3. Assess the PDS4 core product types. Do they provide an adequate set of baseline templates for constructing new templates and new PDS4 products? What is missing? Results: • Many (~7) commented that they believed the core product types are adequate. There was some confusion over the term “core product type” and the concept of deriving new product types, which led some to wonder whether their discipline-specific product types should be documented; this is part of the work the DDWG is now performing. One reviewer felt that there were too many product types.
Question 4 4. Assess the structure and layout of the PDS4 product examples. How can they be improved? Results • The examples seemed to help, but many reviewers noticed inconsistencies between the documents and the examples that need to be addressed; the examples must be made consistent across documents in future releases. XML was also a challenge for some, who had to read and view the labels in primitive tools, including plain text editors (see the sketch below). Detailed comments on structuring, naming, etc. were also provided.
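Aside (not part of the assessment materials): one way a reviewer without XML-aware tools could make a label easier to read is to pretty-print it with the Python standard library. The sketch below is illustrative only; the file name "product_example.xml" is hypothetical.

import xml.dom.minidom

def pretty_print_label(path):
    # Parse the XML label and re-serialize it with two-space indentation.
    dom = xml.dom.minidom.parse(path)
    pretty = dom.toprettyxml(indent="  ")
    # toprettyxml inserts blank lines between elements; drop them.
    return "\n".join(line for line in pretty.splitlines() if line.strip())

if __name__ == "__main__":
    print(pretty_print_label("product_example.xml"))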
Question 5 5. What overall recommendations do you have for the team? Do you have any suggestions for improvement? Results • Significant feedback was provided; it has been captured and is being reviewed by the DDWG. • Substantial comments were also provided directly in the documents themselves. • Key suggestions include improved coordination of the documents to address inconsistencies, conflicts, and organization. • Many recognized that a) significant good work has been done and b) significant work remains.
Summary of Findings from the Comments* • Clarification/Ambiguity (62) • Completeness/Incomplete (49) • Complexity (33) • Kudos (31) • Consistency/Conflict (20) • Omission/Missing Items (16) • Duplication (9) • Bugs/Errors (7) • Examples (4) • Focus (2) • Format (7) • Organization (4) * From the Hughes and Simpson rollup
Actions • DDWG face-to-face (F2F) meeting held February 8-10, 2011 • Addressed and discussed the assessment 1b results • Planned next steps for the DDWG • The document editors are meeting regularly to coordinate the documents • Address the 1b findings • Prepare for the 1c release
PDS4 Assessment/Input Process (build and review timeline) • Build 1a (Oct 2010): DDWG Review • Build 1b (Dec 2010): PDS Internal Review • Build 1c (Mar 2011): Internal/External PDS Review • Build 1d (Aug 2011): External Implementers Review (IPDA/Missions/Experts) • Build 2 (Oct 2011): Operational Readiness, leading to the PDS4 Release
References • Results are posted to the Build 1b Assessment site • http://pds-engineering.jpl.nasa.gov/index.cfm?pid=145&cid=164 • Results include • Answers to assessment questions • Rollup of extracted issues from Simpson • Rollup of extracted issues from Hughes • PPT summary