Ted Habermann: How do we measure and visualize improvements in NOAA Documentation?

NOAA Documentation Improvement
[Chart: Completeness (Rubric Scores) plotted against Record Count. Collections move from "Less Complete / Less Metadata" (Not So Good) toward "More Complete / More Metadata" (Good).]
NGDC Solar Metadata History
Late in 2011 Bill Denig, the Chief of the Solar and Terrestrial Physics Division at NGDC, decided that, in order to understand this "metadata thing", he had to actually work on some metadata. He started with one record on his desktop, using Oxygen (the dreaded XML editor) and the NGDC rubric, and achieved a very high score. He extended that experience to several other records, keeping scores high across a small number of records. As Bill's experience and confidence grew, two months later he extended his collection to include records translated from FGDC to ISO. This increased the number of records but decreased the scores (FGDC translations generally yield scores between 16 and 20). Since then Bill has improved existing records and added new, high-quality records, steadily increasing his average score.
Geophysics Components
Components are re-usable pieces of documentation that allow "normalization" of information in metadata collections.
[Chart: rubric score history for the Geophysics collection over one month.]
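ISO metadata can point at shared components instead of repeating them in every record; one common pattern is an xlink:href reference that is resolved at processing time. The sketch below is a minimal illustration of that idea in Python; the resolver, its name, and the assumption that components are served at plain URLs are illustrative, not NGDC's implementation.

```python
# Hypothetical sketch: resolve xlink:href pointers to shared components so a
# single canonical element (e.g., a contact) can be reused across a collection.
import urllib.request
import xml.etree.ElementTree as ET

XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

def resolve_components(record):
    """Replace empty elements carrying xlink:href with the referenced XML."""
    for parent in record.iter():
        for i, child in enumerate(list(parent)):
            href = child.get(XLINK_HREF)
            if href and len(child) == 0:          # pointer only, no inline content
                with urllib.request.urlopen(href) as resp:
                    parent[i] = ET.fromstring(resp.read())  # substitute the component
    return record
```

Editing the shared component once then updates every record that references it, which is the "normalization" the slide describes.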
NOAA Documentation Dashboard
[Dashboard panels:
• Accessibility: % of records with a data access service link; data access service types offered (WMS, WCS, DAP, Esri, WFS, None); historic trend.
• Documentation: metadata completeness scores; metadata dialects used (ISO, FGDC, OBIS, DC, Free Text, None) as a % of records.
Audiences: managers, scientists, and data managers.]
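A minimal sketch of how the accessibility panel's numbers could be computed, assuming each record has already been reduced to a list of online-resource URLs and that service types can be recognized from substrings of those URLs (the keyword table is illustrative).

```python
# Illustrative accessibility metrics: fraction of records with at least one
# data access service link, plus a tally of the service types offered.
from collections import Counter

SERVICE_KEYWORDS = {"wms": "WMS", "wcs": "WCS", "dods": "DAP",
                    "opendap": "DAP", "arcgis": "Esri", "wfs": "WFS"}

def classify(url):
    """Map an online-resource URL to a service type label, or None."""
    lowered = url.lower()
    for key, label in SERVICE_KEYWORDS.items():
        if key in lowered:
            return label
    return None

def accessibility_metrics(records):
    """records: iterable of URL lists, one list of online-resource URLs per record."""
    type_counts, with_service, total = Counter(), 0, 0
    for urls in records:
        total += 1
        types = {t for t in map(classify, urls) if t}
        type_counts.update(types or {"None"})
        with_service += bool(types)
    pct = 100.0 * with_service / total if total else 0.0
    return pct, type_counts
```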
Documentation Metric Services
[Diagram: a Collector harvests records from Web Accessible Folders, calculates metrics, and writes the scores to a database. Reported results include rubric scores, validation failures (bad XML, bad links, bad xLinks), unique contacts, and a component report.]
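Under simplifying assumptions (a known list of file names, SQLite as the scores database, rubric scoring handled separately), the collect-validate-store flow might look like the sketch below; all names are illustrative rather than NGDC's code.

```python
# Sketch of the collector/metrics flow: pull records from a Web Accessible
# Folder, flag records that are not well-formed XML, and keep a score history.
import sqlite3
import urllib.request
import xml.etree.ElementTree as ET

def collect(waf_url, file_names):
    """Yield (name, bytes) for each record; file_names are assumed to be known."""
    for name in file_names:
        with urllib.request.urlopen(f"{waf_url.rstrip('/')}/{name}") as resp:
            yield name, resp.read()

def is_bad_xml(raw):
    """Well-formedness check; unparsable records would get a score of zero."""
    try:
        ET.fromstring(raw)
        return False
    except ET.ParseError:
        return True

def store_scores(rows, db_path="scores.db"):
    """rows: (record_name, bad_xml, score) tuples. History enables trend plots."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS scores ("
                "record TEXT, harvested TEXT DEFAULT CURRENT_TIMESTAMP, "
                "bad_xml INTEGER, score REAL)")
    con.executemany("INSERT INTO scores (record, bad_xml, score) VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()
```

Timestamping every harvest is what makes score-history views, like the solar collection trend above, possible.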
Collection Characteristics
Goal: improve metadata when you revise it.
[Chart: completeness (FGDC, Discovery, Understanding levels) plotted against revision date, from legacy metadata to recent records. Annotations: most metadata are old; work happens in sporadic periods.]
Line Office / Program Process
Participants: data collectors/providers, standards experts, data stewards, and data users.
• 1. Identify Expertise: form data stewardship teams.
• 2. Assess: initial evaluation of discovery, use, and understanding for collections, datasets, and services.
• 3. Create and Improve: spirals, wiki, and rubrics, with on-going evaluation, consultation/guidance, and use cases/needs.
• 4. Publish/Preserve: catalogs and WAFs, with on-going input.
Documentation Capabilities
• WAF
  • Unified access to metadata with multiple views
  • Automated custom processing and metadata quality checks
  • Translation from many dialects to a consistent international standard
  • Harvest targets for Geospatial One Stop and data.gov (a harvesting sketch follows this list)
• Spiral Tracker
  • Consistent rubric score calculation
  • Score distributions help identify improvement steps
  • Database provides history and access to scores / records
  • Supports stewardship teams and managers
• Wiki
  • Provides community guidance, examples, successes
  • Information available and shared with national and international partners
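WAFs are typically served as plain directory listings, so a harvester can discover records by scraping the links on the index page. The sketch below assumes exactly that layout (an HTML index whose anchors point at .xml files); it is an illustration, not the harvester NOAA actually runs.

```python
# Minimal WAF harvesting sketch: list the metadata records exposed by a
# Web Accessible Folder served as an HTML directory listing.
import urllib.request
from html.parser import HTMLParser

class XmlLinkParser(HTMLParser):
    """Collect hrefs that point at .xml files in a directory index page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.lower().endswith(".xml"):
                self.links.append(href)

def list_waf_records(waf_url):
    """Return full URLs for every .xml record linked from the WAF index."""
    with urllib.request.urlopen(waf_url) as resp:
        parser = XmlLinkParser()
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    return [f"{waf_url.rstrip('/')}/{link}" for link in parser.links]
```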
Required NOAA-Wide Support
NOAA Line/Staff Offices and Programs must build on a strong foundation of support across all of NOAA:
• Develop and implement common metadata management tools
• Use rubrics to establish a baseline and monitor progress
• Promote and highlight good examples
• Support training specifically targeted at improving NOAA's data documentation
• Initiate teams to work on "special documentation problems" that cross Line and Staff Offices
• Encourage and support participation in ISO and the Open Geospatial Consortium (OGC)
NOAA Documentation Improvement Components
[Diagram: layers ranging from homogeneous to heterogeneous.]
• Directive (homogeneous): vision and goals, identify standards, describe common nomenclature, identify responsibilities.
• Tools: efficiently provide consistent guidance, implementations, evaluations, and metadata management capabilities for all Line Offices and Programs.
• Plans (heterogeneous): community guidance, training, examples, best practices, broad input, and Line Office and Program plans.
Metric Calculation System
[Diagram: metadata sources feed the Collector, which produces record scores.]
• Sources: Web Accessible Folders, THREDDS, desktop editors (METAVIST, CatMDEdit, ArcCatalog, XMLSpy, Oxygen), and web tools (MERMaid, inPort, NMMR, Geonetwork).
• Scoring: record scores computed with the existing XSLT (NCDDC) and with rubric XSLTs (TBD) for FGDC, ISO, and THREDDS records.
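Since the rubric scoring is XSLT-based, applying it programmatically is a single transform call. The sketch below uses lxml (a third-party package) and an assumed stylesheet file name; the actual rubric stylesheets and their output format are not shown here.

```python
# Hypothetical sketch: apply a rubric XSLT to a metadata record with lxml.
from lxml import etree

def score_record(record_path, rubric_xslt_path="iso_rubric.xsl"):
    """Return the rubric stylesheet's output (e.g., a score report) as text."""
    record = etree.parse(record_path)
    transform = etree.XSLT(etree.parse(rubric_xslt_path))
    return str(transform(record))
```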
Current Test Cases
[Diagram: metadata sources feeding the Collector.]
• FGDC / inPort: CoRIS (1642 records), NMFS (~400 records), NOS (737 records).
• ISO / NcML / DIF: NESDIS/OAR (1521 records), UAF (552 records), GOSIC (~400 records).
Spiral Development / Training
[Diagram: iterative cycle.]
• Metadata content is independent of the standard.
• Spiral 1: initial content, followed by standard guidance / implementation.
• Spirals 2-N: scientific questions, new requirements, new use cases.
• Each spiral checks back with data collectors/providers and with users.
Spiral Development / Training: Potential Spirals
• Discovery
  • Identification: Id, Title, Abstract, Resource Date, Topic Category, Theme Keyword, Metadata Contact, Science Contact
  • Extent: Geospatial Bounding Box, Temporal Start/End, Vertical Min/Max, Place Keywords, Text Searches
• Understanding
  • Purpose, Extent Description, Lineage Statement, Project Keywords
  • Content Information: Attribute Type, Attribute Names, Attribute Definitions, Attribute Units
• Connection
  • OnlineResource: Linkage (URL), Name, Description, Function
• Distribution
  • Distributor Contact, Online Resource, Distribution Format, Data Center Keywords, Browse Graphic
• Acquisition Information
  • Instrument, Platform, Instrument Keywords, Platform Keywords
• Quality/Lineage
  • Sources, Process Steps, Quality Reports / Coverages
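To make the spiral idea concrete, a completeness check for a handful of the Discovery elements above can be expressed as XPath presence tests against an ISO 19139 record. The sketch below is illustrative: the element paths are simplified approximations, and the scoring (one point per element found) is not the actual NGDC rubric.

```python
# Illustrative Discovery-spiral presence check for an ISO 19139 record.
import xml.etree.ElementTree as ET

NS = {"gmd": "http://www.isotc211.org/2005/gmd",
      "gco": "http://www.isotc211.org/2005/gco"}

DISCOVERY_CHECKS = {
    "Title":            ".//gmd:title/gco:CharacterString",
    "Abstract":         ".//gmd:abstract/gco:CharacterString",
    "Topic Category":   ".//gmd:topicCategory",
    "Bounding Box":     ".//gmd:EX_GeographicBoundingBox",
    "Temporal Extent":  ".//gmd:EX_TemporalExtent",
    "Metadata Contact": ".//gmd:contact",
}

def discovery_score(record_path):
    """Return (score, per-element presence) for the checks above."""
    root = ET.parse(record_path).getroot()
    present = {name: root.find(xpath, NS) is not None
               for name, xpath in DISCOVERY_CHECKS.items()}
    return sum(present.values()), present
```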
Spiral Tracker
[Screenshot: the Spiral Tracker showing specific records, the source WAF, score history, and score distribution.]
http://www.ngdc.noaa.gov/idb/struts/results?t=103068&s=20&d=25