
First Stab at Program Level Quality Metrics



Presentation Transcript


  1. First Stab at Program Level Quality Metrics. MPARWG. Deborah K Smith, DISCOVER MEaSUREs Project, Remote Sensing Systems

  2. Previous Program Level Quality Metrics Dialog
  • Dr Frouin’s Criteria Table
  • Strawman tables in Greg Hunolt’s background notes
  • Telecon in August with the suggestion of answering a series of questions as a means to determine quality. Do the answers to the questions lead to a metric of overall quality?
  • Peter’s presentation this morning on quality feedback from the survey, and his opinions.
  ESDSWG Mtg, New Orleans

  3. Strawman
  • A "straw-man proposal" is a brainstormed simple proposal intended to generate discussion of its disadvantages and to provoke the generation of new and better proposals. Often, a straw man document will be prepared by one or two people prior to kicking off a larger project. In this way, the team can jump-start their discussions with a document that is likely to contain many, but not all, of the key aspects to be discussed. (Wikipedia)

  4. DISCOVER MEaSUREs Perspective
  • A project with heritage: we’ve been producing microwave ocean data products and making them available since 1996 under NASA Pathfinder funding.
  • We have produced, and continue to produce and distribute, many ocean products. Each product is assessed for quality before release to the public.
  • We are currently releasing F16 and F17 SSMIS ocean products. I will use these as an example in this talk.

  5. What Questions Do We Answer Before Distributing Data Products?
  • Is the data set complete?
  • Are any gaps confirmed and documented?
  • Are the data acceptably intercalibrated to previous data?
  • Do the data products look as expected? (Has a human eye checked the data set?)
  • Are overall statistics within the expected range?
  • Are statistics for sub-regions or sub-time frames consistent with expectations and previous data?

  6. What Questions Do We Answer Before Distributing Data Products?
  • Are comparison statistics with “truth” or other data (such as buoy, ship, or model winds) within the expected range?
  • Is the data format consistent with previous data and with what users expect?
  • Are files read correctly by the read routines, and if not, have changes been made?
  • Have we completed or updated the product documentation?

  7. What Questions Do We Answer Before Distributing Data Products?
  • Have we informed users of the file format, processing steps, and algorithm changes/specifics?
  • Is a data validation file produced?
  • Has all web and FTP text been updated?
  • Have images been made, and do the web tools that display them work correctly?
  • Have we described to users the differences to expect?
  • Is the data product similar to any other available product, and if so, how does it compare?

  8. What Questions Do We Answer Before Distributing Data Products?
  • Who have we created this data set for, and will it meet those users’ needs?
  • Do our tools work on the new data products?
  • Are any new tools needed?
  • What advances have occurred since we last asked these questions, and should we change?
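The slides above amount to a yes/no release checklist. One simple way to roll the answers into a single number, the "metric of overall quality" asked about earlier, is a pass fraction. This is a hypothetical sketch with abridged question texts and an assumed scoring rule, not the DISCOVER project's actual procedure:

```python
# Hypothetical sketch: roll a yes/no release checklist into one score.
# Question texts are abridged from the slides; the equal-weight scoring
# rule is an assumption, not the MEaSUREs program's actual metric.

CHECKLIST = [
    "Is the data set complete?",
    "Are any gaps confirmed and documented?",
    "Are the data acceptably intercalibrated to previous data?",
    "Do the data products look as expected?",
    "Are overall statistics within the expected range?",
    "Is the data format consistent with what users expect?",
    "Have we completed or updated the product documentation?",
]

def quality_score(answers):
    """Fraction of checklist questions answered 'yes' (True)."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("one answer per checklist question required")
    return sum(bool(a) for a in answers) / len(CHECKLIST)

answers = [True, True, True, True, True, False, True]
print(f"quality score: {quality_score(answers):.2f}")  # 6 of 7 checks passed
```

An equal-weight fraction is the simplest choice; whether some checks (e.g. intercalibration) should carry more weight is exactly the kind of question the strawman is meant to provoke.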

  9. Important Climate Questions
  • Have we checked the data within the extended time series?
  • Are there any spurious trends in the data?
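A spurious-trend check like the one above can be sketched as an ordinary least-squares slope test on an evenly spaced series: fit a line and flag the product if the slope exceeds an expected physical bound. The threshold, series, and function names below are illustrative assumptions, not values or code from the project:

```python
# Hypothetical sketch of a spurious-trend check: fit a least-squares
# line to an evenly spaced anomaly time series and flag trends larger
# than an expected bound (threshold and data are illustrative only).

def linear_trend(values):
    """Least-squares slope per time step for an evenly spaced series."""
    n = len(values)
    t_mean = (n - 1) / 2
    v_mean = sum(values) / n
    cov = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    var = sum((t - t_mean) ** 2 for t in range(n))
    return cov / var

def flag_spurious_trend(values, max_abs_slope):
    """True if the fitted trend magnitude exceeds the expected bound."""
    return abs(linear_trend(values)) > max_abs_slope

series = [0.1 * t for t in range(24)]  # artificial 24-month drift
print(flag_spurious_trend(series, max_abs_slope=0.05))  # drift exceeds bound
```

In practice a climate record would also need significance testing and removal of the seasonal cycle before fitting, which this sketch omits.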

  10. How to Develop Program Level Quality Metrics?
  • What do the answers to these questions mean?
  • What does the program want to know?
  • Is an external body needed to determine the quality? If so, who? (The program? A DAAC? Other scientists? The general public?)

  11.–13. (Slide content not captured in this transcript.)

  14. ** this can be obtained from user metrics already being collected and from citation metrics

  15. What’s Necessary if We Are Going to Create Quality Metrics?
  • Agreed-upon definitions of all terms
  • An understanding of how questions carry meaning across projects (if any)
  • An understanding of what Martha wants
  • An understanding of the value of this information and how to communicate it
  • Agreement on what the questions will be
