Message & Manage Performance Measures for Climate Observations
Neil Christerson, James Shambaugh
Planning and Programming Division, CPO/OAR
June 26, 2012
Why should we measure our success?
• The "So What?" factor

How should we measure our success?
• What are the desired outcomes? Are they meaningful?
• What is measurable? What is practical?
• How do we know we made a difference?
• What are our stakeholders' needs?

What can we measure that's SMART?
• Specific
• Measurable (and Meaningful)
• Audience-directed (and Ambitious)
• Realistic (and Relevant)
• Time-bound
Definition of a performance measure: a statement that compares actual outputs and outcomes to planned outputs and outcomes.

A performance measure consists of four parts:
• Performance Indicator
• Unit of Measure
• Baseline
• Target
Anatomy of a Performance Measure
• Performance Indicator (what you will do): Increase lead time on tornado forecast warnings
• Unit of Measure: minutes
• Baseline (year): 12 min (2003)
• Target (year): 15 min (2010)
The baseline and target say how far you will go and how long it will take.

Performance Measure: "Increase lead time on tornado forecast warnings from 12 minutes in 2003 to 15 minutes in 2010."

It is not a performance measure unless it has all of these elements.
An Observations Example
• Performance Indicator (what you will do): Reduce error in the global measurement of sea surface temperature
• Unit of Measure: degrees C
• Baseline (year): 0.66 °C (2003)
• Target (year): 0.30 °C (2018?)

Performance Measure: "Reduce error in the global measurement of sea surface temperature from 0.66 °C in 2003 to 0.30 °C in 2018(?)."
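The four-part structure above maps naturally onto a small data record. A minimal sketch in Python, assuming illustrative field names (the slides define only the four parts, not any implementation):

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    """The four required parts of a performance measure."""
    indicator: str       # what you will do
    unit: str            # unit of measure
    baseline: float      # where you start
    baseline_year: int
    target: float        # how far you will go
    target_year: int     # how long it will take

    def statement(self) -> str:
        """Render the one-sentence performance measure."""
        return (f"{self.indicator} from {self.baseline:g} {self.unit} "
                f"in {self.baseline_year} to {self.target:g} {self.unit} "
                f"in {self.target_year}")

tornado = PerformanceMeasure(
    "Increase lead time on tornado forecast warnings",
    "minutes", 12, 2003, 15, 2010)
print(tornado.statement())
# Increase lead time on tornado forecast warnings from 12 minutes
# in 2003 to 15 minutes in 2010
```

Because every field is required, a record like this cannot be built with a part missing, which is exactly the slide's point: it is not a performance measure unless it has all four elements.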
Types of Metrics (examples on the next slides)
• Input: resources put into the observation system
• Output/Process: products, services, and efficiencies resulting from activities
• Outcome/Impact: an end result, both expected and unexpected, of the customer's use or application of the organization's outputs
• It is good to have a mix of metrics for the different components of the OOS (funders, users, OOPC, IOC, etc.)
Input Metrics Examples
• Number of countries supporting the OOS (e.g., number of countries involved in Argo)
• Aggregate monetary contributions from those countries
• Number of ships and other support vessels required/contributed to maintain the OOS
• Number of scientists involved in observing activities (or a suitable metric regarding capacity building)
Output/Process Metrics
• Percent of obs available in real time
• Percent of obs that pass a QC test
• Mean time for 75% of data to appear on the GTS
• Percent of obs available through the WDC–Oceanography
• Listing of published papers that use the observing systems
• Number of products
• Number of platforms or observations
• Volume of ocean/seas sampled, density of buoys (increased resolution)
• Routine reanalyses generated
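The first few output metrics above are simple aggregations over per-observation records. A sketch under stated assumptions, with a hypothetical record layout and made-up sample values, of how such metrics might be computed:

```python
import math

def pct(part: int, whole: int) -> float:
    """Percentage, guarding against an empty sample."""
    return 100.0 * part / whole if whole else 0.0

# Hypothetical records (names and values illustrative only):
# (hours until the report appeared on the GTS, passed QC?, arrived in real time?)
obs = [
    (1.0, True, True),
    (2.5, True, True),
    (6.0, False, True),
    (30.0, True, False),
]

pct_real_time = pct(sum(1 for _, _, rt in obs if rt), len(obs))  # 75.0
pct_pass_qc = pct(sum(1 for _, qc, _ in obs if qc), len(obs))    # 75.0

# "Time for 75% of data to appear on the GTS", read here as the delay by
# which three quarters of the reports had arrived (empirical 75th percentile).
delays = sorted(d for d, _, _ in obs)
t75 = delays[math.ceil(0.75 * len(delays)) - 1]                  # 6.0 hours
```

The point of the sketch is that output/process metrics are mechanically computable from operational records, which is what makes them practical to report on a routine schedule.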
Outcome/Impact Metrics
• Reduction in error of a climate forecast or analysis product (e.g., an SST map)
• Global ocean heat content (and its uncertainty)
• Global sea-level rise (and its uncertainty)
• AMOC transport estimates
• Number of systems that are considered "sustained" or mature
• Number of new technologies adopted
• Measure of integration of the in-situ/satellite observing system
• Improved modeling
Summary: How Performance Measures Are Useful
Integrating, improving, learning: "managing" your program
• Government accountability, e.g., under GPRA. Important for reporting to NOAA, DOC, OMB, and Congress; evidence of progress.
• Gauge whether the work done is producing the desired outputs and achieving the desired outcomes.
• Early-warning system: evaluate the need for course corrections or other actions, allowing improved execution and program quality.
• Integrate and link to work downstream, e.g., modeling and resource-management decisions: a "system" focus versus an individual-program focus.
Summary: How Performance Measures Are Useful
Promoting, defending, communicating: "messaging" your program
• Tell a coherent story: lead the reader to a logical conclusion through the layout of budgets, activities, performance measures, and desired outcomes (a.k.a. "mapping").
• Improve relevance: articulate WHY what we do is valuable and how it benefits society.
• Important for telling the story and justifying budget requests to NOAA, DOC, OMB, and Congress.
Questions and Discussion
1. What if outcomes or impacts are not measurable? That is, what if we don't have influence over the outcomes or impacts?
2. How can performance measures help you refine your message and relate what you do to things people want and need?