Performance Measurement 101: Part 1 Oregon Public Performance Measure Association (OPPMA) Annual Meeting 8:45 – 11:45 Session with Rita Conrad, Ken Smith and Laura Wipper Willamette University, Salem, OR
Today’s Session • Part 1: The basics (~60 min) • Intro & Brief History • Criteria • Logic models • Part 2: Small groups (~75 min) • Practice • Report-out • Part 3: Making it happen (~60 min) • Three approaches • Dialogue
Late 1800s–1930s • New York Municipal Research Board • Audit of Portland, Oregon • International City Management Association (ICMA) • Herbert Simon, Nobel Prize winner
1970s: Productivity • Optimistic Views • Harry Hatry (1972; 1978) • Negative or Cautionary Views • Quinn (1978) • Burkhead and Hennigan (1978)
1980s – Why Accountants? • Governmental Accounting Standards Board (GASB) • Government Finance Officers Association (GFOA) • Association of Government Accountants (AGA) • Oregon is the best state at performance reporting – by a long shot (Smith, Cheng, Smith, Schiffel 2006 – available online)
Original Oregon Model @ ODOT • Roots in work of Glenn Felix • Productivity matrix • Baseline – or average – levels of performance • Historical best or optimum goals • Relative weights, performance levels and performance index • Efficiency and Effectiveness • Cost vs. Quality • Gainshare pilot at ODOT proves value is in the measures and the conversation
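To make the productivity-matrix idea concrete, here is a minimal sketch of how a weighted performance index might be computed. The measures, baselines, goals, and weights below are hypothetical, and the 0–10 scale with the baseline scored at 3 and the goal at 10 follows a common objectives-matrix convention – not necessarily ODOT's exact formula.

```python
# Minimal sketch of a productivity-matrix performance index.
# All measures, actuals, baselines, goals, and weights are hypothetical.

def score(actual, baseline, goal):
    """Score a measure 0-10: baseline scores 3, goal scores 10, linear in between."""
    if goal == baseline:
        return 10.0
    s = 3 + 7 * (actual - baseline) / (goal - baseline)
    return max(0.0, min(10.0, s))

# (measure, actual, baseline, goal, relative weight)
measures = [
    ("Cost per lane-mile maintained", 950, 1000, 800, 0.4),  # efficiency
    ("Customer satisfaction (%)",      82,   75,  90, 0.4),  # effectiveness
    ("Sick leave hours per FTE",       38,   40,  30, 0.2),  # work life quality
]

# Weighted sum of the per-measure scores yields a single index.
index = sum(w * score(a, b, g) for _, a, b, g, w in measures)
print(f"Performance index: {index:.1f} / 10")
```

Note how "lower is better" measures need no special handling: putting the goal below the baseline flips the direction of the score automatically.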
Striving for Balance
Efficiency • Cost per product, service or result • Labor per product, service or result • Direct hours • Indirect hours • Administrative hours
Effectiveness • Outcome/Goal • What’s Important Along the Way • Customer Satisfaction • Work Life Quality
Work Life Quality • At ODOT • Regularly gauged the capacity of the organization • Sick leave • Safety • Turnover • Employee Survey • Equal Opportunity Employment • Training
Development of Measures • Series of workshops • Involved almost all staff • Aligned activities to purpose, outcomes, goals • Measures were customized by staff • Alignment through Key Result Areas • Some were “roll-ups”
Statewide Rollout • Statewide measurement initiative began in 1991 with 16 agencies – Phase 1 • More phases followed – 100+ agencies • Agency-level measurement facilitated by mentors from previous phases • Expanded upon ODOT model • Efficiency and effectiveness • Tied goal-orientation to Oregon Benchmarks
Some ’90s Success Stories • DMV • Positive experience with a negative index • Ways and Means liked the candid conversation • Adult and Family Services • Moved focus from application processing to family self-sufficiency • Led the nation in welfare reform • Vocational Rehabilitation • Decided federal goal of two months back on the job not high enough to justify cost • Set goal at 18 months • Department of Corrections • Sharper focus on direct recipient of program efforts • 180-degree swing on recidivism
Original Oregon Model Adopted Across the US • Implemented nationwide by US Army Corps of Engineers • One of the first two agencies to act following the federal Government Performance and Results Act of 1993 • At one point, 70 federal agencies were implementing the “Oregon” model following the success of the Corps
Elements of Oregon’s Key Performance Measure System • Tiered, linked system for external reporting • Common language • Logic models • Criteria • Budget
Linking state performance to benchmarks – a timeline
• ’93: National Performance Measurement Movement
• Oregon laws required agency performance measures
• HB 3358 gives performance measurement role to Progress Board
• ’98: Measuring Up by Jonathan Walters
• SB 555 gives Progress Board first-time evaluation function
• LFO and BAM assume more performance measurement and management functions
• ’08: Progress Board prepares Performance Measure Guidelines for State Agency Budget Instructions
• ’09: Progress Board prepares KPM–Benchmark Crosswalks for W&M subcommittees
Oregon Benchmarks and Performance Measures • Benchmarks.oregon.gov
Activity #1 Getting to Know You
Why measure performance? To get results. It’s at the core of results-based management • Fosters internal learning and improvement – Is the ship running well? • Provides greater external accountability – Is the ship on course? • Informs policy-making – Should the ships change course?
CRITERIA What makes a good performance measure? • Common language • Alignment (line-of-sight) • Accurate and reliable data • A few key measures • Time-framed targets • Someone in charge • Comparisons • Customer satisfaction
Criterion #1. Common language – Altitude
• OUTCOME, high-level = Societal result (30,000 ft.)
• OUTCOME, intermediate = Entity result (20,000 ft.)
• OUTPUT = Product or service (10,000 ft.)
• INPUT = Time or money (5,000 ft.)
• EFFICIENCY = Input per output or input per outcome
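The two efficiency ratios are easy to confuse, so here is a quick worked example. The budget and counts are hypothetical, loosely following the juvenile-services example used later in this deck:

```python
# Hypothetical numbers illustrating the two efficiency ratios.
budget = 500_000            # INPUT: program dollars
youth_treated = 250         # OUTPUT: service delivered
youth_risk_reduced = 100    # OUTCOME: entity-level result

print(budget / youth_treated)       # input per output:  $2,000 per youth treated
print(budget / youth_risk_reduced)  # input per outcome: $5,000 per youth with reduced risk
```

The same input looks very different depending on the denominator; picking the wrong altitude for the denominator is a common source of misleading efficiency claims.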
Practical application • 3. High-Level Outcomes (HLO) – Is the world you are attempting to affect changing? Oregon Benchmark #65, Juvenile Recidivism – % of juveniles referred w/in 12 months … • 2. Intermediate Outcomes – Is the strategy producing the desired result? % of treated youth with reduced risk factors • 1. Outputs – Is the strategy getting done? # of youth completing treatment
Criterion #2. Alignment – line-of-sight to goals and high-level outcomes
1 GOAL
2 HLO – Mission or Benchmark
3 KEY PERFORMANCE MEASURES
3A Strategies to achieve the goal
3B Outputs
3C Intermediate Outcomes (Impact)
Criterion #3. Accurate and reliable data • Trustworthy data is essential. Example: verifiable records trump estimates • Per measure – at least one data point, preferably several • Data should match the measure.
Criterion #4. Few key measures – different measures for different purposes
Goal: Prevent Juvenile Crime. How high up do you want to go? (Higher up means more public interest and more agency influence.)
• Output: % of grantees trained INCREASES, “so that”…
• Intermediate Outcome: % of high risk youth completing program INCREASES, “so that”…
• Intermediate Outcome: % of served youth with mitigated risk factors INCREASES, “so that”…
• High-Level Outcome: Juvenile recidivism DECREASES.
Criterion #5. Time-framed targets • TARGET = Desired level at a given point in time • Ambitious but realistic • Set targets using: trend data, comparisons, expert opinion • Example: hours of travel delay per capita per year – 2003 actual: 20; 2007 target: 15
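Setting a target from trend data can be as simple as extrapolating the recent trend and then adjusting for ambition. A minimal sketch follows: the 2003 actual of 20 hours comes from the slide, while the earlier years are made up for illustration.

```python
import numpy as np

# Hours of travel delay per capita per year.
# 2003 is from the slide; earlier years are hypothetical.
years  = np.array([1999, 2000, 2001, 2002, 2003])
actual = np.array([23.5, 22.8, 21.9, 21.0, 20.0])

slope, intercept = np.polyfit(years, actual, 1)  # fit a linear trend
trend_2007 = slope * 2007 + intercept            # where the trend alone would land

print(f"Trend projection for 2007: {trend_2007:.1f} hours")
# With these numbers the trend lands near 16.6 hours, so the slide's
# 2007 target of 15 is "ambitious but realistic": beyond the trend,
# but not wildly beyond it.
```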
Criterion #6. Someone in charge • Management, staff and stakeholders need to know where the buck stops • Critical for improving performance
Criterion #7. Measures of customer satisfaction • This is a required Key Performance Measure (KPM) for Oregon state agencies. • Rationale: Those who pay for their government should be satisfied with their government.
Criterion #8. Comparisons “We resolved complaints within three weeks on average.” Is this good or bad? • How does actual performance compare to industry standards? • Rationale: to better understand whether action is needed. • Difficult but not impossible! • Agency suggestion: “peer” agencies develop common measures and comparators
LOGIC MODEL – a tool for thinking it through (1 Agency Goals → 2 High-level outcomes → 3 Performance Measures)
Getting started:
• IMPORTANT: Engage those whose performance will be measured.
• Have a clear statement of goals.
• “Look up” to mission & high-level outcomes (benchmarks).
• “Look down” to what you are doing and how you are measuring performance.
Logic model example #1 – Citizen involvement in land use planning
1 GOAL: Citizen involvement (C.I.) in land use planning
2 HLO: % of cities with neighborhood organizations
3 Agency Performance Measures:
3A STRATEGY: Jointly sponsor, with cities, regional educational events for private citizens every quarter
3B OUTPUTS: # citizens trained; # C.I. guidelines distributed
3C INTERMEDIATE OUTCOMES (Impact): % participating citizens with improved understanding; customer satisfaction ratings
Logic model example #2 – Juvenile crime
1 GOAL: Reduce juvenile crime
2 HLO: Juvenile Arrests (OBM #61)
3 Agency Performance Measures:
3A STRATEGY: Award grants to local contractors to conduct “best practice” juvenile crime prevention programs (JCP)
3B OUTPUTS: # grants awarded by county; # days of TA delivered by county
3C INTERMEDIATE OUTCOMES (Impact): % of juveniles in JCP programs with significantly mitigated risk factors
Logic model example #3 – Healthy, thriving children
1 GOAL: Healthy, thriving children
2 HLO: % of kindergarteners ready to learn (Benchmark #18)
3 Agency Performance Measures:
3A STRATEGY: Award grants to local contractors to design/deliver “best practice” parent education classes
3B OUTPUTS: # grants awarded by county
3C INTERMEDIATE OUTCOMES (Impact): % of children from participating (trained) families entering school ready to learn
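One informal way to check line-of-sight is to write a logic model down as a small data structure: any measure that will not fit in one of the boxes probably is not aligned. A minimal sketch follows, filled in with example #2's juvenile-crime content; the class and field names are ours, purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    goal: str                      # box 1
    high_level_outcome: str        # box 2 – mission or benchmark
    strategy: str                  # box 3A
    outputs: list[str] = field(default_factory=list)                # box 3B
    intermediate_outcomes: list[str] = field(default_factory=list)  # box 3C

# Example #2 from the slides, expressed as data.
juvenile_crime = LogicModel(
    goal="Reduce juvenile crime",
    high_level_outcome="Juvenile Arrests (OBM #61)",
    strategy="Award grants to local contractors for best-practice JCP programs",
    outputs=["# grants awarded by county", "# days of TA delivered by county"],
    intermediate_outcomes=[
        "% of juveniles in JCP programs with significantly mitigated risk factors",
    ],
)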
PART 2: Small group exercise – 20 min. • Pick an organization from within your group • Pick a goal from that organization • Work through the logic model as a team • Select two key measures – the most important – for box #3 • Select a spokesperson to report out
Performance Measurement 101: Part 3 Oregon Public Performance Measure Association (OPPMA) Annual Meeting 8:45 – 11:45 Session with Rita Conrad, Ken Smith and Laura Wipper Willamette University, Salem, OR
Part 3 – Making it Happen • Performance Management • Integrated across many systems/disciplines • Multiple steps that usually include: Strategy → Reports with data → Discuss/Argue → Learn/Change → Revisit Strategy • No single dominant view – but common tools: Balanced Scorecard, Strategy Maps, LEAN, Six Sigma, TQM, Variance Analysis, Organization Alignment/Incentives, etc. • Numerous barriers, challenges and gaps
Some Models for Measurement: Cell → Organ → Person
Background • Laura has worked at ODOT and other agencies for almost 20 years • Ken has studied performance reporting by cities and states for over 17 years
Why create a “model”? • Our hope is to help participants create their own mental models of: • Where they and their organization stand • What next steps/models they can use • What choices and priorities they should consider as they move forward • The models are not intended to be judgmental – one level is not always better or more appropriate. The costs/benefits of each approach need to be carefully considered; not every model works for every person
Measuring to Managing – a ladder from unconnected measures to an integrated macro system (Approach → Operating Unit → System → Macro)
• Unconnected Measures: quantitative or qualitative only
• Inter-Connected Measures: organization-based; program-based
• Integrated Measures: goal-based; leading & lagging; macro & micro
• Integrated Macro System Measures
A Model for Integrated Measures
DV = IV + IV + IV
Societal Outcomes = Organization’s Measures + Other Orgs’ Measures + Environmental Measures
(DV: Dependent Variables; IV: Independent Variables)
An Example using Education (Operating Unit → System → Macro)
• Operating-unit measures: Test Scores, Attendance, Literacy Strands
• System measures: % At Grade Level, Adequate Progress
• Macro outcome: Literacy Rates
Literacy Rates = % At Grade Level + After School Programs & Parent Volunteers + Demographics & Behavior
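Read literally, the DV = IV + IV + IV slide is a regression: a societal outcome explained jointly by the organization's own measures, other organizations' measures, and the environment. A minimal sketch with fabricated district-level data; the assignment of the education variables to the three IV buckets is our reading of the slide, not an official mapping.

```python
import numpy as np

# Fabricated data: one row per school district.
# IVs: the organization's measure (% at grade level), another org's
# measure (after-school participation %), an environmental measure
# (a demographic index). DV: literacy rate.
X = np.array([
    [62, 20, 0.8],
    [71, 35, 1.1],
    [55, 10, 0.6],
    [80, 40, 1.3],
    [68, 25, 0.9],
])
y = np.array([64, 74, 58, 83, 70])  # literacy rates (DV)

# Ordinary least squares: y = b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coefs)  # intercept plus one weight per independent variable
```

The point of the model is the decomposition, not the fit: it forces the conversation about how much of the societal outcome the organization can actually claim.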
“5 Principles” by Bob Paladino • Similar to many other models – I like his since he uses both government and business examples, and the book is available on Amazon for $25 • I also like his model since it focuses explicitly on the need for: • a Chief Performance Officer (CPO) • using multiple tools (BSC, LEAN, Maps, etc.) • p.s. I also like him because he is a CPA
Five Key Principles • Establish CPM (Corporate Performance Management) Office and Officer • Refresh and Communicate Strategy • Cascade and Manage Strategy • Improve Performance • Manage and Leverage Knowledge