
Applying performance-based funding to literacy and essential skills


Presentation Transcript


  1. Applying performance-based funding to literacy and essential skills. Boris Palameta, Karen Myers, Natalie Conte. January 16, 2013

  2. The context • Transition to a new economy, in which skills are the new currency • Many jurisdictions moving towards integrated service delivery models along an employment continuum • Interest in re-aligning incentives to improve outcomes for job-seekers, employers, and taxpayers • Can performance-based funding (PBF) drive system-wide change? • What can we learn from the experiences of other jurisdictions? Today’s presentation draws on a State of Knowledge Review that was conducted in partnership with Workplace Education Manitoba and funded by HRSDC’s Office of Literacy and Essential Skills

  3. Our approach • Literature review – Review of evidence on various PBF models (Canada, US, UK, Australia) • Expert review – Interviews with PBF experts in other jurisdictions • Consultations – Consultations with practitioners and government officials in Manitoba and Nova Scotia • Expert panel – Canadian Economics Association conference, June 2012 Throughout the process our approach was guided by input from the project reference group, which comprised officials from MB & NS.

  4. Introduction What is performance-based funding? • A tool for allocating resources to service providers based on measurable performance targets • Shifts the focus from inputs to outcomes • The assumption is that this shift will drive innovation in service delivery and achieve desired long-term outcomes

  5. State of knowledge Key findings • Design matters – PBF systems are complex and vary widely in design and effectiveness • Better design can mitigate risk – PBF risks generating unintended consequences, but ‘second-generation’ designs are more successful in mitigating these risks • Promising approaches – Establish meaningful links between practice and performance by paying for client progress along employment and learning pathways, using intermediate outcome milestones (“tipping points”) as performance indicators

  6. State of knowledge PBF systems are complex and vary widely
  SYSTEM GOALS • Policy objectives – examples: “work first” (job placement); human capital development; poverty reduction; productivity • Target population – examples: employment status; income status; work readiness; human capital; demographic • Outcomes of interest – process; client outcomes (immediate, short-term, longer-term) • Performance indicators
  DESIGN OF INCENTIVE SYSTEM • Type of incentive – financial; non-financial (e.g. star ratings) • Scale of risk – % service-based payments vs. % outcome-based payments • Performance targets – benchmark attainment (x$ if x% of clients achieve outcome A); payment per outcome (y$ per each client achieving outcome A) • Payment weighting – by outcome; by client characteristics; by speed of placement • Adjustment for factors outside provider control – by local economic conditions; by client characteristics
  PROCUREMENT MODEL • Competition for incentive – ranging from less competitive (non-market) to open competition (quasi-market) • Payment based on absolute performance vs. payment based on relative performance
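  To make the two performance-target rules above concrete, here is a minimal illustrative sketch (not part of the original presentation); the dollar amounts, the 60% benchmark, and the treatment of “outcome A” as a simple binary flag are all hypothetical assumptions.

```python
# Minimal sketch, not from the source deck: the two payment rules named
# on the slide, with hypothetical dollar figures and a made-up caseload.

def benchmark_attainment(outcomes, threshold=0.60, lump_sum=50_000):
    """Pay x$ only if at least x% of clients achieve outcome A."""
    share = sum(outcomes) / len(outcomes)
    return lump_sum if share >= threshold else 0

def payment_per_outcome(outcomes, rate=1_500):
    """Pay y$ for each client who achieves outcome A."""
    return rate * sum(outcomes)

# 1 = client achieved outcome A (e.g. a job placement), 0 = did not
caseload = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]

print(benchmark_attainment(caseload))  # 50000 (7/10 = 70% >= 60%)
print(payment_per_outcome(caseload))   # 10500 (7 clients x $1,500)
```

  The design difference matters: benchmark attainment is all-or-nothing around a threshold, while payment per outcome scales with each additional client served successfully.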

  7. Three contrasting models

  8. State of knowledge 2. Better design and monitoring can mitigate risk • Even small amounts of PBF may change behaviour, but not all changes are in the desired direction • Early models particularly fraught with unintended consequences (cream-skimming, parking, gaming) • Second generation models are more promising with built-in features that aim to avoid these pitfalls • Key to mitigating risk is not only careful system design, but also commitment to continuous improvement

  9. Performance measures • Choice of measures crucial in determining incentive architecture • Poorly chosen performance measures may create conflicting incentives – obtaining performance payments vs. serving clients • Performance measures have often been: • 1) Outside provider control; i.e. based entirely on outcomes that happen after clients leave the program • No clear connection between the services providers offer and the outcomes they are paid for • 2) Based on attainment of levels rather than gains from a starting point • Incentives to pick “winners”

  10. Performance measures (cont’d) • Performance measures have often been: • 3) Poor proxies for quality • Program outcomes of interest are often long-delayed • Performance measures typically use short-term proxies (e.g. employment at 13 weeks) for outcomes of interest (e.g. longer-term employment) • But chain of evidence is often lacking • No clear connection between the short-term outcomes providers are paid for and longer-term program impacts • “Hitting the target, missing the point”

  11. Guiding principles for designing PBF systems that work • 1) Use in-program performance measures > In-program measures establish a more immediate and meaningful connection between day-to-day practice and performance • Allow providers to track progress in a timely fashion, understand where and why learners succeed and where they falter, and design interventions to accelerate progress • 2) Measure gains, not levels > Most measures have focused on levels attained by clients at the time performance is assessed. • Need measures that include starting points and magnitudes of improvement to convey information about a provider’s impact on learner achievement.
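  The “gains, not levels” principle can be shown with a small hypothetical example; the two providers, the assessment scale and the scores below are invented purely for illustration.

```python
# Minimal sketch, not from the source deck: why paying on levels rewards
# picking "winners", while paying on gains credits provider impact.
# Scores and the assessment scale are hypothetical.

# (intake score, exit score) for each learner
provider_a = [(210, 230), (200, 225), (190, 220)]   # low intake, large gains
provider_b = [(260, 265), (255, 262), (250, 258)]   # high intake, small gains

def mean_exit_level(learners):
    return sum(post for _, post in learners) / len(learners)

def mean_gain(learners):
    return sum(post - pre for pre, post in learners) / len(learners)

print(mean_exit_level(provider_a), mean_exit_level(provider_b))  # 225.0  ~261.7
print(mean_gain(provider_a), mean_gain(provider_b))              # 25.0   ~6.7
```

  A levels-based measure ranks provider B first even though provider A moved its learners much further from their starting points.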

  12. Guiding principles for designing PBF systems that work • 3) Measure what counts > Avoid mission narrowing by ensuring that performance measures recognize the full range of program objectives • PBF changes the cost/benefit calculus and may encourage development of costly but innovative services; on the other hand, what you do not pay for may be left undone • 4) Identify key milestones > Identify intermediate milestones that can be used to track the progress of clients who may enter at different points (e.g. with different levels of skill, employment readiness, etc.) • Select milestones based on points along the pathway where learners stall or struggle – meaningful transitions

  13. Guiding principles for designing PBF systems that work • 5) Monitor system performance > Build a continuous learning process to respond to unplanned behaviour • E.g. ‘teaching to the test’ • 6) ‘Right-size’ incentives > Ensure performance incentives are neither too big nor too small • Too big → risk management rather than innovation • Too small → if the costs of meeting performance targets exceed the performance bonuses, incentives will be ignored
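  As a purely illustrative back-of-the-envelope check on the “right-size” point (the dollar figures are invented, not drawn from the review):

```python
# Minimal sketch, not from the source deck: an incentive only changes
# behaviour if the bonus exceeds the provider's cost of hitting the target.
# All dollar figures are hypothetical.

def incentive_worth_pursuing(bonus, cost_to_meet_target):
    return bonus > cost_to_meet_target

print(incentive_worth_pursuing(bonus=2_000, cost_to_meet_target=5_000))  # False: ignored
print(incentive_worth_pursuing(bonus=8_000, cost_to_meet_target=5_000))  # True
```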

  14. Guiding principles for designing PBF systems that work • 7) Flexible approach to performance targets > Pre-set performance targets are often either too ambitious or not ambitious enough; both can lead to strategic behaviour • A more open-ended approach encourages continuous improvement • E.g. awarding performance dollars according to ‘momentum points’ (i.e. total number of milestones achieved along a learning pathway) • 8) Ensure all targeted clients are served > ‘Level the playing field’ • Design incentives using the principle of equivalent effort, whereby each momentum point should require roughly the same intensity of effort to attain. This recognizes that clients with more barriers may require greater effort to transition between milestones.
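  One way to picture momentum points and equivalent effort working together is the hypothetical sketch below; the milestone names, effort weights and dollar rate are assumptions made up for illustration, not values from the review.

```python
# Minimal sketch, not from the source deck: performance dollars allocated
# by total weighted milestones ("momentum points"), with equivalent-effort
# weights so harder transitions earn proportionally more credit.
# Milestone names, weights and the dollar rate are hypothetical.

EFFORT_WEIGHTS = {
    "completed_intake_assessment": 1.0,
    "gained_one_literacy_level": 2.0,    # assumed harder, so weighted higher
    "completed_work_placement": 1.5,
    "employed_at_13_weeks": 2.5,
}

DOLLARS_PER_POINT = 400  # hypothetical rate

def momentum_points(caseload):
    """Total weighted milestones across all clients, whatever their entry point."""
    return sum(EFFORT_WEIGHTS[m] for milestones in caseload for m in milestones)

def performance_payment(caseload):
    return momentum_points(caseload) * DOLLARS_PER_POINT

# Each inner list holds the milestones one client reached this period.
caseload = [
    ["completed_intake_assessment", "gained_one_literacy_level"],
    ["completed_work_placement", "employed_at_13_weeks"],
    ["completed_intake_assessment"],
]

print(momentum_points(caseload))      # 8.0
print(performance_payment(caseload))  # 3200.0
```

  Because payment tracks total points rather than a pre-set target, providers gain from every additional milestone, including those reached by higher-barrier clients.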

  15. Guiding principles for designing PBF systems that work • 9) Build provider capacity > Providers may lack knowledge or resources to respond effectively to incentives • Limit competition for performance dollars; encourage collaboration to build tools and practices • 10) Link in-program measures to post-program impacts > Use longitudinal research to establish a chain of evidence between intermediate milestones (potential ‘tipping points’) and longer-term, post-program impacts (e.g. employment, earnings, etc.) • Follow up with learners to establish the connection between measured performance and client success in the long term • Use results to refine and improve the performance measurement framework

  16. Washington State Student Achievement Initiative Key transition milestones within a student’s pathway (identified by research as ‘tipping points’). Provides incentives to focus on the full range of skill levels

  17. State of knowledge 3. Key features of promising approaches • Rewards achievement of key milestones – Encourages client progress by rewarding achievement of key milestones that, if reached, are associated with further progress and ultimately long-term labour market success • Focuses on a balanced set of ‘in-program’ measures – This helps providers understand where clients succeed and where they falter, and thus provides the data to drive innovation • Driven by a balance of competition and collaboration – Allocates performance dollars according to the total number of milestones achieved. Thus, while providers have strong incentives to innovate, they are not in competition with each other; indeed, they may be motivated to collaborate to improve outcomes
