Presentation Transcript


  1. Experiences with performance management in the United States. Presentation: Towards a more result-oriented Flemish public sector, January 10, 2014. Donald P. Moynihan

  2. Part I: Overview

  3. Overview • US Experience - background • Errors in understanding performance management • Expectations about implementation • The politics of performance management • Lessons: how do we encourage purposeful use?

  4. Background

  5. National government-wide changes • Government Performance and Results Act - GPRA (1993-2010) • Program Assessment Rating Tool (2002-2008) • GPRA Modernization Act (2010-) • State level variations on these models

  6. Doctrinal logic for change

  7. 20 Years of Learning? • Some lessons on how it went • Partly from study of topic • Reflected in some policy changes, especially GPRA Modernization Act

  8. Expectations about implementation

  9. Is the idea of performance management running out of steam? • OECD 2012 survey • Seems to be less use of performance data than in the past • Performance targets not consequential • General sense of disappointment: we have systems in place, but they have not delivered the desired results

  10. Expectations problem • We define performance systems by the benefits we hope will occur (more rational budgeting, more efficient management) • The gap between our aspirations and the observed effects of these rules is usually large, resulting in disappointment • A more grounded and accurate description: performance systems are a set of formal rules that seek to disrupt strongly embedded social routines

  11. Confusion: Adoption vs. implementation • We speak of governments “doing” performance management • What do we mean? • In practice, rules about measuring and disseminating data – adoption, not implementation

  12. Inattention to the use of data • Performance data by itself does not do much • Implementing performance management means using the data • Why focus on performance information use? • It is difficult to connect public actions to outcomes • Performance information use is an intermediate measure of effectiveness • Without it, the good outcomes we want do not happen • There are different types of use

  13. The four types of use • Passive – minimal compliance with procedural requirements • Purposeful – improve key goals and efficiency • Political – advocate for programs • Perverse – behave in ways detrimental to goals (goal displacement and gaming)

  14. Effect of performance reforms • Can observe whether agencies comply with requirements (passive use), but not other types of use • Performance systems encourage passive use, not purposeful use

  15. The Politics of performance management

  16. Apolitical performance reforms? • Performance data associated with neutrality • The assumption: it offers an objective account of the world and will engender consensus • Reduces the role of politics by offering an alternative basis for making arguments • This is part of its political appeal • Has implications for adoption and implementation

  17. Politics of adoption • Elected officials motivated by symbolic values • Primary focus on adopting information reporting requirements, not broader change

  18. Actual pattern of change

  19. One basic reason for confusion • We fail to understand the nature of performance data • We assume data are • Comprehensive • Objective • Indicative of actual performance • Consistently understood • Apt to prompt a consensus

  20. The ambiguity of performance data • Examine the same programs but disagree on the data • Agree on the data but disagree on its meaning • Agree on meaning, but not on next action steps/resources • Not clear how data links to budget decisions

  21. The subjectivity of performance data • Actors will select and interpret performance information consistent with institutional values and purposes • Greater contesting of performance data and less potential for solution seeking in forums featuring actors with competing beliefs

  22. Implications: Political use • Performance data • is socially constructed by individuals subject to personal biases, institutional beliefs, and partisan preferences • has qualities of ambiguity and subjectivity • These qualities make performance management likely to operate as part of the political process, not as an alternative to it

  23. Evidence of advocacy • “Spinning” (Hood 2006) • Claim credit when things go well, deny responsibility when they do not • Advocacy by agents seeks to avoid blame and respond to “negativity bias” • Disproportionate citizen dissatisfaction with missed targets (James 2011) • Political officials pay more attention to high and low performers (Nielsen and Baekgaard 2013) • More bureaucratic explanations of failed performance (Charbonneau and Bellavance 2012)

  24. Stakeholders • Political support for an agency is associated with performance information use (Moynihan and Pandey 2010) • May worry less about blame, have more freedom to experiment • Belief that stakeholders care about performance or performance measures is associated with bureaucratic use (Moynihan and Pandey 2010) • More performance information use when stakeholders are perceived as more influential and more in conflict, and when there is more networking with stakeholders (Askim, Johnsen, and Christophersen 2008; Moynihan and Hawes 2012)

  25. Principal-agent argument • Assumption: performance data is used to reduce the information advantage that agencies have over the center and elected officials • Reality: • Some evidence of partisan biases in implementation • As long as agencies play a role in defining, collecting, and disseminating information, they retain an information asymmetry

  26. An example: Program Assessment Rating Tool (PART) • Bush-era questionnaire used by the Office of Management and Budget to rate programs from ineffective to effective • Four sections: program purpose and design, strategic planning, program management, and program results/accountability • Burden of proof on agencies • Almost all federal programs evaluated

  27. How might politics affect PART implementation? • Ostensibly neutral reforms may serve—or may be seen as serving—political ends: • Partisan reformers may implement reforms differently if programs/agencies are ideologically divergent • Managers of ideologically divergent programs may perceive bias (whether or not a reform effort is biased against their programs)

  28. Was PART political? • Designed as a good-government, politically neutral reform, and qualitative studies do not report overt partisanship, but… • More liberal agencies and programs got lower scores (Gallo and Lewis 2012; Gilmour and Lewis 2006) • PART scores were related to the President’s budget proposals only for liberal programs (Gilmour and Lewis 2006)

  29. Did politics affect response to PART? • Liberal agencies, though smaller, had significantly more PARTs completed • Two types of effort: • Observable: self-reported effort in completing PART – higher for managers in liberal agencies (Lavertu, Lewis, and Moynihan 2013) • Discretionary: performance information use – lower for managers in liberal agencies (Lavertu and Moynihan 2013)

  30. Why would PART impose a greater administrative burden on liberal agencies? • Liberal agencies likely concerned about making their programs look as good as possible, given preference divergence • Potentially greater scrutiny of liberal programs, requiring more costly agency data collection and reporting

  31. Lessons: How do we encourage purposeful use?

  32. When does perverse use occur? • Goal displacement – e.g. cream-skimming • Data manipulation – including outright cheating • Becomes more likely when • Data are self-reported • The task is complex and hard to measure • High-powered incentives are attached to measures • Especially in contracting • Job-training programs, tuition programs • Policymakers have imperfect knowledge of perversity and amend contracts only after problems occur

  33. Next generation performance system? GPRA Modernization Act of 2010 • Quarterly performance reviews • Goal leaders • Chief operating officers/performance improvement officers • High-priority goals • Cross-agency priority goals • For summary, see Moynihan 2013

  34. Continuing challenge: how to make use of performance data • Create learning forums: routine discussions of performance data with supervisors/peers are associated with use (Moynihan and Lavertu 2012) • GPRA Modernization Act: quarterly performance reviews • Not just routines, but also a learning culture that • Tolerates error • Rewards innovation • Brings together multiple perspectives • Gives discretion to users • Tradeoff between learning and accountability • Accountability evokes defensive reactions and gaming

  35. Look for actionable data • You might want to measure everything but you can’t manage everything • Problem with PART – equal attention to all goals • Modernization Act: focus on important targets, areas of opportunity (high priority goals, cross-agency priority goals)

  36. Foster goal clarity • Clear goals increase performance information use (Moynihan and Pandey 2010); this may not be easy if the service has many different aspects • Tension between: • Few enough measures to generate attention • Enough measures to avoid encouraging workers to ignore unmeasured aspects

  37. Appeal to altruism • Appeal to altruistic motivations, not extrinsic rewards (Moynihan, Pandey, and Wright 2012) • Select goals that motivate • Clear line of sight between goals and actions • Celebrate achievement • Connect to beneficiaries

  38. Integrate program evaluation and performance management • Performance data tells you whether a measure moved up or down; evaluations tell you what affects performance • Discussion of evaluations should be incorporated into performance management • Assign evaluation funding for new policies • Example: the Washington State Institute for Public Policy provides meta-analyses of research on different policies and gives return-on-investment estimates to policymakers

  39. Induce leadership commitment • Leadership commitment associated with use (Dull 2009; Moynihan and Lavertu 2012) • How do you create commitment? • Reputation: public commitments and responsibility (high priority goals) • Create leadership positions with oversight for performance (COOs, PIOs, goal leaders) • Select leaders based on ability to manage performance

  40. Conclusion • Welcome your feedback and questions • Performance Information Project: • http://www.lafollette.wisc.edu/publicservice/performance/index.html • dmoynihan@lafollette.wisc.edu

  41. References
Askim, Jostein, Åge Johnsen, and Knut-Andreas Christophersen. 2008. Factors behind organizational learning from benchmarking: Experiences from Norwegian municipal benchmarking networks. Journal of Public Administration Research and Theory 18(2): 297–320.
Charbonneau, Etienne, and François Bellavance. 2012. Blame avoidance in public reporting. Public Performance & Management Review 35(3): 399–421.
Dull, Matthew. 2009. Results-model reform leadership: Questions of credible commitment. Journal of Public Administration Research and Theory 19(2): 255–84.
Gallo, Nick, and David E. Lewis. 2012. The consequences of presidential patronage for federal agency performance. Journal of Public Administration Research and Theory 22(2): 195–217.
Gilmour, John B., and David E. Lewis. 2006. Assessing performance budgeting at OMB: The influence of politics, performance, and program size. Journal of Public Administration Research and Theory 16: 169–86.
Hood, Christopher. 2006. Gaming in targetworld: The targets approach to managing British public services. Public Administration Review 66(4): 515–21.
James, Oliver. 2011. Managing citizens’ expectations of public service performance: Evidence from observation and experimentation in local government. Public Administration 89(4): 1419–35.
Lavertu, Stéphane, and Donald P. Moynihan. 2013. Agency political ideology and reform implementation: Performance management in the Bush administration. Journal of Public Administration Research and Theory.

  42. Lavertu, Stéphane, David E. Lewis, and Donald P. Moynihan. 2013. Government reform, political ideology, and administrative burden: The case of performance management in the Bush administration. Public Administration Review, forthcoming.
Moynihan, Donald P. 2008. The Dynamics of Performance Management. Washington, DC: Georgetown University Press.
Moynihan, Donald P. 2013. The New Federal Performance System: Implementing the GPRA Modernization Act. Washington, DC: IBM Center for the Business of Government.
Moynihan, Donald P., and Daniel Hawes. 2012. Responsiveness to reform values: The influence of the environment on performance information use. Public Administration Review 72(S1): 95–105.
Moynihan, Donald P., and Patricia Ingraham. 2004. Integrative leadership in the public sector: A model of performance-information use. Administration & Society 36(4): 427–53.
Moynihan, Donald P., and Stéphane Lavertu. 2012. Does involvement in performance reforms encourage performance information use? Evaluating GPRA and PART. Public Administration Review 72(4): 592–602.
Moynihan, Donald P., and Sanjay K. Pandey. 2010. The big question for performance management: Why do managers use performance information? Journal of Public Administration Research and Theory 20(4): 849–66.
Moynihan, Donald P., Sanjay K. Pandey, and Bradley E. Wright. 2012. Prosocial values and performance management theory: The link between perceived social impact and performance information use. Governance 25(3): 463–83.
