This slide presentation includes some slides, marked with an icon, that carry voice recordings. When the slide show is run, the recorded narrative plays automatically. • Accompanying this slide presentation are electronic versions of all the documents listed in the references. Moreover, hard copies of some documents, including this slideshow, are made available to participants.
The use and abuse of indicators Martin Gustafsson October 2012
Contents • Introduction • The history of indicator-based planning • Concrete examples from the public and private sectors • The polemics surrounding when and how to use indicators • A case study: Our teacher development indicators • Some pointers for best practices in the South African public sector
Introduction • We won’t spend much time defining what an indicator is. The working definition for now is: “A time series of numbers, generally annual, that helps with planning and reporting”. • We also won’t look at classifications of indicators. This is something you have probably been exposed to already. But this session will make reference to different types of indicators, including sectoral and organisational indicators.
…Introduction • This session will be about the selection, calculation, documentation, presentation and general use of indicators. • As far as possible, I’ve made use of the available literature, but the literature on the use of indicators is rather limited. Above all, it tends to be theoretical and largely non-empirical. On the empirical side, it is of course very difficult to draw any causal links between planning techniques and a better society. However, one can examine how planners and the public actually use and perceive indicators. Unfortunately, even this kind of analytical work is difficult to find.
…Introduction • The session is partly about being honest about indicators, in particular about whether the investment we make in this area really makes a difference to the effectiveness of our organisations and to service delivery on the ground. So feel free to air your frustrations! • Group discussion • What are some of your frustrations around the use of indicators in planning and reporting? But also, can you think of examples where you believe indicators add value to our work?
The history of indicator-based planning • Statistics on whole societies to serve the interests of rulers and citizens have existed for centuries. • Governments have played a key role. The word ‘statistics’ in fact derives from the word ‘state’. • Ancient civilizations in Asia, Africa and Europe focussed above all on developing information systems that would tell government how much tax to expect from the empire. This included running population censuses – the earliest censuses were probably those of Ancient Egypt.
...The history of indicator-based planning • The modern tradition of national statistical offices can be traced back to the 1833 establishment of the French national office. This in turn owes much to Napoleon, whose influence on the design of the modern state is immense. • On the private sector side, the analyst Frederick Taylor and the industrialist Henry Ford, both Americans, popularised ‘Taylorism’ and ‘Fordism’ in industrial production processes. This involved measuring the productivity of individual workers and groups of workers, and firing workers or adjusting pay accordingly. These approaches were embraced across the world, even in the Soviet Union.
...The history of indicator-based planning • The Great Depression and the Second World War and their aftermath made governments larger and more influential. Collection of economic and social data within countries increased enormously. • On the economic side, work by Keynes influenced the structure of ‘national accounts’ covering aggregate national income and related issues. • In South Africa, the Office of the Census, established around 1910, was the country’s first national statistics office. Under apartheid, ‘bantustan’ offshoots were created; after 1994 these were merged into the national institution that eventually became Stats SA.
...The history of indicator-based planning • In 1968, the internationally comparable System of National Accounts (SNA) was started. • New public management (NPM) ideas began changing rich-country governments in the 1980s. The focus was on bringing private sector type incentives, performance information and flexibility into public services. Whether NPM has been eclipsed by other philosophies such as e-Governance is a matter for debate, but the influence of NPM has undoubtedly been great.
...The history of indicator-based planning • In developing countries, the drive for NPM was influenced by the logical framework approach (LFA), or the use of matrices with goals, outcomes, activities and indicators (in other words, the format that is very common in our government plans!). LFA has to a large degree been driven by aid donor agencies.
...The history of indicator-based planning • The 1990s saw reporting on social trends strengthened through the introduction of international indicators such as the UNDP’s Human Development Index (HDI) and a variety of education indicators published by the UNESCO Institute for Statistics (UIS).
Concrete examples from the public and private sectors • Group discussion • Look at how social development indicators have been presented in the three documents below, which are deliberately from organisations you are likely not to be familiar with. Think of how indicators have been selected, how numbers have been presented and how the indicators link to the strategy of the organisation. Think about when one is dealing with indicator values and when one is dealing with some other type of statistic.
...Concrete examples from the public and private sectors • ...Group discussion • The Mauritius Education and human resources strategy plan 2008-2020. Focus especially on pp. 63-66. • The 2012-2017 strategic plan of the education department of the Australian state of New South Wales. Focus especially on p. 10. • The Lonmin Sustainable development report, 2011. Focus especially on rows under heading ‘Transformation’ on pp. 13-14.
The polemics surrounding when and how to use indicators • The attempt here is to focus on those debates that are most relevant to the South African public service. • Complaint 1: Indicator frameworks impose a structure that is too limiting and that prevents more creative thinking (including creative quantitative thinking). They become a distraction rather than a facilitating tool.

Rational, linear planning techniques alone … have proven incapable of successfully introducing and sustaining effective and efficient reform. (Inbar, 1996: 16)

A part of this complaint is that impractical indicators which are ‘in fashion’ are imposed on planners:

...there is a tendency to introduce new ‘vogue’ indicators, and such measurement manoeuvres place unnecessary pressure on existing infrastructure, further disengage staff, and deflect focus away from well-established, long-term indicators of performance. (Hailey and Sorgenfrei, 2004: 15)
...The polemics surrounding when and how to use indicators • The worst case scenario is that indicators actually pervert behaviour. This is a very commonly made complaint.

The Soviet Union, without the benefit of market prices, even for goods that in capitalist societies would be described as private goods, has for decades had to rely on production targets and performance indicators to control managers. The potentially disastrous consequences of specifying an incomplete or ambiguous target are exemplified by the infamous 'nail' cartoon, recounted by Mullen ..., in which the factory fulfils its monthly nail quota with a single enormous nail, because the success indicator is specified by weight alone. (Smith, 1990: 67)

Analysis that supplements the indicators themselves is necessary:

A system of indicators should work like a control panel. It facilitates the identification of problems, and allows for their magnitude to be measured. Detailed diagnosis and the search for solutions are done by complementary analysis and research. (Sauvageot, 1997: 17)

Possible response to complaint 1: Don’t let indicators dominate the planning and reporting processes. Perhaps limit the number of indicators. Indicators are just one of many planning elements.
...The polemics surrounding when and how to use indicators • Complaint 2: Indicator frameworks create the illusion that social sectors work in mechanical and easily measurable ways. In developing countries in particular, where data quality is often poor, it is often impossible to make reliable year-on-year comparisons.

Comprehensive knowledge, organized into a coherent framework, gives the appearance of better control, or reduced uncertainty, and a decrease of risks. The development of over-sophisticated, abstract models and plans of innovation arises from this illusion. Modelling is often undertaken to persuade politicians or administrators who demand certainty before allowing planned change and who too often search for perfection rather than improvement. ... [models] are imbued with almost magical quality of truth and objectivity. Consequently, there is pressure to advocate rationality, which may contribute little or nothing to the change process, rather than into the development of human contacts, involvement and commitment which are true underpinnings of innovation. (Inbar, 1996: 91)
...The polemics surrounding when and how to use indicators • Possible response to complaint 2: Indicators create an incentive to improve the quality of data. It is true that, whilst data quality is still poor, indicators are often aspirational rather than truly implementable, and any values must be interpreted with care.
...The polemics surrounding when and how to use indicators • Complaint 3: If the political tradition is to set overly ambitious targets, attempts to forecast the future with very rational indicators inevitably lead to difficult tensions between politicians and implementers and, ultimately, manipulation or marginalisation of actual indicator values.

In the short run, ritualistic plans and innovations have clear social and political benefits. Ritualism may have a tranquillizing effect. (Inbar, 1996: 92)

Critically, goals and targets are often set by those at some distance from those charged with getting targets. Target setters live in different worlds to target getters. Ownership at different levels can be unclear. Delegating responsibility to achieve targets without delegating authority and providing control over necessary resources is often a recipe for frustration and goal displacement. (Lewin, 2011: 5)
...The polemics surrounding when and how to use indicators • It has been argued that implementers may try to avoid exceptional periods of success just to avoid having higher targets imposed on them (a small numerical sketch of this ‘ratchet effect’ follows below).

Difficulties may also arise when performance indicators are used to determine managerial targets .... It is almost inevitable that principals will set their agents targets on the basis of previous levels of performance. That being the case, unless a fairly subtle system of incentives is introduced, managers have a continuous incentive to report modest levels of performance. For to report outstanding performance in one year would raise the principal's expectations about future performance, and therefore extend managers' targets. This phenomenon, known as the ratchet effect, has been endemic to Soviet planning ..., where targets have traditionally been set 'from the achieved level'. (Smith, 1990: 68)

Possible response to complaint 3: Use indicators to educate everyone, including the politicians, about what is possible. Perhaps structure reports in such a way that glaring gaps between actuals and targets are not too obvious, without lying about anything.
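The ratchet effect quoted from Smith (1990) above can be made concrete with a small numerical sketch. The Python code below is purely illustrative: the target-setting rule (next year’s target equals last year’s reported actual plus a fixed increment), the starting target and all the figures are hypothetical assumptions, not taken from any of the cited sources.

```python
# Illustrative sketch of the 'ratchet effect': a hypothetical rule where next
# year's target is simply last year's reported actual plus a fixed increment.

def next_target(last_reported_actual, increment=2.0):
    """Hypothetical target-setting rule: ratchet up from the achieved level."""
    return last_reported_actual + increment

def targets_faced(reported_values, starting_target=60.0):
    """Return the target the manager faces in each year, given what was reported."""
    targets = []
    target = starting_target
    for value in reported_values:
        targets.append(target)
        target = next_target(value)  # targets set 'from the achieved level'
    return targets

# Two managers with identical true performance (62, 70, 71), but different reporting.
honest_reports = [62.0, 70.0, 71.0]    # reports the outstanding second year honestly
cautious_reports = [62.0, 65.0, 68.0]  # deliberately reports modest improvements

print("Honest reporter's targets:  ", targets_faced(honest_reports))    # [60.0, 64.0, 72.0]
print("Cautious reporter's targets:", targets_faced(cautious_reports))  # [60.0, 64.0, 67.0]

# The honest reporter's strong second year ratchets the third-year target up to 72,
# above the 71 actually achieved; the cautious reporter meets every target.
```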
...The polemics surrounding when and how to use indicators • Complaint 4: The aeroplane cockpit approach to indicators says that you build the aircraft first, then fly it whilst reading the dials and indicators. The problem is that it takes years to improve measurement practices and data quality. In reality, one is building the plane as one flies it.

Possible response to complaint 4: Yes, one should accept that initially there will be measurement instability and comparability problems. The important thing is to admit that things will not be perfect at first, but to work towards measurement that is stable and consistent.
...The polemics surrounding when and how to use indicators • Complaint 5: In reality, no-one ever worries about indicator values, so in the end planners just put any values into the cells to comply with the rules.

Possible response to complaint 5: Indicators may become disparaged if they are poorly calculated and presented. Indicators that speak to the concerns of managers, politicians and the public, and that contain believable information, will be appreciated.
A case study: Our teacher development indicators • Group discussion • Look at how teacher development has been dealt with in our own national plan and in one provincial plan. To what extent would the five complaints we have dealt with apply here? How could indicators have been employed better in the two documents? Look not just at the indicators themselves, but also at how they link to the rest of the plan.
...A case study: Our teacher development indicators • ...Group discussion • The Department of Basic Education’s 2012-13 annual performance plan. Look especially at the teacher development indicators on pp. 36-38. • The Mpumalanga Department of Education 2011-12 annual performance plan. Look especially at p. 37.
Some pointers for best practices in the South African public sector • The following slide provides some tentative pointers for dealing with problems that planners still experience even once they have covered the basics of working with indicators.
…Some pointers for best practices in the South African public sector
Selection
• Two points of departure: strategy and available data.
• Try to have a mix of innovation and maintenance indicators.
• Have few indicators (you can put more numbers into the narrative).
Calculation of past values
• The indicator is just the centrepiece – related data analysis must also occur.
• Multiple datasets and slippery values are a reality one can’t just ignore.
Target setting
• What do comparisons across time and space say?
• Don’t torture yourself, it’s politics!
Narrative
• What is just ‘data noise’? What do upward and downward movements mean? (See the sketch after this list.)
• Describe differences between actuals and targets creatively, but honestly.
Documentation
• Keep technical notes, you or someone else will need them later!
• Accept that indicator specifications stabilise over time.
General use
• Get feedback on the reports, if no-one ever cares about your indicators there’s something wrong.
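To make some of the calculation, narrative and target-setting pointers above more concrete, here is a minimal sketch of an indicator laid out as an annual time series of actuals and targets, together with the two quantities a narrative typically comments on: the gap against the target and the year-on-year movement. The indicator name and all figures are invented purely for illustration.

```python
# Hypothetical indicator: annual actuals and targets (all figures invented for illustration).
indicator = {
    "name": "Percentage of teachers receiving at least 20 hours of in-service training",
    "actuals": {2009: 41.2, 2010: 43.0, 2011: 42.5, 2012: 46.1},
    "targets": {2009: 40.0, 2010: 43.0, 2011: 45.0, 2012: 47.0},
}

def annual_summary(ind):
    """For each year, report the gap against the target and the year-on-year change."""
    years = sorted(ind["actuals"])
    rows = []
    for i, year in enumerate(years):
        actual = ind["actuals"][year]
        gap = actual - ind["targets"][year]
        change = actual - ind["actuals"][years[i - 1]] if i > 0 else None
        rows.append((year, actual, gap, change))
    return rows

for year, actual, gap, change in annual_summary(indicator):
    change_text = "n/a" if change is None else f"{change:+.1f}"
    print(f"{year}: actual {actual:.1f}, gap to target {gap:+.1f}, year-on-year change {change_text}")
```

Whether a movement like the 0.5-point dip between 2010 and 2011 in this invented series is just ‘data noise’ or a real change is exactly the kind of judgement that the complementary analysis mentioned by Sauvageot (1997) is meant to support.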
References
All the following sources are in your electronic source pack:
Department of Basic Education (2012). Annual performance plan 2012-2013. Pretoria. Available from: <http://www.education.gov.za> [Accessed October 2012].
Hailey, J. & Sorgenfrei, M. (2004). Measuring success: Issues in performance management. Oxford: INTRAC. Available from: <http://www.intrac.org/data/files/resources/53/OPS-44-Measuring-Success.pdf> [Accessed October 2012].
Inbar, D.E. (1996). Planning for innovation in education. Paris: IIEP. Available from: <http://www.unesco.org> [Accessed December 2006].
Lewin, K.M. (2011). Taking targets to task revisited: How indicators of progress on access to education can mislead. Falmer: University of Sussex. Available from: <http://www.create-rpc.org/pdf_documents/PTA54.pdf> [Accessed July 2011].
Lonmin (2011). Web-based sustainability development report. London. Available from: <https://www.lonmin.com/downloads/pdf/Sustainable_Development/Lonmin_WBR11.pdf> [Accessed October 2012].
Mauritius: Ministry of Education, Culture & Human Resources (2008). Education & human resources strategy plan 2008-2020. Available from: <http://planipolis.iiep.unesco.org> [Accessed December 2009].
Mpumalanga Department of Education (2011). Annual performance plan for 2011/12. Mbombela.
New South Wales Education & Communities (2011). 5 year strategic plan. Sydney. Available from: <https://www.det.nsw.edu.au/media/downloads/about-us/how-we-operate/strategies-and-plans/corporate-plans/fiveyrs-strategic-plan.pdf> [Accessed October 2012].
Sauvageot, C. (1997). Indicators for educational planning: A practical guide. Paris: IIEP. Available from: <http://www.unesco.org> [Accessed December 2006].
Smith, P. (1990). The use of performance indicators in the public sector. Journal of the Royal Statistical Society, 153(1): 53-72.