SECTION I. INDICATORS • Definitions, categories • Indicators in the project cycle • Designing a system of indicators
References in IPA regulation • Art. 59. SMC • Information reporting obligation to IPA MC on progress of […] indicators • Art. 167. SMC for RD component • Examine results of implementation by reference to result indicators • Art. 169. SAR and SFR • Sectoral reports shall include result indicators
Definition • An indicator can be defined as • the measurement of an objective to be met, • a resource mobilised, • an effect obtained, • a gauge of quality or • a context variable.
Definition • An indicator is made of: • A definition • A value • A unit of measurement • Its measurement requires: • A method (source of data, procedure of data collection) • Allocation of clear responsibilities to personnel
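To make these components concrete, here is a minimal sketch (hypothetical field names, not prescribed by the IPA regulation) of how an indicator record could be represented, with the definition, value, unit, measurement method and responsibility captured explicitly:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One indicator: a definition, a value and a unit of measurement,
    plus the method and responsibility needed to measure it."""
    definition: str            # what the indicator measures
    value: float               # the measured value
    unit: str                  # unit of measurement
    data_source: str           # method: where the data come from
    collection_procedure: str  # method: how the data are collected
    responsible: str           # who is accountable for measurement

# Illustrative example: an output indicator for a training measure
trainees = Indicator(
    definition="Number of successful trainees",
    value=1250,
    unit="persons",
    data_source="course completion records",
    collection_procedure="quarterly extraction from the training database",
    responsible="Operating Body M&E officer",
)
```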
General Criteria of Good indicators • OVI objectively verifiable • SMART • Specific • Measurable • Achievable (acceptable, applicable, appropriate, attainable or agreed upon) • Relevant (reliable, realistic) • Time-bound
General Criteria of Good indicators • CREAM • Clear – precise and unambiguous • Relevant – appropriate to the subject at hand • Economic – available at a reasonable cost • Adequate – provides a sufficient basis to assess performance • Monitorable – amenable to independent validation
General Criteria of Good Indicators Another approach: • Simple, clear and understandable • Useful – # tiles in your office vs # education sessions conducted • Valid – does it measure what it is intended to measure and nothing else • Specific – should measure only the conditions or event under observation • Reliable – should produce the same result when used more than once to measure the same event
General Criteria of Good Indicators • Relevant – related to your work • Sensitive – will it measure changes over time • Operational – should be measurable or quantifiable using definitions and standards • Affordable – should impose reasonable measurement costs • Feasible – should be able to be carried out using the existing data collection system
General Criteria of Good Indicators • Indicators should be expressed in terms of: • Quantity • Quality • Population • Time • For example, an indicator written for the programme objective of “Increasing internet access” might specify: • “Increase from 20% to 60% (quantity) the rate of internet access (quality) in the NUTS II regions of Turkey (population) by January 2014 (time).”
Categorisation of indicators Challenging, since certain indicators can belong to several categories
Types of indicators Related to the scope Context indicators: changes in the country under evaluation, the location and the assistance provided Programme indicators: resources, implementation, results, and impacts of an ongoing activity Evaluation indicators: related to a programme's relevance, coherence, efficiency and effectiveness Monitoring indicators: included in the monitoring system
Context indicators • Quantified information • on the socio-economic and environmental situation • can express identified needs in quantitative terms • Examples: • Economic development: GDP per capita, Direct foreign investment (% of GDP), Debt (% of GDP) • Quality of life and social well-being: Under-5 mortality rate, Life expectancy, Primary education completion rate
Programme indicators • Relate to the effects of the intervention • Measure the extent to which the effects of a programme are expected to change the socio-economic reality or the behaviour of socio-economic actors • Express the quantified objective of the intervention • Examples: • Number of start-ups, number of tourists visiting the beneficiary region, number of jobs generated
Types of indicators Related to their nature Elementary indicators: basic information from which more complex indicators can be derived Derived indicators: ratio or rate, from the relationship between two elementary indicators Compound indicators: combination of several indicators (elementary or derived) Specific indicators: for an intervention, but not for comparisons Generic indicators: for comparing several activities Key indicators: for internal comparisons between different activities of a programme and external comparisons with other programmes Context indicators: for a country, population or a category of the population
Types of indicators And more… Descriptive Management Policy Performance Qualitative Quantitative (specific number, percentage) Etc.
Quantitative indicators • Specific number: number, mean, or median • but the number of successful examinations doesn’t indicate the rate of success • A percentage indicates the rate of performance • but the rate of performance doesn’t indicate the size of the success • Both should be used
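A small worked example (illustrative figures only, assuming a simple examination measure) of why a count and a rate are reported together:

```python
# Two measures with the same success rate but very different scale
passed_a, candidates_a = 40, 50      # measure A: 40 successful examinations out of 50
passed_b, candidates_b = 400, 500    # measure B: 400 successful examinations out of 500

rate_a = passed_a / candidates_a     # 0.80
rate_b = passed_b / candidates_b     # 0.80
print(f"A: {passed_a} passed ({rate_a:.0%}); B: {passed_b} passed ({rate_b:.0%})")
# The rate alone hides the difference in scale, and the count alone hides the
# success rate, so the quantitative indicator set reports both.
```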
Qualitative indicators Imply qualitative assessment ‘Compliance with’, ‘quality of’, ‘extent of’, ‘level of’ Changes in institutional processes, attitudes, beliefs, motives, behaviours or perceptions of individuals Qualitative data is more time-consuming to collect, measure and assess Hard to verify because they involve subjective judgements
The logic we use in implementation • Area of control (internal to the organisation): Inputs (resources) → Activities → Outputs • Area of influence (external to the organisation): Outcomes (results) → Impact
Level of indicators What longer-term improvements are we aiming at? (national goal) = impact What improvements are aimed at by the end of the strategy period? = outcomes, results What strategic programs should be the focus of the national response? = outputs What financial, human, material, and technical resources are needed? = inputs
The logic from project aspect • A public financial intervention – input • € millions • produces some (physical) outputs, which are the direct result of a certain operation • kilometres of a railroad constructed • the beneficiary obtains some advantages – results • reduced travelling time • the intervention will affect not only final beneficiaries, but also the socio-economic environment – impacts • higher GDP, increased quality of life
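A compact way to read the chain above, with one illustrative indicator per level (the railway example; wording and levels are only examples):

```python
# Intervention logic for the railway example, one illustrative indicator per level
logic_chain = {
    "input":  "EUR million of public funds committed",
    "output": "km of railway line constructed",
    "result": "reduction in average travelling time (minutes)",
    "impact": "change in regional GDP and quality-of-life indices",
}
for level, indicator in logic_chain.items():
    print(f"{level:>6}: {indicator}")
```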
Input indicators • Monitor the expenditure of the funds available for any operation, measure or programme • Refer to the budget allocated • Financial indicators: progress in terms of commitment and payment in relation to its eligible cost • or to other technical or human resources elements • Human resources indicator: number of working days used in relation to total input planned
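A minimal sketch of the financial input indicator described above – progress of commitments and payments relative to the eligible cost of a measure (all figures hypothetical):

```python
eligible_cost = 10_000_000   # total eligible cost of the measure, EUR
committed     = 7_500_000    # funds committed to contracted operations, EUR
paid          = 4_200_000    # payments actually made to beneficiaries, EUR

commitment_rate = committed / eligible_cost   # share of eligible cost committed
payment_rate    = paid / eligible_cost        # share of eligible cost paid out
print(f"Commitment rate: {commitment_rate:.0%}, payment rate: {payment_rate:.0%}")
```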
Output indicators Looking forward to WORK Monitor the products of activities Measured in physical units (e.g., kms of railroad constructed, number of firms supported, number of training days delivered, etc.)
Result indicators Looking forward to ACHIEVEMENTS Monitor the direct and immediate effects on direct beneficiaries (target group) How will we know success or achievement when we see it? Are we moving toward achieving our desired outcomes? Changes in behaviour, capacity or performance of beneficiaries Physical (reduction in journey times, number of successful trainees, number of road accidents, etc.) Financial (leverage of private sector resources, decrease in transportation cost, etc.)
Result indicators A core instrument for programme management • Impact indicators – difficult (sometimes impossible) to measure • Output indicators – information only about the physical, not the socio-economic effects of an action → hence the special importance of result indicators • Focus on the design of a high-quality system of indicators • sound analysis of the context • clear definition of the assumed causal chain • baseline • definition of the measurement method and a quantified target
Impact indicators Monitor the long-term effects beyond the immediate effects Direct/Specific: consequences that appear or last in the medium or long term for the direct beneficiaries Indirect/Global: effects on people or actors that are not direct beneficiaries Require statistical data or surveys
Impact indicators An instrument for strategy decisions (not a legal requirement) Decisive role in programming cycle: ex ante quantification of impacts is an instrument for the strategic orientation of a programme during its planning phase; only the impacts of a programme found ex post allow a final judgement on the success or failure some impacts will only be measurable after a certain time of programme implementation, e.g. after 3 or 4 years.
Impact indicators how to obtain their values? Data not necessarily obtainable from monitoring system, only from evaluation Value might be due to factors external to the programme considerable time lag in availability Better to limit to most important priorities Need for a sound explanatory model to define causal chain between programme input/output/results and impact values
An example of indicator-set for Entrepreneurship Development Entrepreneurial Dynamism • focus: capacity to adapt to changing market conditions • creation of new enterprises: % of total no. of enterprises (EU15: 1% – 8%) • enterprise survival rate after 3 years: number of start-ups created in year ‘n-3’ still active in year ‘n’, against the total number of start-ups created in year ‘n-3’ (EU15: 55% – 70%)
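A worked example of the 3-year survival rate defined above (the enterprise counts are purely illustrative):

```python
startups_created_year_n_minus_3 = 1_000   # enterprises created in year n-3
still_active_in_year_n          = 640     # of those, still active in year n

survival_rate_3y = still_active_in_year_n / startups_created_year_n_minus_3
print(f"3-year survival rate: {survival_rate_3y:.0%}")   # 64%, within the quoted EU15 range
```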
An example of indicator-set for Entrepreneurship Development Regulatory and Business Constraints time involved in setting up a company; EU15: 1 – 24 weeks costs involved in setting up a company; EU15: €100 - €2000 business constraints: survey among a representative range of SMEs on lack of skilled labour force; access to finance; technology change; infrastructure
An example of indicator-set for Entrepreneurship Development Capital markets / financial conditions focus: availability of early stage venture capital total venture capital as % of GDP; EU15: 0.2 – 0.3% (USA: 0.4%) no. of Business Angel Networks, no. of deals initiated by BANs
An example of indicator-set for Entrepreneurship Development Enterprise Dynamics • Innovative Capacity • public expenditure on R&D as percentage of GDP • share of innovative SMEs in total (%): introduced new or improved products or processes • Knowledge-based economy • number of SMEs using the Internet for commercial purposes
Use of indicators in the project cycle • IPA (and even more so SF) – comprehensive set of objectives – priorities, measures, etc. – and wide range of actors – COM, ministries, OSs, OBs, beneficiaries • OB • broad set of information concerning the measure • controls physical execution of projects → physical output and financial indicators • OS • less detailed information about specific measures and very little information about individual projects • objective of the programme/priority → result and impact indicators + some output indicators related to OBs’ efficiency • COM • programme and priority level • result and impact indicators • Challenge: select and record relevant data and direct them to the relevant party
Indicators in programming Integration in programming: • Establishment and Management in partnership • Involvement of suppliers and potential users of information • project promoters, beneficiaries • OBs – main suppliers, • OS • monitoring committees, • European Commission, • European Parliament and national parliaments, • external evaluators, • wider public, including civic organizations, • official statistical services. • From temporary working group to monitoring platform
Indicators in programming • Proportionality: • as complex as necessary and as small as possible • impact and result indicators should cover priorities or measures which represent the bulk of expenditure or are of strategic importance • Quality check • system of indicators for coverage, balance, and manageability; • and individual indicators using quality criteria • relevance, • sensitivity, • availability, • costs.
Indicators in programming • Coherence between programme documents • Coherence with indicators of established EC policies • Role of ex ante evaluation • if benchmarks and past experience do not provide a sufficient basis for establishing and quantifying impact indicators • impact indicators – a complex task for programmers • verify the causality between outputs, results and impacts • close cooperation with planners
Indicators in programming • Programme Elaboration: Analytical part, definition of context indicators • Definition of Programme Strategy and Priorities • definition of objectives at the Programme and Priority level • establishment of output, result, impact and core indicators
Indicators in programming • Planning Implementation Arrangements • designing the monitoring system: electronic data processing, quality check of indicators, • designing the evaluation system: planning evaluation, with a description of indicator data needed to evaluate the Programme; • selecting indicators, information on which should be delivered by an evaluation exercise • Establishing rules and conditions for a smooth and efficient cooperation between monitoring and evaluation systems
Indicators in programming Integration of Ex Ante Evaluation • Ex ante evaluation as a parallel process to Programme design: Close co-operation between the evaluators and programme designers as regards the indicator system, monitoring and evaluation arrangements • Examination of the evaluation recommendations and their possible consideration in the design of the Programme
Indicators during implementation Data collection, updating and transferring to users • Task of OS and OBs • Consolidation, improvement and rationalisation of data • Risk of excessive data requirements • OS to check periodically the reliability of the information collected and to provide additional guidance, if needed • Use and improvement of the indicator system is a continuous task – strengthening administrative capacities
Indicators during implementation Annual Reporting on Implementation • Preparation of the selected indicator data and their preliminary interpretation for the Annual Reports • possible linkage between interim evaluation exercise and annual reporting
Indicators during implementation Presenting the data to the monitoring committee • Different knowledge and experiences of MC members • OS should: • Put quantitative information into its qualitative context, • Reduce the volume of information provided, compared to current experience, • Present information in standardised manner, • Undertake some preliminary analysis, highlighting critical information, and • Use appropriate presentation techniques • Turn monitoring findings into concrete actions
Indicators during interim evaluation • Evaluation of the programme performance as regards particular priorities or themes by using indicators as necessary • Review of indicators linked to a possible review of the programme strategy • Review of functioning of the monitoring system (quality of indicators, data collection and their transfer to the users)
Indicators during ex post evaluation • Indicators provided from Monitoring system (output and result) • Use of macro-economic models to evaluate impact
Indicators during evaluation • Indicators are major source of information for evaluations • Indicators are most frequently used to measure effectiveness and efficiency ratios
Designing system of indicators • From input-driven implementation to a results-oriented indicator system • Element of judgement is required in addition to data processing
Designing system of indicators A clear, focused strategy • Limited number of priorities • Understanding of the intervention logic • Measures → priorities → programme • Priorities should make explicit their underlying economic and social rationale • Example • what is the mechanism through which capital grants are supposed to enhance the competitiveness of enterprises? • selection of appropriate indicators: an instrument to clarify the content of measures and priorities • develop indicators within the discussion on the action
Designing system of indicators Baselines • Baseline data: the initial value against which an indicator is measured, e.g. no. of SMEs in the region • Concepts: • Static: a simple statement of a value for an indicator at a certain reference point in the past • number of SMEs active in research in a certain year • number of SMEs active in research supported by the programme in a certain year of the past • Dynamic: the value of the indicator is projected over the programming period (baseline scenario or counterfactual situation) • Use depends on the target area: a region with a developed road network vs a region without major roads
Designing system of indicators Baselines (cont’d) – information sources • official statistics • Problematic • non-availability of data at an appropriate geographical level; • non-availability of data that is sufficiently disaggregated by sector; • delays in the publication of data; • gaps in official statistics in relation to the requirements of the programme (for example no distinction between full-time and part-time workers);
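To make the static vs dynamic distinction from the previous slides concrete, a small sketch with purely illustrative values (the linear trend is just one possible counterfactual assumption):

```python
# Static baseline: the observed value at a reference point in the past
smes_2006 = 2_400            # SMEs active in research in the reference year

# Dynamic baseline: project the indicator over the programming period,
# here with a simple linear trend as the assumed counterfactual
annual_trend = 50            # assumed net growth per year without the programme
baseline_2013 = smes_2006 + annual_trend * (2013 - 2006)   # 2_750

observed_2013 = 3_100        # value actually observed at the end of the period
effect_vs_static  = observed_2013 - smes_2006      # 700: overstates the effect
effect_vs_dynamic = observed_2013 - baseline_2013  # 350: net of the underlying trend
```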
Designing system of indicators Ex ante quantification • quantification of a target for an indicator is a quality check of programming • instruments: use of historic time series and the use of reference or benchmark values • all outputs should be quantified at measure level • in a next step quantification of result indicators for the most important parts of a programme
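A minimal illustration of the ex ante quantification instruments mentioned above – a historic time series plus a benchmark unit cost – with all figures hypothetical:

```python
# Historic output of a comparable past measure (firms supported per year)
history = [110, 120, 135, 140]
avg_per_year = sum(history) / len(history)   # average annual output so far

budget_ratio = 1.5          # new measure has 1.5x the budget of the old one
programming_years = 7       # length of the programming period

# Ex ante target for the output indicator, derived from the time series
target = round(avg_per_year * budget_ratio * programming_years)

# Sanity check against a benchmark unit cost from similar programmes
benchmark_unit_cost = 12_000            # EUR per firm supported
implied_budget = target * benchmark_unit_cost
print(f"Target: {target} firms, implied budget: EUR {implied_budget:,.0f}")
```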
Designing system of indicators Core indicators • Large number of indicators exist in a programme • COM needs a limited number of “core” indicators • Core indicators specific to a programme • describe objectives of each priority in terms of expected results • capture the core elements of the expected changes • explained in a qualitative manner in the programming document • programme monitoring will pay particular attention to their attainment • link with general policy frameworks, such as the Lisbon agenda • Common minimum core indicators • programme indicators are not directly comparable across programmes • physical and financial indicators used to make comparisons or aggregations of data across similar programmes, priorities or measures