Some perspectives on the importance of policy evaluation
Joost Bollens
HIVA - K.U.Leuven
Evaluation of ALMPs
• Why evaluate ALMPs?
• How to measure effectiveness?
• Some practical issues
• Unanswered questions
ALMPs
• Active Labour Market Policies
• Training for the unemployed;
• Private sector incentive schemes (wage subsidies, start-up grants, …);
• Direct employment programmes;
• Counseling, monitoring, job search assistance, sanctions.
Public expenditure on ALMPs
Why evaluate ALMPs?
• Active policies: beneficial effects
• Strong beliefs
• Is this really the case? Impact evaluation
• Different programmes in one country
• All equally effective?
Why evaluate ALMPs?
• Evidence-based policy: given the evaluation results, decide to:
• Continue the programme
• Expand the programme
• Restructure or redesign the programme
• Abolish the programme
• In the end: a matter of accountability
Evaluation
• Process evaluation
• How is the programme implemented? Management quality? Proper design? Selection processes? …
• Impact evaluation: effectiveness
• Efficiency: cost-effectiveness
• Two equally effective programmes may have a quite different cost per participant (illustration below)
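As a purely illustrative sketch of the cost-effectiveness point (the figures below are invented for this example, not taken from the slides): two programmes with the same net employment effect can still differ sharply in what one additional job costs.

```latex
\[
\text{cost per additional job} \;=\; \frac{\text{cost per participant}}{\text{net employment effect}}
\]
\[
\text{Programme A: } \frac{2\,000\ \text{EUR}}{0.10} = 20\,000\ \text{EUR}
\qquad
\text{Programme B: } \frac{5\,000\ \text{EUR}}{0.10} = 50\,000\ \text{EUR}
\]
```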
Impact evaluation
• Effectiveness: a lot of possible outcomes
• % of participants that find a job, % that leave unemployment, % that find a stable job or stable employment, …
• % that find a decent job, effects on health, psychological effects, effects on well-being
Gross versus net effectiveness
• Observed outcome = effect of programme participation + effect of factors outside the programme
• Therefore, if we observe that, say, 6 months after finishing the programme 60% of the participants have a job, this cannot be attributed entirely to programme participation: even without participating in the programme, some of the unemployed would have found a job within 6 months
Net effectiveness
• In order to find the proper impact of the programme (the "value added", the "net effectiveness", or simply the "impact"), we have to correct the observed gross effect:
Net effect = [Gross effect] − [% of participants that would have found a job even without participating]
• Since participants cannot at the same time be non-participants, the second quantity cannot be observed (the "counterfactual") and must be estimated (worked example below)
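To make the subtraction concrete, here is a minimal worked version of the formula above, using the 60% gross outcome from the previous slide; the 45% counterfactual is a hypothetical figure chosen purely for illustration, not an estimate from the slides.

```latex
\[
\text{Net effect} \;=\;
\underbrace{60\%}_{\text{observed gross effect}}
\;-\;
\underbrace{45\%}_{\text{estimated counterfactual}}
\;=\; 15\ \text{percentage points}
\]
```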
Estimating counterfactuals
• Non-experimental approaches (including quasi-experiments)
• Several, more or less sophisticated approaches
• Basically: compose a comparison group of persons who are comparable to the participants, BUT who did not participate (see the sketch below)
• Potential weakness: comparability is not complete, e.g. due to (self-)selection effects. Example: motivation
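A minimal sketch of the comparison-group idea, with toy data and variables (age band, education) invented for illustration: match participants and non-participants on observable characteristics and compare employment rates within matched cells. Real applications use more sophisticated techniques (e.g. propensity-score matching), and none of them can remove bias from unobserved factors such as motivation.

```python
from collections import defaultdict

# Each record: (participated, employed_six_months_later, age_band, education)
# Toy, hand-made data purely for illustration.
people = [
    (True,  True,  "25-34", "low"),
    (True,  False, "25-34", "low"),
    (True,  True,  "35-44", "high"),
    (False, False, "25-34", "low"),
    (False, True,  "25-34", "low"),
    (False, False, "35-44", "high"),
    (False, True,  "35-44", "high"),
]

def employment_rate(records):
    return sum(employed for _, employed, *_ in records) / len(records)

# Group participants and non-participants into cells with identical observables.
cells = defaultdict(lambda: {"treated": [], "comparison": []})
for record in people:
    participated, _, age, edu = record
    cells[(age, edu)]["treated" if participated else "comparison"].append(record)

# Net-effect estimate: within-cell differences in employment rates,
# averaged with weights equal to the number of participants per cell.
diffs, weights = [], []
for cell in cells.values():
    if cell["treated"] and cell["comparison"]:
        diffs.append(employment_rate(cell["treated"]) - employment_rate(cell["comparison"]))
        weights.append(len(cell["treated"]))

net_effect = sum(d * w for d, w in zip(diffs, weights)) / sum(weights)
print(f"Estimated net effect: {net_effect:+.1%}")
```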
Estimating counterfactuals
• Experimental approaches
• Basically: take the group of persons who are willing to participate in a programme, and randomly assign half of them to an experimental group and half to a control group (see the sketch below)
• The experimental group is allowed to participate, the control group is not
• The results of the control group serve as the counterfactual
• Advantage: a better guarantee of comparability; factors like motivation will on average be the same in both groups
• Strong resistance to this approach in a lot of countries: "unequal treatment". However, given the cost of ALMPs and the intrinsic uncertainty as to their effects, this should be reconsidered
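A minimal sketch of the experimental logic, again with hypothetical names and simulated outcomes so the script runs end to end: randomly split the willing applicants before the programme starts, then compare employment rates after the follow-up period.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical pool of unemployed persons willing to participate.
applicants = [f"applicant_{i}" for i in range(200)]

# Random assignment: shuffle, then split in half.
random.shuffle(applicants)
half = len(applicants) // 2
experimental_group = applicants[:half]   # allowed to participate
control_group = applicants[half:]        # not allowed; serves as the counterfactual

# After the follow-up period we would observe real employment outcomes;
# here they are simulated at random purely for the sake of the example.
treated = set(experimental_group)
employed = {p: random.random() < (0.55 if p in treated else 0.45) for p in applicants}

def employment_rate(group):
    return sum(employed[p] for p in group) / len(group)

# The difference in employment rates is the net-effect estimate.
net_effect = employment_rate(experimental_group) - employment_rate(control_group)
print(f"Estimated net effect: {net_effect:+.1%}")
```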
Some practical issues
• Planning helps
• Plan before the introduction of a new programme
• However, avoid evaluating a brand-new programme
• Radically changing (or abolishing) a programme before the end of the evaluation makes the results somewhat irrelevant
Some potential conflicts
• Time is on our side?
• Policy makers, evaluation sponsors and programme administrators want evaluation results immediately ↔ the evaluator will insist that a thorough evaluation takes time
• Impact evaluation results will necessarily only become available some time after participation
• The resulting "this is old stuff" argument is not per se valid
Some other potential conflicts
• Different expectations: policy makers want "usable information" (e.g. what can be used to fine-tune the programme) ↔ evaluators are often (somewhat myopically?) interested first and foremost in the validity of their impact estimates
• Make evaluation more useful by uncovering the relationship between effectiveness and design aspects
Some other potential conflicts
• Moreover, policy makers etc. often only seem to be interested in impact estimates when these are positive, while negative results are frequently downplayed or outright neglected
• (Apparently?) contradictory conclusions across studies
• Meta-analysis can help
Remaining questions 1
Is net effectiveness related to …
• … specific groups? What does (doesn't) work for whom, and why (not)?
• … the combination of several policies? Their order?
• … the timing of the intervention?
• … labour market institutions?
• … intensity, "dose", or duration?
Remaining questions 2
Is net effectiveness different between …
• … public versus private providers?
• … local versus national programmes?
• … favourable and unfavourable business cycle conditions?
• … the short run and the long run? (locking-in effects?)
• … the sample in the evaluation study and future participants? (external validity)
Macro-effects of ALMPs
• Thus far: the effect on participants
• But also:
• Effects on non-participants? Substitution, displacement, deadweight loss, …
• General equilibrium effects
• Effects on employment, unemployment, productivity, matching effectiveness, …
• Very important, yet a lot of uncertainty