
Discussion Mireille Matt INRA - GAEL

Basic questions of policy evaluation design and tuning: a quick reminder, presented by Vincent Spenlehauer – Ecole des Ponts ParisTech - IFRIS. Discussion by Mireille Matt, INRA - GAEL.


Presentation Transcript


  1. Basic questions of policy evaluation design and tuning: a quick reminder, presented by Vincent Spenlehauer – Ecole des Ponts ParisTech - IFRIS. Discussion: Mireille Matt, INRA - GAEL

  2. Outline of the discussion
  Objective: to discuss the similarities and specificities of the basic questions related to public policy evaluation vs. research result or S&T policy evaluation
  • Brief summary of the paper
  • Similarities with assessing the impact of public research / S&T policy
  • Specificities of assessing the impact of public research / S&T policy

  3. Brief summary (1)
  • Design and implementation of the interministerial evaluation of the national policy for road safety (control and sanction for departing from the Highway Code) => drastically reduce the number of deaths on the road
  • Very detailed chronological story with a socio-analytical dimension (1999-2003)
  • Narrates how, from his position as a social scientist, he was able to orient and structure an uncertain and chaotic process of policy evaluation and influence the subsequent reforms

  4. Brief summary (2)
  • Phase 1: struggle with different departments in French Ministries (Dpt for modernization and deconcentration, Dpt for security and road circulation…) and the National Council for Evaluation – back-and-forth => definition of the object to evaluate, the objectives, the perimeter of the evaluation, the public authorities and actors concerned
  • Phase 2: implementation of the evaluation project: design and specifications of the evaluation, methods to be used, external scientific expert – managing the interaction between the scientific hemisphere (the questioning of the evaluation) and the politico-administrative hemisphere (governance of the evaluation process) – Ministries = major stakeholders

  5. Brief summary (3)
  • Phase 3: Epilogue – change of the head of the National Instance of Evaluation (budget, definition of specifications, call for proposals…) – choice of the winner – steering committee – aim = define a new policy of control-sanction and document all possibilities – wise and persuasive entrepreneur for public policy design => changes in public policies towards road safety in 2003
  • The results of this evaluation are due to the head of the National Instance of Evaluation – political strategies – involvement of social scientists in the evaluation process
  • Political dimension of an evaluation along the whole process. How should social scientists involved in this type of process behave in front of this omnipresent political dimension?

  6. Similarities with assessing impact of public research

  7. Evaluation at different steps of the process of public actions

  8. THE DIFFERENT LEVELS OF EVALUATION: a complex process analysed by VS
  • E1 - E4: Relevance (content) and "quality" of conception (decision process)
    • E1: Relevance of objectives
    • E2: Coherence of objectives
    • E3: Relevance and coherence of the "institutional arrangements"
    • E4: Coherence between objectives and institutional arrangements
  • E5 - E7: Implementation and results
    • E5: Programme management (cost-timing-quality…) = monitoring
    • E6: Effects / outputs / impacts = effectiveness
    • E7: Match between effects and objectives = efficacy (1)
  • E8 - E9: Efficiency and legitimization
    • E8: Match between objectives, institutional arrangements and effects-outputs-impacts: do the same / do better another way = assessing adequate funding, management and contractual behaviour in order for objectives to be achieved in a cost-effective manner = efficiency
    • E9: Ex-post relevance of objectives, given the results of the other evaluations = assessing whether initial objectives are still valid in the light of evolving RTD, societal and environmental conditions = efficacy (2)

  9. General issues towards a « good » impact evaluation system
  • Impact assessment is only one of the levels of evaluation
  • Complexity: no single best method; multi-disciplinarity / quantitative and qualitative information; WITH alignment between the (implicit or explicit) theoretical basis and which impacts are to be evaluated
  • Guaranteeing the scientific value of evaluation methods (Robustness, Repeatability, Appropriability, Transparency, Independence of the evaluators, Confidentiality, Sampling…)
  • Balance between systematic / standardized / simple approaches and exploratory studies => systematic issues: Evaluation vs monitoring, time fit between R&D activity and evaluation, Availability of data, Evaluation "fatigue", Cost of evaluation

  10. General issues towards a « good » impact evaluation system
  • Ensuring that evaluation takes place on a programmed and properly resourced basis
  • Providing « easy to understand », « usable », « credible » results while avoiding meaningless lists of indicators / scoreboards
  • Providing a mechanism for feedback of the results into policy making (learning policy maker)
  • Interactions between academics, practitioners, policy makers and research actors for a better understanding of scope, relevance and needs => evaluation is a « social process »

  11. General issues towards a « good » impact evaluation system
  • Evaluation as a « social process » (cf. L. Georghiou): motives / interests of the actors involved in the evaluation process
  • Those « being » evaluated
    • justification / legitimation
    • learning at operational level
    • gaining new support from public sources
  • Those who are the audience of the evaluation
    • accountability
    • resource allocation
    • learning at policy level (pro-active evaluation)
  • Those performing the evaluation
    • academic interest
    • consultant business
  • Make the purpose and context-dependency clear before choosing an approach

  12. Specificities of assessing impact of public research

  13. Challenges for an « ideal » evaluation system
  Trends / challenges for evaluators:
  • Complexity / Separability / Attribution
  • Multiple stakeholders
  • Coherence of objectives
  • Weight of the different dimensions of the evaluation
  • Handling / using the results
  • Evaluation of policy mix
  • Mix of « evaluation cultures »
  • Evaluation of shifts
  • Legitimacy, project fallacy
  Trends in S&T policy making:
  • Multi-level, multi-agent decision
  • Variety of goals
  • Variety of Inst. Arrang.
  • Coupling with other policies
  • International collaboration / integration (EU)
  • Flexibility / adaptive – learning policy makers
  • Development of competition-based programmes

  14. Challenges for an « ideal » evaluation system
  Trends / challenges for evaluators:
  • International dimension
  • Benchmarking
  • Bias toward short-term, market-type output
  • Limits of peer reviews
  • Separability / Complexity / Attribution
  • Network evaluation
  • Evaluation of knowledge / competences / capabilities / capacity
  • Limits of bibliometrics
  Trends in research activity:
  • Globalization
  • Problem-solving orientation
  • Interdisciplinarity
  • Cooperation
  • S-I linkages
  • Increasing knowledge content
  • IPR regulations

  15. ASIRPA Challenges
  • Evaluation of an organization with multiple missions / objectives
  • Case study approach with quantification
  • Identifying the cases within the organization
  • Considering a variety of impacts (measure)
  • The observed impact is the result of a complex process involving a heterogeneous set of actors evolving over a long period of time => impact pathway

  16. ASIRPA Challenges
  • Attribution vs contribution
  • Problem of project fallacy
  • Developing a « standardized » method for presenting and analysing the cases (impact pathway, chronology, impact vector, transversal analysis)

  17. Questions?
  • What could we learn from your experience?
  • In terms of the method used to evaluate the impact of the road safety policy (by consultants)?
  • In terms of the actors performing the evaluation of impacts? Consultants vs academic actors
  • In terms of the mechanisms used to feed the results back into policy actions?
  • In terms of transferring the method developed?

  18. THANK YOU FOR YOUR ATTENTION !
