
Effective support for Enlargement Evaluation issues


Presentation Transcript


  1. Effective support for Enlargement: Evaluation issues
  Michael Berrisford, Head of DG Elarg / E4 Operational Audit and Evaluation
  October 19, 2009

  2. Contents
  • Introduction to implementation and evaluation
  • Timing of programming / implementation
  • Retrospective v prospective evaluations
  • Project evaluation v programme evaluation
  • Different responsibilities for different evaluations by component per IPA IR
  • Evaluations available by next summer relevant to IPA
    • Some of the evaluations completed and planned
    • Some messages from already completed evaluations
  • 2010 ‘IPA mid-term evaluation review’ - description
  • Future communication of Elarg evaluation plans and results
    • To Beneficiaries
    • To Member States
    • To Public

  3. Introduction: Timing of programming v implementation

  4. Introduction: Retrospective v Prospective evaluations
  • Retrospective (ex-post; interim, especially in multi-annual programmes)
    • Reviews of intervention logic / results of past projects or programmes
    • Reporting of results at annual programme level necessary – but full ex-post evaluation perhaps more useful over a longer period (e.g. after accession or when the instrument changes)
  • Prospective (ex-ante)
    • Any other review or analysis to support programming
    • Not necessarily based on reviewing past projects or programmes
  • Mixed (interim in shorter, e.g. annual, programmes like IPA)
    • Retrospective on relevance, probably efficiency and perhaps effectiveness
    • Prospective forecasts, probably on impact and sustainability

  5. Introduction: Project v programme evaluations
  • Project results monitoring / evaluation
    • Operational management responsibility
    • ~300 projects just under C1 (national/regional programmes) each year
    • Typical evaluation questions concern whether contract outputs are delivering project results objectives according to OECD (DAC) criteria*
  • Programme evaluations
    • Central evaluation responsibility
    • What is a programme? Not always obvious - see later comments
    • Typical evaluation questions concern whether project results are delivering programme impact objectives according to OECD (DAC) criteria*
  • * Relevance, efficiency, effectiveness, impact, sustainability
    • Project: effectiveness ~ outputs, impact ~ influenceable results
    • Programme: effectiveness ~ project results, impact ~ wider impact

  6. Different responsibilities for evaluations for different components (per IPA IR)
  • Commission
    • Ex-ante - all MIPDs + operational programmes C1 where necessary
    • Interim - for C1/2 prior to conferral of management
    • Ex-post - all components where necessary (except C5)
  • National Authorities (NB after conferral of management only)
    • Ex-ante - most operational programmes except C1
    • Interim - all components
    • Ex-post - operational programmes C5 only
  • NOTE: Operational programme = annual programme = closer to project level; MIPD = roughly a strategic programme level
  • Refs from IPA IR: Common provisions = art 57, C1 = art 82, C2 (MS/BC) = art 109, C2 (BC/BC) = art 141, C3/4 = art 166, C5 = art 199

  7. 2. Some EC relevant / partly relevant current retrospective evaluations / results monitoring
  • Project level (operational management)
    • All programmes: ROM (6 centralised countries + regional) and other monitoring reports (all countries) + occasional project evaluations
  • Programme level (DG Evaluation unit)
    • CARDS (up to 2006) (component 1 equivalent)
      • 7 reports for 6 centralised pre-candidate countries + regional
    • PHARE (up to 2006) CBC programmes (component 2 equivalent)
      • Romania / Bulgaria
    • CARDS 2005/2006 + IPA 2007 (components 1, 2) ‘Country programme interim evaluations’ (mixed R+P)
      • 3 decentralised countries: Turkey, Croatia + fYRoM (decentralised 2010?)
      • Centralised countries - under review

  8. 2. Some EC relevant / partly relevant current prospective evaluations (programme level)
  • Reviews of intervention logic (project selection / MIPD objectives v programme / sectoral objectives) (re component 1 - by country / regional)
    • Original summary reviews of all MIPDs
    • More detailed Turkish instrument MIPD review (already done)
    • Similar detailed MIPD reviews planned for 5 countries, component 1
    • Regional cooperation intervention logic review also planned
  • ‘Rule of law’ cross-country evaluation - component 1
    • Rule of law, judiciary reform, fight against corruption / organised crime - across WB countries … + others

  9. 2. Messages from existing evaluations: Some common messages from CARDS retrospectives
  • Project design and needs assessments weak
  • Implementation generally satisfactory
  • Acquis-related areas more effective than political criteria and development areas
  • Weak national capacity prejudices sustainability

  10. 2. Messages from existing evaluations (re component 1): ‘MIPD’ prospective evaluations (inc. Turkey)
  • No explicit global assessment of all needs for accession which must be satisfied by financial programmes/projects
    • ‘Hard’ acquis chapter needs + less specific political / economic needs
  • So no SMART multi-annual sectoral plans / objectives are set
    • 3-year MIPD often too short a horizon to provide a framework
    • How will we know when the assistance in the sector has finished?
  • We have multi-sectoral annual ‘programmes’ (= projects) - without SMART objectives at this ‘programme’ level
  • So in practice projects are relevant - but
    • MIPD not yet providing a clear strategic programme framework
    • No plans established to deliver all accession objectives efficiently across all sectors (/chapters)

  11. 3. IPA mid-term evaluation review 2010
  • Not a separate evaluation
  • A meta-evaluation to synthesise messages from all available IPA evaluations (EC + national) - across all components as appropriate
  • Aiming for at least some draft conclusions by July 2010; final report Sept-Oct 2010
  • Output should include more guidance on
    • where and how to improve strategic plans and objectives
    • how annual programmes are built from them

  12. 4. Communication of evaluation plans and results
  • Beneficiary countries
    • Regular dialogue already with decentralised NIPAC services
    • Discussion with centralised NIPACs where appropriate
  • Member States
    • Elarg E4 evaluation plan to IPA committee early in the year, for information
    • Presentations of 1-2 selected evaluations possible at each meeting (not all can be presented)
    • Discussions ongoing with other DGs / components on how to present
  • Public
    • Most E4 evaluation reports published on the Elarg public internet site
