MINISTERUL FINANŢELOR PUBLICE (Ministry of Public Finance)
Autoritatea de Management pentru Cadrul de Sprijin Comunitar (Managing Authority for the Community Support Framework)
Unitatea Centrală de Evaluare (Central Evaluation Unit)

Evaluation Working Group – second training seminar for evaluation staff of the 2007-2013 Romanian NSRF and Operational Programmes

Evaluation from the External Evaluator's Perspective
Dr Jim Fitzpatrick, Fitzpatrick Associates Economic Consultants, Ireland
May 18, 2006
Content
• Evaluation as a "process"
• Tasks in a typical evaluation process
• Methodologies in practice
• Common problems in practice
• Procurement of evaluation – typical stages
• Some "tips" for evaluation commissioners
• Pitfalls the evaluator faces
• Some tips for evaluators
• Wider issues for the future
Some overall considerations from the external consultant's perspective…
• A wide variety of different contexts (e.g. doing v supervising, policy v service delivery, ex ante v ex post, technical v non-technical)
• "Planning" and "doing" are closely related
• Experience across a wide range of organisations, topics, etc.
• Overlaps with planning other types of assignment
• An external consultancy perspective
Evaluation is a process, not just a technique!
Evaluation is a balancing act between…
• client and user relations
• research and analysis
• managing the team
• stakeholder involvement
• time, resources and budget
Tasks in a typical evaluation process…
1. Establish/Understand the Context
• who is the "client"?
• why is the evaluation being done?
• is any specific use intended?
• what kind of evaluation is needed?
2. Obtain/Prepare/Agree the Brief (Terms of Reference)
• is there one?
• is it clear?
• do you need to write one?
• is it agreed?
Evaluation tasks (continued)…
3. Prepare the Work Plan (Proposal)
• overall approach (i.e. how the brief is interpreted and how you will go about it)
• analytical framework (i.e. the overall logic)
• methodology/techniques (e.g. CBA, CEA, MCA, benchmarking – see the CBA sketch after this slide)
• work programme (i.e. the data and data collection, e.g. surveys, interviews*)
4. Assemble the Evaluation Team/Resources (budget)
• number of people/person-days
• types of people
• necessary expertise (e.g. on technical aspects)
*"data" means not just statistics; it includes other information
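To make one of the listed techniques concrete, here is a minimal sketch of a cost-benefit analysis (CBA) calculation: discounting hypothetical cost and benefit streams to a net present value (NPV) and a benefit-cost ratio. All figures, including the discount rate, are invented assumptions for illustration, not values from the seminar.

```python
# Illustrative CBA sketch: discount annual cost and benefit streams to
# year 0, then compare them. All numbers are invented assumptions.

DISCOUNT_RATE = 0.05  # assumed discount rate

# Hypothetical annual flows for an intervention (EUR million), year 0 first.
costs = [10.0, 2.0, 2.0, 2.0, 2.0]
benefits = [0.0, 4.0, 5.0, 6.0, 7.0]

def present_value(flows, rate):
    """Discount a stream of annual flows back to year 0."""
    return sum(flow / (1 + rate) ** year for year, flow in enumerate(flows))

pv_costs = present_value(costs, DISCOUNT_RATE)
pv_benefits = present_value(benefits, DISCOUNT_RATE)

net_present_value = pv_benefits - pv_costs   # > 0 suggests a net benefit
benefit_cost_ratio = pv_benefits / pv_costs  # > 1 suggests a net benefit

print(f"NPV: {net_present_value:.2f}m, B/C ratio: {benefit_cost_ratio:.2f}")
```

CEA (cost-effectiveness analysis) follows the same logic on the cost side, but compares costs against a single non-monetised measure of effectiveness rather than monetised benefits.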
Evaluation tasks (continued)…
5. Doing the Evaluation
• implement the method/work programme
• manage client relations
• manage the team
• deal with unexpected issues
6. The Output/Report/Schedule
• how often do you meet, how many reports, and when?
• nature of the report (e.g. length? style?)
• presentations?
Evaluation methodology in practice…
• trying to establish whether the intervention did (or will) make a difference, so "with-without" comparison (the scientific method at its core)
• formal quantitative techniques are very desirable, but very difficult in practice
• MCA (scoring, weighting and ranking) is the most used – see the sketch after this slide
• others useful are:
  • before v after (time series)
  • places that do and don't have the intervention ("control group")
  • "expert" opinion
  • views of stakeholders
• you always need some framework for answering the evaluation questions (samples available)
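Since MCA (scoring, weighting and ranking) is flagged above as the most-used technique, here is a minimal sketch of how a weighted scoring matrix works. The options, criteria, weights and scores are all invented for illustration.

```python
# Minimal MCA sketch: score options against weighted criteria, then rank
# by total weighted score. All names and numbers are invented.

options = ["Option A", "Option B", "Option C"]
criteria = {"effectiveness": 0.5, "cost": 0.3, "feasibility": 0.2}  # weights sum to 1

# Scores (1-5) awarded to each option against each criterion (illustrative).
scores = {
    "Option A": {"effectiveness": 4, "cost": 2, "feasibility": 5},
    "Option B": {"effectiveness": 3, "cost": 4, "feasibility": 3},
    "Option C": {"effectiveness": 5, "cost": 3, "feasibility": 2},
}

# Weighted score per option = sum over criteria of (weight * score).
weighted = {
    opt: sum(criteria[c] * scores[opt][c] for c in criteria) for opt in options
}

# Rank options from highest to lowest weighted score.
ranking = sorted(weighted.items(), key=lambda kv: kv[1], reverse=True)
for rank, (opt, total) in enumerate(ranking, start=1):
    print(f"{rank}. {opt}: {total:.2f}")
```

Note that the weights drive the ranking, which is why in practice MCA results are usually accompanied by some sensitivity testing of the weighting.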
Common problems in practice…
• poor initial project/programme design
• inability to control for external influences
• poor or unavailable indicators (too few, too many, or not really capturing the essence of the intervention)
• lack of consensus about the purpose of the evaluation
• "scope creep"
Procurement of evaluation – typical stages…
• Policy issue or topic, regulatory requirement
• Terms of Reference, brief
• Invitations, tendering
• Selection, contracting
• Inception
• Managing, undertaking, analysing
• Reporting
Some comments on the procurement process…
• You need to balance competition with the need for dialogue with evaluators
• Can you invite too many bidders?
• Bidders need guidance on the scale of the assignment
• Circulate replies to bidders' questions to all bidders!
• The ability and availability of the client's representatives matter
Some practical tips for evaluation commissioners…
• ensure programme/project planning is good (monitoring and evaluation considered at the outset)
• make sure the Terms of Reference have:
  • clarity
  • focus
  • an indication of scale
• relationships:
  • be open post-selection
  • avoid surprises
• take time to get a shared understanding of what's happening
• ensure there is some kind of method/framework being used
• use performance indicators "sensibly" – they are the fuel of monitoring/evaluation, not the thing itself
Pitfalls the evaluator faces…
• misunderstanding the context
• objectives unclear or not agreed
• client unclear or not agreed
• lack of balance, being one-dimensional
• thinking you already know the answer
• work that's not used in the end
• having no analytical framework
• being over-ambitious
• not having the right expertise
• failing to consult stakeholders
• not allowing time for project/process management
• no "intellectual leadership"
• a report that doesn't do the work justice
Some practical tips for the evaluator
• watch for "scope creep"
• keep re-reading the brief
• estimate the time needed, then double it!
• avoid surprising the client
• don't over-promise
• structure the report early on
• set internal deadlines
SATISFACTION = PERCEPTIONS MINUS EXPECTATIONS (S = P - E) – see the worked reading below
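Read as a worked equation, the closing formula shows why managing expectations matters as much as delivering quality. The numbers below are invented purely for illustration:

```latex
S = P - E, \qquad \text{e.g. } P = 7,\ E = 9 \;\Rightarrow\; S = -2,
\quad\text{but}\quad P = 7,\ E = 5 \;\Rightarrow\; S = +2
```

The same delivered quality produces opposite outcomes depending only on the expectations that were set at the start.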
Wider Issues for the Future
• the extent of an evidence-informed, evaluation culture
• the need for research and evaluation to "speak" to policy makers
• the need for more basic, neutral data collection
• the balance between "independence" and "relevance"
• emphasis on the costs of research/evaluation v the costs of poor policy decisions
• more inter-disciplinary research and evaluation (e.g. "economic" v "social")
• over-evaluation of some areas, under-evaluation of others