
A word about what we mean by evaluation (a changing landscape)



Presentation Transcript


  1. Connections and issues in the European evaluation context Professor Murray Saunders, President of the European Evaluation Society

  2. Congratulations on the formation of the Slovak Society for Evaluation and welcome to the National Evaluation Societies and Networks in Europe (NESE)

  3. A word about what we mean by evaluation (a changing landscape)
  • What the evaluation community in Europe identifies as important
  • What is gained by sharing and working together
  • Discussing the key issue of use and usability

  4. My own background
  • Began evaluation work in 1982, from a research background in educational change, policy and development
  • Helped to found the UK Evaluation Society in 1992; President between 2001 and 2003
  • Chaired the development group which formed the IOCE in 2003
  • Current President of the European Evaluation Society, Director of the Centre for the Study of Education and Training (CSET) and Professor of Evaluation, Lancaster University
  • Still defining what evaluation is…

  5. My own background

  6. Evaluative practice
  For me (and this is unstable!), an evaluative practice is routine, rule-governed behaviour prompted by an evaluative impulse, i.e. an impulse to attribute ‘value’ or ‘worth’, in some way, to a process, a programme, an object, a policy, a development or an intervention.

  7. Evaluative practice
  Evaluative practice concerns the purposeful gathering, analysis and discussion of evidence from relevant sources about the quality, worth and impact of provision, development, policy or practice. It is how we attribute value. Evaluative practice also concerns things such as:
  • Balancing diverse ethical interests
  • Managing stakeholders
  • Charting a way through ‘difficult’ or ‘inconvenient truths’

  8. Ideas for a first work programme for NESE
  Possible objectives:
  • Exchanges of information
  • Presentation of good practices
  • Monitoring of evaluation activities and context
  • Encouraging use and usability
  Possible means:
  • Page on EES website
  • Permanent contacts
  • Continued moderation by EES + a national society
  • Meeting in Lisbon, October 2008
  • Meeting in Münster, 8/9 October 2009

  9. [Bar chart rating survey items from ‘high’ to ‘low’: raising politicians’ awareness; promoting evaluation training; promoting evaluation as a profession; promoting standards; defining standards. Horizontal axis: 0–100%.]

  10. Monitoring evaluation in Europe:
  • Supply of evaluators
  • Education activities regarding evaluation
  • Institutional arrangements within the public sector
  • Activities by the supreme audit offices
  • Pluralism within each policy domain
  • Scope of evaluations

  11. Overall strategy for professional development:
  • Filling gaps
  • Complementing national training
  • Contacts, conversations and exchanges of information
  • Jointly sponsoring
  • Thinking about capability, competence and standards

  12. Picking up the issues of use and usability…

  13. What counts as use? “Use refers to the extent to which the outputs of an evaluation are used as a resource for onward practice, policy or decision making.”

  14. What counts as usability? “Usability refers to the way an evaluation design shapes the extent to which its outputs can be used.”

  15. How might ‘use’ be encouraged?
  ‘Use’ is enhanced by inclusivity:
  • Authentication of focus and instrumentation
  • Interest in outputs/findings
  • Social capital building
  • New knowledge as socially owned
  • Increased chance of changes in practice
  Related questions: how do different types of evidence/data (e.g. narratives or statistics) determine ‘use’ practices? What is the political use of different types of evidence?

  16. ‘Process use’
  Process use refers to the unintended effects of carrying out an evaluation, or of ‘asking questions’:
  • Foregrounding new issues
  • Drawing attention to ‘hot spots’ or problem areas
  • Forcing attention on difficult areas
  • Providing a ‘voice’ for the powerless
  • Drawing attention to time-lines
  • Making participants think about ‘audience’ and ‘users’
  • Policing role

  17. Use as ‘engagement’ (a continuum from less to more engagement)
  • Dissemination practice: report, executive summary, article
  • Presentational practice: seminars, presentations, active workshops, embodiments
  • Interactional practice: working alongside colleagues, analysis of situational enabling and constraining factors

  18. Issues concerning ‘use’
  • Embedded in decision-making cycles (clear knowledge of when decisions take place and who makes them)
  • Clear understanding of organisational memory (how evaluations might accumulate)
  • Capacity of an organisation to respond
  • Systemic processes (feeding into structures that are able to identify and act on implications)
  • Organisations that are lightly bureaucratised (complex adaptive systems) are better placed to respond to ‘tricky’ or awkward evaluations
  • Evaluations that are strongly connected to power structures (what does this mean?)
  • Evaluations that are congruent: suggestions based on evaluation need to build on what is already in place

  19. Designing evaluations for usability: some critical questions
  • Reasons and purposes [planning, managing, learning, developing, accountability]
  • Uses [providing and learning from examples of good practice, staff development, strategic planning, PR, provision of data for management control]
  • Foci [activities, aspects, emphases to be evaluated; should connect to the priority areas for evaluation]
  • Data and evidence [numerical, qualitative, observational, case accounts]
  • Audience [community of practice, commissioners, yourselves]
  • Timing [coincidence with decision-making cycles, life cycle of projects]
  • Agency [yourselves, external evaluators, a combination]

  20. Key issues going forward?
  In our survey last year (15 societies and networks in Europe), these were some of the issues for the evaluation community in Europe:
  • Evaluation use and usability
  • Raising politicians’ awareness of evaluation
  • Promoting research on evaluation
  • Promoting and defining standards and good practice
  • Supporting evaluation capacity builders in the public service
  • Promoting evaluation training
  • Setting up evaluation societies
