Policy, Evaluation and Practice • Professor Roger Ellis • University of Chester • University of Ulster
Professor Roger Ellis OBE • Professor Roger Ellis OBE DSc DPhil MSc BA(Hons) C Psychol AFBPS FHEA TCert • Professor Emeritus in Applied Psychology University of Ulster • Emeritus Professor of Psychology University of Chester • Visiting Professor University of Bedfordshire • Visiting Professor Napier University • Visiting Professor Buckinghamshire New University • Visiting Professor Kent State University • Visiting Professor Hokkaido Imperial University Japan • Visiting Professor NOSM Canada • Director of SHEU International
SHEU International • Social and Health Evaluation Unit (SHEU) • United Kingdom, Ireland, Canada, New Zealand, Norway, Sweden, Netherlands • Programme Evaluation • Evaluation of Programmes in: • Education • Health Care • Social Care • Community Safety • Community Development • Regional Development
SHEU International • You innovate, we evaluate
Partnership • Partnership • Programme Evaluation • Knowledge Transfer
Partnership • University : Community • University of Pecs : Baranya Council • UofP : BC; SHEU
Overview • Knowledge Transfer and Programme Evaluation • Programme Evaluation: Nature and Scope • Programme Evaluation: Health Warnings • Partnership
Knowledge Transfer and Programme Evaluation • Knowledge Transfer from University of Pecs to Baranya Council and vice versa • Programme Evaluation as Knowledge Transfer • Knowledge Transfer as Partnership: Pareto Net Gain
Knowledge Transfer Partnership • KTP: Euro-speak • Transfer from Higher Education to Community • Knowledge • Needs Analysis • Problem Identification • Intelligence and Innovation • Expertise for Projects • Programme Evaluation • Continuous Improvement
Policy, Evaluation, and Practice • National and Regional Social Policy • Economic, Social, Educational, Health Programmes • Delivery of Programmes: Programme Practice • Impact of Programmes • Nature of Programmes • Programme Evaluation • Feedback for Practice and Policy
Evaluating Evaluation: A Public Health Warning • Everyday vs Professional Evaluation • The Field of Programme Evaluation Research • Public Health Warnings for Programme Evaluation • Programme Evaluation: Baranya County Council and the University of Pecs • SHEU International
Don’t Confuse Everyday and Professional Evaluation • Characteristics of Everyday Evaluation • Characteristics of Professional Evaluation • Main differences
Everyday vs Professional • Everyday: places value on something/anything (thing, person, activity); based on implicit/explicit standards; based on liking/disliking, wanting/not wanting, suitable/unsuitable; intuitive; individual • Professional: places value on a programme; based on specified standards for outcomes and process; based on reliable and valid data gathering; evidence based; takes account of everyone’s views
Everyday vs Professional • Everyday: subjective and internal; an opinion; partial; one perspective; basis for individual action • Professional: objective and external; a considered, explicit and transparent judgement; comprehensive; triangulated perspectives; leading to recommendations for policy, programme and practice
Evaluation Model (diagram): Policy, Social Science, Professional Evaluation, Programmes, Practice
The Field of Evaluation Research • Professional Associations/Societies • Departments and Chairs • Journals • Fields of Application • Regional Development • Health • Social Services • Education • Community Safety
Programme Evaluation • Programme evaluation is a systematic method for collecting, analysing and using information to answer questions about projects, policies and programmes • Did it work? • What happened? • Did it do what it said it would do? • Did it achieve its predicted outcomes? • What happened that wasn’t expected? • How did it work? • What caused what? • What did it cost? • Was it cost effective? • What did people think of it? • Would it have happened anyway? • What is it being compared with?
Why Programme Evaluation ? • Accountability • Feedback for Practice • Feedback for Policy • Dissemination • Justification for Funding • Publication and Publicity
For Academics • DON’T DO IT!!
Unless… • You want to apply social science to the real world • You are prepared to work close to the market • You don’t mind selling your skills • You’re prepared to seek grants with about a 10% success rate • You’re prepared to be accountable for your product • You have strong social commitment • You are concerned for Regional Development Policy, Programmes and Practice
Trident (diagram) • Outcomes: Is the programme meeting its objectives? • Process: How is the programme working? • Multiple Stakeholder Perspectives: What do people think of the programme?
Trident (methods) • Outcomes: Logic Model • Process: Reconstitutive Ethnography • Multiple Stakeholder Perspectives (MSP): Realistic Evaluation
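Read together, the two Trident slides pair each prong with its question and its method. A minimal sketch of that pairing in Python (the class and names are illustrative, not part of the SHEU method itself):

```python
from dataclasses import dataclass

@dataclass
class Prong:
    question: str  # the evaluation question this prong answers
    method: str    # the data-gathering approach used to answer it

# Illustrative encoding of the Trident's three prongs
trident = {
    "Outcomes": Prong("Is the programme meeting its objectives?", "Logic Model"),
    "Process": Prong("How is the programme working?", "Reconstitutive Ethnography"),
    "Multiple Stakeholder Perspectives": Prong(
        "What do people think of the programme?", "Realistic Evaluation"
    ),
}

for name, prong in trident.items():
    print(f"{name}: {prong.question} -> {prong.method}")
```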
Heed the Iron Law of Evaluation • “The expected value of any net impact assessment of any large scale social programme is zero.” (Rossi) • The more you evaluate a programme, the more you will find it doesn’t work • Overly pessimistic: use the knowledge to improve • What isn’t working and what is, what has gone wrong, how to put it right, how to improve
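One way to state Rossi’s Iron Law formally (the notation is mine, not Rossi’s): writing \hat{\tau} for the estimated net impact of a large-scale social programme,

```latex
% \hat{\tau}: estimated net impact; expectation taken over many assessments
\mathbb{E}[\hat{\tau}] = 0
```

that is, averaged over many such assessments, measured gains and losses cancel out. The slide’s point is to treat this as a prior to be improved on, not a verdict.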
Bring Order to Chaos • Enthusiasm, commitment and a flurry of activity • Structure for evaluation and data gathering • Trident • Outcomes • Process • Stakeholder Views • ‘Shaping the Future’ Developing Health Service Staff: unclear outcomes; badly managed process; no clear customers
Beware of Going Native • Identification – Involvement • Programme delivery loses objectivity • Evaluator becomes provider • Speech and Language Therapy Evaluation
Don’t Expect to be Welcomed • The unwanted guest at the table – forced on programme • Low priority • Evaluation never as interesting as the programme (nor should it be) • Bridesmaid never the bride • Partnership for Improvement (SSBC)
Don’t Offer More Than You Can Afford • Infinite demand for a free good • Convinced contractors are insatiable • Watch the data gathering: that is where the money goes • Harmonise with monitoring • Sharing Education Programme
Know the Programme • Where does the programme start and finish? • Construct a Working Model of the Programme • Define the boundaries for evaluation • Specification may be incomplete • Self-Harm Intervention for Teenagers
Be There at the Start • Formative as well as summative • Baseline and progress measures • Continuous improvement • Sharing Education Programme
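Being there at the start is what makes progress measurable: without a baseline there is nothing to subtract. A minimal sketch, assuming one outcome measure scored at successive waves (the measure and the numbers are invented for illustration):

```python
# Hypothetical scores for one outcome measure across evaluation waves
waves = {"baseline": 42.0, "short term": 45.5, "medium term": 47.0, "long term": 46.0}

baseline = waves["baseline"]
for wave, score in waves.items():
    if wave == "baseline":
        continue
    # Progress is read against the baseline, so it must be captured first
    print(f"{wave}: change from baseline = {score - baseline:+.1f}")
```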
Be Clear on the Outcomes • Sometimes just not there – invisible • Outcomes but no performance measures • Process as outcome • Personality Disorder Networks
Beware Solo Flying • Not just one person’s view • Three heads are better than one • Triangulated evidenced judgments • More expensive but vital
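A minimal sketch of what the extra heads buy, assuming three evaluators independently rate the same criterion on a shared scale (names, scale and threshold are illustrative):

```python
# Independent ratings of one criterion on a 1-5 scale (invented data)
ratings = {"evaluator_a": 4, "evaluator_b": 4, "evaluator_c": 2}

mean = sum(ratings.values()) / len(ratings)
spread = max(ratings.values()) - min(ratings.values())

print(f"triangulated judgement: {mean:.1f}")
if spread >= 2:
    # Disagreement a solo flyer would never see is itself evidence
    print("ratings diverge: reconcile before reporting")
```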
Don’t let the Piper call the Tune • Tell me what I want to know • Confirmation not disconfirmation • Turkeys and Christmas • Say what’s right, but positively and with an improvement orientation • Don’t just find fault: find a solution
Unearth the Theory • What do providers think is happening? • What do they think causes what? • Test hypotheses from theory made explicit • Hungarian Social Work placement
Watch the Time • Time scale for outcomes • Immediate, short term, medium term and long term • Sharing Education Programme
No Replication without Recollection • Capture the process • Recipe for replication • Avoid the vaguely significant and the specifically irrelevant (Bannister) • Reflections of practitioners: reconstitutive ethnography • Clinical Facilitator process report: a bestseller, unfortunately it was free
Don’t Believe All They Promise You • The final report sweetener • Funding streams dry up • Fashion and fluctuations
Beware Specialists Bearing Gifts • Evaluator and specialist • Face validity • Axes to grind
Don’t be tempted to manage the programme • The Evaluation Steering Group can become the meeting place for providers • Evaluation whose recommendations become involved in action may turn into management • Hungarian Placement: the Steering Group was the only such meeting place • Evaluators become providers when they have more ideas than the providers do
If you offend, do it with goodwill • Rigorous but sympathetic • Positive and diplomatic • Shared understanding, improvement oriented • Evaluator as counsellor and support • Chester Community Safety Centre
Look around you • 360 degree perspectives • Recipients, providers, associated providers, commissioners, funders and managers • Complementary to statistical outcomes • Phenomenology of the programme • Anti-Social Behaviour Order
Recommend, Recommend, Recommend • Evaluation lives on through recommendations • Policy • Programme • Practice • Recommendations to those who can implement • Village Hall Utilisation
Expect Indifference But Hope for Action • Reports easily marginalised • Original purpose forgotten • ‘We don’t believe you’ or ‘We knew already’ • Evidenced recommendations
Evaluation May Not be Research • No one cares what happens in Pecs on a Thursday • Findings should have general theoretical, empirical and methodological applicability, provided the evidence is valid and reliable • Genuine innovation • Greatest interest in approach and method rather than findings
Programme Evaluation UofP : BC • Needs Analysis → Programme Development → Programme Evaluation → Continuous Improvement → Pareto Net Gain
Programme Evaluation: Baranya County Council and University of Pecs • Programme Evaluation as Knowledge Transfer • Partnership between Council and University • Project based • Programme Evaluations
Programme Evaluation and University of Pecs • Applied Social Science • Internal programme evaluation • Institutional Research • Programme Evaluation in Local Community • PE in Region • National PE • International PE : Comparative Studies
SHEU International • Social and Health Evaluation Unit • SHEU International: United Kingdom, Ireland, Canada, New Zealand, Norway, Sweden, Netherlands • Hungary?
Slogans • No Innovation without Evaluation! • You innovate, we evaluate • No programme too small for evaluation
Working with SHEU • Free consultation on possible Programme Evaluation • Programme Evaluation contracts • Proven Trident Method • Wide range of Programme Applicability • Evaluation support • Evaluation training