Practical approaches for evaluating the impact of health and social care students/practitioners learning with, from and about each other. 22 February 2005, King’s College London (HE Academy).
Evaluating IPL
• Make IPL explicit
• Update on the evidence from evaluations of IPE
• Criteria against which one can measure impact
• The evaluation process
Choices
• Evaluation studies and the use of different methodologies
• Evaluation methods for academic-based and practice-based learning
Tensions
• Methodological issues around ethics and access to students
• Researching on and researching with people
• Methods that are transferable to, for example, groups of busy professionals/managers working in more integrated services
• The organisational context within which learning happens
• The balance of developing distinctive professional roles whilst also recognising the benefits of interprofessional working, and how to engage the students in this
• Managing (far too much) data from the students
Impact
• Research issues that underpin impact evaluation
• Interim findings
• Translating researching with people into evaluative practices based on transforming multiprofessional teams
• The impact our current interprofessional education has on future professional relationships
• The impact on integrated care management and delivery
Participants’ contributions
Workshop plan
• 10.30-10.45 Introduction to the day (MRH)
• 10.45-11.45 Participants’ Challenges (Plenary)
• 11.45-12.00 Coffee
• 12.00-12.30 Evaluation Rigour (MRH)
• 12.30-13.00 Participatory inquiries and evaluation (MH)
• 13.00-14.00 Lunch
• 14.00-15.00 The Challenges Surgery (Small groups)
• 15.00-15.30 Meeting the Challenges (Plenary)
• 15.30-15.45 Tea
• 15.45-16.30 Evaluation Impact (Plenary)
• 16.30-17.00 Messages from the Day (Plenary)
The day ahead: learning with, about and from each other
• Interaction and action
• Your challenges
• Your responses
• Your audiences
Your Challenges
• Introduce yourself
• Outline your IPE involvement
• Key issue(s) for your evaluation practice: questions, challenges, critical incidents
• Issues identified in or for evaluation
Interprofessional Education • Members (or students) of two or more professions associated with health or social care, engaged in learning with, from and about each other. • An intervention to secure interprofessional learning and promote gains through interprofessional collaboration in professional practice.
Evaluation approaches: paradigms
• Positivist
• Interpretive and illuminative
• Change
Rigorous and robust: deliberate choices!
Evaluation ‘logic loop’: proving and improving. Quality, effectiveness, outcomes, impact.
Proving. Hamdy et al. (forthcoming BEME systematic review): predictive value of assessment measurements obtained in medical schools for future performance in medical practice. Quality criteria:
• Prospective or retrospective design
• Unbiased selection of subjects
• Similarity of correlated constructs
• Psychometric characteristics of measuring instruments
• Use of appropriate statistics
• Attrition bias
Improving. Four guiding principles: research should be
• contributory in advancing wider knowledge or understanding;
• defensible in design by providing a research strategy which can address the evaluation questions posed;
• rigorous in conduct through the systematic and transparent collection, analysis and interpretation of qualitative data;
• credible in claim through offering well-founded and plausible arguments about the significance of the data generated.
Ref: HM Government Strategy Unit
Making sense • Makes a contribution • Has a defensible design • Was conducted rigorously • Makes credible claims
Makes a contribution
• Assessment of current knowledge
• Identified need for knowledge
• Takes organisational context into account
• Transferability assessed
Take the test
Has a defensible design
• Theoretical richness
• Evaluation question(s)
• Clarity of aims and purpose
• Criteria for outcomes and impact
• Resources
• Chronology
Take the test
Conducted rigorously
• Ethics and governance
• Clarity and logic of sampling, data collection, analysis, synthesis and judgements
Take the test
Take the test • Makes credible claims
Take the test • Reporting credible claims
Participatory inquiry and evaluation • Uncertainty and tensions • Dimensions of participation • Ethics and the politics of invitation • Approaches: rapid (organisational) appraisal; ‘stakeholder’ evaluations • Action for rigour. “If you are a fish, what can you know about water?”
Why do participatory evaluation? • Short-term weakness: ‘objectivity’, weighting • But … long-term strength: capacity building, credibility • Richness • Homology with interprofessional/collaborative practice?
The Evaluatorscope as a metaphor for evaluators: no simple solutions to complex issues, and no point in looking for them!
Uncertainty, tensions, challenges • Widening boundaries is “swimming into an unknown current” (Moustakas, 1990) • Conflicting expectations • Governance • Incompleteness • Redundancy of data (and too much of it) • £/time
Ethics and the politics of invitation • Trust, respect, purposeful working relationships (!) • Methods theorised and politicised • Drawing the boundary – who is in and who is out • Responsibility and respons-ability • Reportage and ownership
Example 1: Stakeholder evaluations • Critical evaluation • Up-front principles • Mixed methods • Visual energy (3rd point of reference, ‘power to’) • Reporting for difference – levels and language
Example 2: Rapid (institutional) appraisal • Purpose • Appreciative inquiry in the present • Working out the details (differences, protocols …) • Creative management: a climate in which it is safe to experiment; interdisciplinary working groups, with external resource persons; regular documentation and analysis • Scaling up
Action for rigour • Methods in their (methodological, epistemological) contexts • Visible boundaries • ‘Recoverability’ (Checkland and Holwell 1998)
The Challenges Surgery • Small groups • Take the role of critical friend/external advisor for at least three of the Challenges • Recommend ways forward • Suggest an action plan
The Challenges Surgery Four groups Discussion questions: • What does impact really mean in terms of interprofessional learning and teaching? • What difference does scale make? Effective evaluation strategies for small, medium and large scale IPL&T • What are the epistemological and philosophical tensions in evaluating IPL&T? • How do you find out what it is to be interprofessional? How could this be meaningful for student assessment and evaluating IPE?