1. Approaches to the Measurement of Outcomes in Social Work Education (OSWE Project) Hilary Burgess, John Carpenter, Kish Bhatti-Sinclair, Anne Quinney, Imogen Taylor and Michelle Lefevre
2. Programme Overview of project aims and development
Levels of learning framework
Approaches used to measure outcomes
Southampton
Bournemouth
Sussex
ARU/OBU/York
Discussion
3. Aims of OSWE To test the feasibility of outcome measures and research designs
To generate high quality evidence about the effectiveness of methods of SWE
To build capacity and capability amongst academics and trainers, including users
To use opportunities to compare and contrast practice between programmes
4. Learning about the processes What enables and hinders attempts to evaluate social work education within and between programmes?
How can users engage successfully in programme evaluation?
How well has the capacity building model worked?
5. Collaborative capacity & capability building model Peer learning through Action Learning Set
Support, advice and mentoring: face-to-face, by e-mail and by phone (e.g. reviewing draft tools)
Site consultations
Proformas
Blackboard site www.ole.bris.ac.uk
Active engagement of a range of stakeholders
6. Key issues
Making projects manageable
Focussing on measuring outcomes
Piloting
Ethics approval
SU&C engagement
Engaging students
Engaging colleagues
7. Key issues contd
Time
Researcher/teacher role
Identifying and enlisting comparator sites
Significance of situated community of practice
See Burgess & Carpenter, 'Building capacity and capability for evaluating the outcomes of social work education (the OSWE project): creating a culture change', article submitted to Social Work Education, available on the OSWE Blackboard site
8. Primary intended focus of projects measuring learning outcomes (after Carpenter, 2005) Levels of Outcome
1. Learners' reactions
2. Modifications in attitudes and perceptions
Attitudes
Motivational
3. Acquisition of knowledge and skills
Procedural
Strategic knowledge
Initial skills
Compilation skills
4. Changes in behaviour
5. Benefits to users and carers
Focus
Attitudes to race and racism
Attitudes to partnership with users
Interprofessional practice
Understanding partnership with SU&C
Communication skills with children
Interviewing and communication skills
Research skills
Acquisition of NOS
Use of research skills
9. Some tools for measuring outcomes Knowledge tests
Self-efficacy/confidence scales
Communication and teamwork scale
Interprofessional learning scale
Interprofessional interaction scale
Interprofessional relationships scale
Vignettes
Concept mapping
Rating videos
10. University of Southampton A 3-year evaluative study to investigate students' knowledge, understanding and experience of race and racism and how it is addressed in the social science/social work curriculum.
A collaboration between the Divisions of Social Work and Sociology/Social Policy and PACS (People who Access Services - a service user group).
11. Southampton: research design Pilot - 2 questionnaires + 2 interview schedules
First full year – 2 questionnaires + 2 interview schedules to 1st yr SW & social science students taught in large lecture rooms (beginning/end BSc Yr 1)
Questionnaires: examining knowledge, attitudes, experiences & behaviour (T1 n=153, T2 n=71)
Interview schedule: providing rich qualitative follow-up data from volunteer participants (T1 n=2, T2 n=11).
12. Southampton: research design issues Participation - introducing project to a large group early in degree programme; do they understand full implications?
Opting in and out – may be difficult to abstain when majority participate, but large number allows anonymity, confidentiality and opting out.
Information sheets - nature and purpose difficult to absorb quickly.
Briefing – important to provide reassurance from the well briefed unit lecturer and follow-up information.
Sensitivity – area of study may generate emotions.
Interviewers – interviewing style needs to be cohesive, sensitive and confident.
13. Southampton: summary of findings Semester 1 and 2 of the first full year - results to compare:
Learning of individual students (Semester 1 and 2)
Subject – social work and other (social science)
Race – white and black students
Age – differences in perception
14. Soton: extract from Interview 2 Of the 11 volunteers 5 indicated that they had volunteered because they were interested in the subject. They also said…
This is an important subject for university
This is the first time I’ve volunteered for something since I’ve been here
I was nervous about the interview but I’m interested in research so I wanted the experience
15. Southampton: learning Service user representation – wider impact on People Who Access Services (PACS).
Guidance needed (e.g. the 'three terms to identify self' question was too specific to the SW discipline)
Confidentiality - despite large numbers still possible to identify single student, so will collapse categories.
Questionnaire design (repetitiveness, assumptions and statistical analysis).
16. Southampton: conclusions Change and growth measurable between Semester 1 and Semester 2
Differences detectable between social work and social science students so comparison useful.
No difference in self-rated knowledge between males and females but comparison useful over time.
The impact of the workplace and age.
17. Bournemouth UniversityOutcomes being evaluated Levels 2 (modification of attitudes and perception), 3 (acquisition of knowledge and skills) and potentially 4 (changes in behaviour) See Carpenter (2005).
Project focus: The increase in research skills confidence on completion of year 2 unit Using Research in Practice.
Data collection to be repeated with students undertaking this year 2 unit of study in 2005-6 (cohort A), 2006-7 (cohort B) and 2007-8 (cohort C).
18. Bournemouth: Research Tools Self-efficacy scales developed from the work of Gary Holden (Holden et al 1999, 2002, 2003) and informed by the work of Unrau and Grinnell (2005) and Parker (2005)
Pre and post self efficacy scales (start and end of 10 credit Using Research in Practice module; April-June 06, April-June 07, April-June 08).
Holden, G., Barker, K., Meenaghan, T. & Rosenberg, G. (1999) 'Research self-efficacy: a new possibility for educational outcome assessment', Journal of Social Work Education, Vol. 35.
Holden, G., Meenaghan, T., Anastas, J. & Metrey, G. (2002) 'Outcomes of social work education: the case for social work self-efficacy', Journal of Social Work Education, Vol. 38, No. 1.
Holden, G., Anastas, J. & Meenaghan, T. (2003) 'Determining attainment of the EPAS foundation programme objectives: evidence for the use of self-efficacy as an outcome', Journal of Social Work Education, Vol. 39, No. 3.
Parker, J. (2005) 'Developing perceptions of confidence during practice learning', British Journal of Social Work.
Unrau, Y.A. & Grinnell, R.M. (2005) 'The impact of social work research courses on research self-efficacy for social work students', Social Work Education, Vol. 24, No. 6.
19. Bournemouth: Self-efficacy scales A 15-question scale was developed
The first 10 questions were developed in discussion with Gary Holden; the additional 5 questions (about e-learning skills) were developed independently using the same pattern/format (see example provided)
The 10 question scale is also being used at the University of Hull.
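As an illustration of how pre/post self-efficacy data of this kind might be compared, the sketch below computes mean confidence before and after the unit. The item scores are hypothetical and the analysis is deliberately minimal; it is not the project's actual analysis.

```python
# Minimal sketch of a pre/post self-efficacy comparison.
# Item scores below are hypothetical; the real scale had 15 items
# (10 developed with Gary Holden, 5 on e-learning skills).

def mean(scores):
    """Arithmetic mean of a list of item scores."""
    return sum(scores) / len(scores)

# One student's hypothetical ratings on a 0-100 confidence scale,
# before (pre) and after (post) the Using Research in Practice unit.
pre  = [40, 35, 50, 45, 30, 55, 40, 35, 45, 50, 30, 25, 40, 35, 45]
post = [70, 65, 75, 60, 55, 80, 70, 65, 70, 75, 60, 55, 65, 60, 70]

change = mean(post) - mean(pre)
print(f"Mean pre: {mean(pre):.1f}")
print(f"Mean post: {mean(post):.1f}")
print(f"Change: {change:+.1f}")
```

In practice an analysis would aggregate such change scores across a cohort and test them statistically, rather than inspecting a single respondent.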
20. Bournemouth: reflection Data analysis – opportunity for capacity and skills building in methodology and software – 'expert' and 'novice' partnership. 2005-6 data entered; 2006-7 data awaiting entry. Detailed analysis yet to be done.
Ongoing refining of the research design
Reflecting on the multi-layered role of unit tutor delivering and assessing the unit and researching the unit.
Students who completed the scales in 2005-6 and who have now completed the three year BA SW to be asked to reveal their actual performance in the Using Research assignment to check if confidence was linked to performance
Educator v researcher
Research project embedded in the ‘normal’ student experience and provides opportunities to learn about research by being research participants and to provide feedback.
21. University of Sussex: Partnership & Interprofessional Practice Outcomes for evaluation 2006-7 (BA and MA): learners' reactions
Stage 1 pre module teaching
Stage 2 post module teaching
Stage 3 post first placement
Tools derived from University of West of England (Miers et al 2005) IPE programme; OSWE findings may provide comparison:
Communication and teamwork scale
Interprofessional learning scale
Interprofessional interaction scale
Interprofessional relationships scale
22. Sussex: Plans Response rates high;
Plan to use SPSS for data analysis
2007-8 Plans
To continue data collection Stage 3 post placement
To introduce comparator interprofessional programme – Leicester University.
23. Sussex: Reflection
Ethical issues need careful management when teachers and assessors also collect research data; e.g. delayed analysis due to need to separate data collection by academic staff from data analysis;
Managing ethical issues needs time in a crowded module schedule, particularly early in a programme;
Cost of Research Assistant.
24. University of Sussex Communication Skills with Children & YP To measure outcomes we needed first to identify what constitutes skilled communication – the SCIE Knowledge Review enabled a taxonomy of constituents to be developed (handout)
These would be taught through whole programme not just in focused skills teaching
Evaluation Question: To what extent, and in what ways, does the MA in Social Work Programme contribute to the development of underpinning knowledge, values, emotional and personal capabilities and performative micro-skills and techniques that students need for effective communication with children in social work practice?
25. Sussex (2) Evaluation Questionnaire Students’ personal characteristics: i.e. have particular kinds of student learned most/least?
Confidence in communication with children measured at different stages.
Student feedback on the aspects of the programme that facilitated their confidence and skills.
Objective measure of knowledge about effective communication: case vignette
26. Sussex (3) – Why vignette tool? Not divorced from context: draw on real life scenario (Hughes,1998). Responses are non-referenced and cannot be standardised across respondents (Poulou, 2001).
Vignettes can stimulate spontaneous responses - test students’ capacity to apply knowledge in situ.
Delineates and holds constant the contextual framework – increases standardisability and reliability, thus internal validity.
But needs some caution about interpretation and generalisability. Need to be detailed, plausible, realistic and vivid enough so they respond as if they were in that situation and be concrete, unambiguous, clear and simple (Finch, 1987).
27. Sussex (4) - Pilot of a post-hoc evaluation (cohort ending June 06) Students did become more confident in communicating with children by the end of the programme and showed increased knowledge of core conditions and skills
Both programme and non-programme related elements contributed
Highest importance to students: focused communication skills teaching plus practice learning (programme)
Having personal and pre-course professional experience with children also felt very important.
All pedagogical methods used felt to have some value – range of strategies needed but experiential methods particularly highlighted
No association shown between personal characteristics of respondents (e.g. age, race/ethnicity) and how much knowledge they were able to demonstrate in relation to the vignette, though levels of personal and professional experience were associated
28. Sussex (5) – Pilot Use of vignette tool Did enable respondents to demonstrate knowledge about core conditions and skills for effective communication with children
But does not ask students to rate the relative importance of the core conditions and skills and this made analysing responses more difficult.
Reliability compromised by insufficient time allocated to task - not all students completed it.
29. Sussex (6) Prospective evaluation Began Oct 07 – tool administered 3 times so far, then to be administered at end of programme
Dilemmas of method overkill and questionnaire fatigue – vignette modified each time
Mapping development of knowledge and skills of individuals over time – not all have completed all 3 (or will do all 4)
Trialling using 2 different questionnaires at T3 to avoid ‘same for both’ answers – dangers?
Resource issues re. analysing data
30. Anglia Ruskin University ARU: Measuring changes in students' learning about working in partnership with service users, using concept mapping
31. ARU: concept map Time 1
32. ARU: concept map T2
33. Oxford Brookes: The Study Methodology
Fixed Outcomes
Cohort Sample
Longitudinal -Baseline to Graduation
Repeated Measures
Multiple Methods
Triangulation of Measurements
Mapping Against the Learning Experiences
34. Consider your current state of competence (knowledge/skills) and indicate a score (from the guide below) for each of the learning outcomes listed.
0 = Cannot produce any evidence of competence.
1 = Understands the learning outcome, but can produce only limited or no evidence of appropriate attempts to put it into practice. Much more knowledge/practice needed.
2 = Understands, and can offer evidence of tentative attempts to integrate into current knowledge/skill base.
3 = Demonstrates competence with some regularity.
4 = Advanced understanding and demonstrating adequate level of integration of knowledge, skills, and appropriate application.
5 = Clearly understands and demonstrates consistent and appropriate application of knowledge and skills in practice.
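A minimal sketch of how self-ratings against this 0-5 guide might be aggregated across learning outcomes. The outcome names and scores below are hypothetical, for illustration only; they are not the actual National Occupational Standards items.

```python
# Hypothetical self-ratings against the 0-5 competence guide above.
# Learning outcome names are illustrative, not the actual NOS items.
ratings = {
    "Communicate effectively with service users": 3,
    "Apply research evidence to practice": 2,
    "Assess risk and need": 4,
    "Work in partnership with carers": 3,
}

# Average self-rated competence across all listed outcomes.
average = sum(ratings.values()) / len(ratings)

# Outcome with the lowest self-rating, i.e. the area a student
# might prioritise for further development.
lowest = min(ratings, key=ratings.get)

print(f"Average self-rated competence: {average:.2f}")
print(f"Area needing most development: {lowest}")
```

Repeating such ratings at baseline and graduation, as in the Oxford Brookes longitudinal design, would allow change on each outcome to be tracked for every student in the cohort.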