This presentation discusses the value of incorporating evaluation into career services and highlights a "subversive" approach to changing funders' attitudes and providers' practices. It provides a comprehensive framework for evaluating career interventions and emphasizes the need for evidence-based, outcome-focused practice.
Evaluation as a Subversive Activity (Demonstrating the Value of Career Services)
Kris Magnusson, Dean of Education, Simon Fraser University
Bryan Hiebert, Educational Psychology & Leadership Studies, University of Victoria
Overview • Background information • Evaluation approach • Implementation planning • Some sample tools
A Challenge by Canadian Policy Makers: "You haven't made the case for the impact and value of career development services!" A research team formed in 2004 to follow up: • The Canadian Research Working Group for Evidence-Based Practice in Career Development • 10 researchers from 7 universities & 1 foundation
Why “Subversive”? • The goals of funders of service do not always align with the goals of service providers • Funders may not recognize legitimate and useful results of service (e.g., progress towards a goal) • Practitioners often believe that they are too busy, and that “evaluation” is too complex • Evaluation is rarely planned at the DESIGN stage, where it can be most effective; it is usually “bolted on” to a program
How to address the problem We need an approach that is: • Comprehensive enough to include what is needed; • Simple enough for people to use; and • Integrated into standard practice
What Gets “Subverted”? • Changed attitudes from funders • Changed practice from service providers • More evidence for the field to justify the power and efficacy of career interventions
General Approach to Evaluation Showing worth, documenting impact, relating success stories ….
Evidence-Based Outcome-Focused Practice: Input → Process → Outcome. Need to link process with outcome.
Evidence-Based Outcome-Focused Practice: Input → Process → Outcome
Outcome: indicators of client change
• 1. Learning outcomes: knowledge and skills linked to the intervention
• 2. Personal attribute outcomes: changes in attitudes; intrapersonal variables (self-esteem, motivation, independence)
• 3. Impact outcomes: impact of #1 & #2 on the client's life (e.g., employment status, enrolled in training); societal, economic, and relational impact
Outcomes of Counselling
• Client learning outcomes: knowledge, skills
• Personal attributes (+ precursors)
• Impact on client's life: client presenting problem, economic factors, third-party factors
Precursors Intervene between learning outcomes & impact outcomes • Attitude • Motivation • Self-esteem • Stress • Internal locus of control • Belief that change is possible
What outcomes are you achieving that are going unreported or unmeasured? • Client empowerment • Client skill development • personal self-management skills • Client increased self-esteem • Client changes in attitudes • about their future • about the nature of the workforce • Client knowledge gains • Financial independence • Creation of support networks • More opportunities for clients These are legitimate areas for intervention
Evidence-Based Outcome-Focused Practice: Input → Process → Outcome
Process: activities that link to outputs or deliverables
• 1. Interventions
 • Generic interventions: working alliance, microskills, etc.
 • Specific interventions: interventions used by service providers, skills used by service providers, home practice completed by students
• 2. Programs offered by schools or agencies
• 3. Involvement by 3rd parties
• 4. Quality of service indicators: stakeholder satisfaction, including students
Evidence-Based Outcome-Focused Practice: Input → Process → Outcome
Input: resources available
• 1. Staff: number of staff, level of training, type of training
• 2. Funding: budget
• 3. Service guidelines: agency mandate
• 4. Facilities
• 5. Infrastructure
• 6. Community resources
Inputs • Number of staff • Staff level of training • % of staff time on various tasks • Staff : client ratio • Client case load • Program dollars • Client volumes • Types of client problems • Adherence to mandate • Cost-effectiveness Be realistic about what you can deliver, given the level of resources you can commit
Outcome-Focused Evidence-Based Practice: Quality Improvement
Input (resources) → Process (activities of counsellor & client) → Outcome (client change: knowledge, skill, attribute, impact)
Outcome-Focused Evidence-Based Practice: Dynamic and Interactive
Resources → Counselling (process) → Outcome, linking process to outcome
Intervention Planning & Intervention Evaluation: Intervention Planning Framework
Context (client needs) → Client Goals → Counsellor Strategy & Client Strategy → Client Outcomes (knowledge, skills, attributes, impact)
Outcome-Focused Evidence-Based Practice
Client context: needs, goals

INPUTS: resources available
• Context: structure of opportunity
• Staff: number of staff, level of training, type of training
• Funding: budget
• Service guidelines
• Facilities
• Infrastructure
• Community resources

PROCESSES: activities that link to outcomes or deliverables
• Generic interventions: working alliance, microskills, etc.
• Specific interventions: strategies linked to specific client problems (stress, grief, depression, career, etc.)
• Client home practice
• Other: programs & workshops, facilitation guides, intervention manuals, external referral

OUTCOMES: indicators of client (learner) change
• Learning outcomes: changes in knowledge and skills linked to the program or intervention used (progress indicators, end-result indicators)
• Personal attribute outcomes: changes in intrapersonal variables, e.g., attitudes, self-esteem, motivation (progress indicators, end-result indicators)
• Impact outcomes: changes in the client's life resulting from application of learning
Quality Service Delivery (need to negotiate these with funders)
• Accessibility: regular hours, extended hours, physical accessibility, resources in alternate formats, ease of access, who can access
• Timeliness: % calls answered by 3rd ring, wait time for appointment, wait time in waiting room
• System requirements: adherence to mandate, completion of paperwork, service standards, staff credentials, competencies, resources
• Service delivery: client volumes, client presenting problems, number of sessions
• Responsiveness: respect from staff, courteous service, clear communication
• Overall satisfaction: % rating service good or excellent, % referrals from other clients
Negotiated outcomes • If you are lucky, funders might identify personal attributes [client motivation, improved job satisfaction, increased self-confidence] or knowledge, or skills, as accountability indicators • BUT more likely funders will identify impact outcomes [employment status, enrolment in training, reduced # of sick days, increased productivity, etc.] or inputs [client flow, accessibility, timeliness of paper work, etc.] • So service providers need to identify the knowledge, skills, personal attributes that will produce the impacts and negotiate these as accountability indicators Be careful what you promise to deliver BUT deliver what you promise
Outcome-Focused Evidence-Based Practice: Input → Process → Outcome (need to link process with outcome)
The professional practitioner asks:
• What will I do?
• What are the expected client changes?
• What do I expect clients to learn?
• What sorts of personal attributes do I want my clients to acquire?
• What will be the impact on their lives?
• How will I tell?
Model Summary: Integrating Evaluation into Practice • Evaluation is part of service delivery • Not bolted onto the side of service • Service = Intervention + Outcome
Integrating Evaluation into Practice • Understand Service Foundations &Client Context • Describe Desired Outcomes • Describe Core Activities • Select Measures and Scales • Collect Evidence • Work with the Data • Report Results & Market your Services
Understanding Service Foundations & Client Context What factors outside of the intervention might have an impact on the results? • Nature of services • Context of service
Describing Desired Outcomes • What do we want to achieve? • Type of client change being sought? Be careful to avoid attempting to do too much in too short a time frame
Describe Core Activities • What do we (service provider and client) need to do to achieve the outcomes? • Link services to outcomes
Select Measures and Scales • What will be the indicators of success? • Process factors • How well did the service provider deliver the service as intended? • How well did the participants follow the program as intended? • Outcome factors • Indicators of progress + ultimate indicators of success • Knowledge • Skills • Personal attributes • Impacts
Collect Evidence • How do we collect evidence most efficiently? • Process assessments should not take more than 5 minutes to complete • Outcome assessments should not take more than 10 - 15 minutes to complete
Work with the Data • How do I make sense of the data I have collected? • Frequency counts and percentages • Mean scores • Measures of association In most cases, sophisticated statistics are not necessary
Report Results & Market your Services • How do we use the data to convince others? • Work from the macro to the micro • Demonstrate movement in your results • Ensure that decision-makers actually see the results of your work Tell the people who need to know in language that they can understand
Points to Ponder • Consider these Assessment Alternatives • For group (workshop) interventions • For individual interventions • Not saying these are better (or worse) than what you are currently doing • Only encouraging you to consider other alternatives
Assessment as Decision Making (vs. Judgement)
Please use a two-step process:
1. Decide whether your level of mastery of the attribute under consideration is unacceptable (0-1) or acceptable (2-4)
2. Then assign the appropriate rating:
• 0 = really quite poor
• 1 = just about OK, but not quite
• 2 = OK, but just barely
• 3 = in between barely OK and really good
• 4 = really very good
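A minimal sketch of how the two-step rating above could be enforced in a data-entry tool; the function name and validation logic are my own illustration, not part of the original instrument:

```python
def two_step_rating(acceptable: bool, degree: int) -> int:
    """Validate a two-step judgement against the 0-4 scale in the slides.

    Step 1: the rater decides whether mastery is acceptable or unacceptable.
    Step 2: the rater picks a degree on that side of the scale:
        unacceptable -> 0 (really quite poor) or 1 (just about OK, but not quite)
        acceptable   -> 2 (barely OK), 3 (in between), 4 (really very good)
    """
    allowed = (2, 3, 4) if acceptable else (0, 1)
    if degree not in allowed:
        raise ValueError(f"rating must be one of {allowed} for this step-1 decision")
    return degree

print(two_step_rating(acceptable=True, degree=3))
```

Forcing the accept/reject decision first keeps raters from drifting toward the middle of the scale, which is the point of the two-step process.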
Points To Ponder Trustworthiness … • Inter-rater reliability • Would 2 different people give the same ratings? • Intra-rater reliability • Would the same person give the same rating on 2 different occasions? Offering more choice alternatives is not necessarily better
Problem with skill self-assessment • Participants asked to rate their skill (or knowledge) before and after a program • Often, pre-workshop scores are high and post-workshop scores are lower • People find out as a result of the workshop that they knew less than they thought or had less skill than they thought • Based on the new awareness, post-scores are lower • People don’t know what they don’t know • How can we get around this problem?
Assessing Learning & Attribute Outcomes: Post-Pre Assessment
• We would like you to compare yourself now and before the workshop. Knowing what you know now, how would you rate yourself before the workshop, and how would you rate yourself now?
• Please use a two-step process:
1. Decide whether the characteristic in question is acceptable or unacceptable, then
2. Assign the appropriate rating on the unacceptable (0-1) / acceptable (2-4) scale
Orientation Workshop: Summative Evaluation Results • 156 ratings (6 questions × 26 people): • 84 (54%) ratings were unacceptable before the workshop • 0 ratings were unacceptable after the workshop • 6 (4%) ratings were excellent before the workshop • 108 (69%) ratings were excellent after the workshop • Mean scores before and after the workshop • Before, all were unacceptable (<2) • After, all were more than minimally acceptable (>3)
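A tally of this kind takes only a few lines of code. The (before, after) rating pairs below are hypothetical, invented for illustration, not the orientation workshop's actual data:

```python
# Hypothetical (before, after) post-pre rating pairs on the 0-4 scale;
# invented numbers, not the orientation workshop's real data.
ratings = [(0, 3), (1, 4), (1, 3), (2, 4), (1, 4), (0, 3), (1, 2), (3, 4)]

n = len(ratings)
unacceptable_before = sum(1 for before, _ in ratings if before < 2)  # ratings 0-1
unacceptable_after = sum(1 for _, after in ratings if after < 2)
excellent_after = sum(1 for _, after in ratings if after == 4)

print(f"{unacceptable_before}/{n} ({100 * unacceptable_before / n:.0f}%) unacceptable before")
print(f"{unacceptable_after}/{n} unacceptable after")
print(f"{excellent_after}/{n} ({100 * excellent_after / n:.0f}%) rated 4 after")
```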
Orientation Workshop: Formative Evaluation
[Chart: component ratings plotted on the unacceptable/acceptable (0-4) scale]
Process Results & Formative Feedback Big Picture, Core Motivators, and De-Motivators were the most engaging and most useful components. Possible Career Options and Reality Check were the least engaging and least useful, perhaps because participants had done similar exercises before.
Post-Pre Assessment: Bridging Health Care with Self-Care
• Knowledge of stress and stress control
• Knowledge of how to manage personal change
• Knowledge of nutrition and nutrition control

• Level of stress
• Level of nutrition (high = healthy)
• Level of fitness
• Confidence in ability to manage personal change
Presenting the results
Post-Pre Results: Bridging Health Care with Self-Care (Stress & Stress Control)
[Chart: % of participants at each rating, start of program vs. end of program]
Informal Evaluation of Headache, Pain, and Related Affective States
Self-Monitoring Headache
• 0 = No headache
• 1 = Low level, only enters awareness when you think about it
• 2 = Aware of headache most of the time, but it can be ignored at times
• 3 = Painful headache, but still able to continue job
• 4 = Severe headache, difficult to concentrate on demanding tasks
• 5 = Intense, incapacitating headache
Headache Monitoring Grid
[Chart: headache level (0-5) recorded for each hour of the day (06:00 through 05:00), before treatment vs. after treatment]
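A minimal sketch of summarizing self-monitoring data like this grid; the hourly logs are hypothetical, and the two summary indicators (mean level, hours at level 3 or higher) are my own choice of illustration:

```python
from statistics import mean

# Hypothetical hourly self-monitoring logs on the 0-5 headache scale,
# keyed by hour of day (24-hour clock): one day before, one day after treatment
before = {9: 3, 11: 4, 13: 4, 15: 3, 17: 2, 21: 1}
after = {9: 1, 11: 2, 13: 1, 15: 1, 17: 0, 21: 0}

def summarize(log):
    """Return (mean headache level, number of logged hours at level 3 or higher)."""
    levels = list(log.values())
    return mean(levels), sum(1 for level in levels if level >= 3)

for label, log in [("before treatment", before), ("after treatment", after)]:
    avg, severe_hours = summarize(log)
    print(f"{label}: mean level {avg:.1f}, {severe_hours} logged hour(s) at level 3+")
```

Even this informal evidence fits the deck's post-pre logic: the same simple indicators, computed before and after treatment, document client change.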