1. 1 Measuring the Impact of Career Services: Current and Desired Practices
Vivian Lalande, Kris Magnusson, Bill Borgen, Lynne Bezanson, Bryan Hiebert
The Canadian Research Working Group for Evidence-Based Practice in Career Development (CRWG)
2. 2 Presentation Overview Background and rationale
Description of research
Overview of results
What we learned and implications for:
Future research
evaluation practices
CRWG future plans
3. 3 Background and Rationale Lack of knowledge regarding
Value of career development services
Impact of career development services
More and better research is required
Increasing demand for empirically supported interventions from:
Service providers
Policy makers
Employers
Funders
4. 4 Background and Rationale International symposia and pan-Canadian Symposium participants agreed
Need for more effective evaluation systems
Evidence reflecting the impact is needed to inform public policy pertaining to individual, family, organizations, society, etc.
Participants included career practitioners, policy makers, and employers
5. 5 Background and Rationale A challenge by Canadian Policy Makers:
"You haven't made the case for the impact and value of career development services."
A research team formed in 2004 to follow up on recommendations from Working Connections:
The Canadian Research Working Group for Evidence-Based Practice in Career Development
6. 6 Research Project Goals Better Understanding of Evaluation Practices
Importance of evaluation
Extent of evaluation practices
Types of outcomes identified and/or reported
Types of outcomes desired but not measured
7. 7 Data Sources 1. On-line survey (practitioners and agencies)
Importance of evaluation
How impact is determined
Achieved outcomes
How outcomes are measured
2. Focus Groups
NATCON 2005
2 groups (one English and one French)
Purpose was to provide feedback on themes identified in surveys
8. 8 Data Sources (Cont'd) 3. Telephone Interviews (policy makers and employers)
Desired outcomes of career development services
Desired evidence and data about the services
9. 9 Research Project: Definitions Definition of Outcome
The specific result or product of an intervention including changes in client competence, client situation and/or broader changes for the client and/or community
Definition of Intervention
Any intentional activity implemented in the hopes of fostering client change
10. 10 Research Participants Completed surveys from:
173 agencies (147 English and 26 French)
214 practitioners (168 English and 46 French)
Telephone Interviews
9 policy makers (out of 41 contacted)
7 employers (out of 23 contacted)
10–35 minutes each
2 interviews conducted in French
11. 11 Importance of Measuring Outcomes
12. 12 Type of Agency Matters (Agency Responses) χ² = 25.04; p = .02
13. 13 Type of Agency Matters (Practitioner Responses) χ² = 40.8; p < .01
14. 14 Practice of Measuring Outcomes 84% of agencies actually report the outcomes or impact of their services
Agency differences (χ² = 32.34; p < .01):
Schools (K-12) were less likely to report the impact of services
Not-for-profit agencies were more likely to report impact
Practitioner differences (χ² = 47.8; p < .01):
Practitioners in federal government agencies, K-12 schools, and private practice were less likely to report the impact of services
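The agency and practitioner differences reported on slides 12–14 are Pearson chi-square tests of independence on contingency tables (agency type × response). A minimal sketch of how such a statistic is computed, using hypothetical counts (the actual survey tables are not shown in the slides):

```python
# Pearson chi-square statistic for a contingency table: does rated
# importance of evaluation vary by agency type?
# All counts below are hypothetical illustrations, not the survey data.

def chi_square_statistic(observed):
    """Return (chi-square statistic, degrees of freedom) for a 2-D table."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand_total = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            # expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            chi2 += (obs - expected) ** 2 / expected
    dof = (len(observed) - 1) * (len(observed[0]) - 1)
    return chi2, dof

# rows: agency types; columns: importance rating (low, medium, high)
table = [
    [12, 20, 15],   # K-12 schools (hypothetical)
    [5, 18, 40],    # not-for-profit agencies (hypothetical)
    [10, 15, 12],   # government agencies (hypothetical)
]
chi2, dof = chi_square_statistic(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}")
```

The statistic is then compared against the chi-square distribution with the given degrees of freedom to obtain the p-values the slides report (e.g., p < .01).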
15. 15 Agencies and practitioners: What are the 3 most important outcomes that you report?
1. Change in employment or educational status of the client
and marginally:
2. Skill development; financial independence; connectedness; self-confidence
3. Number of clients served; client satisfaction; program completion; service delivery; cost-benefit
16. 16 Agencies: How are these outcomes measured? Frequency counts (e.g., number of clients served/month, number of clients who found employment, number of client action plans created, number of clients who completed programs)
Evaluation/follow-up reports
Client self-reports; stakeholder reports; surveys; telephone calls; interviews
and marginally:
Observation of client change; cost/benefit analyses
33% of practitioners did not respond
17. 17 What outcomes are you achieving that are going unreported or unmeasured? Client empowerment
Client skill development (e.g., personal self-management skills)
Client increased self-esteem
Client changes in attitudes (e.g., about their future, or about the nature of the workforce)
Client knowledge gains
Financial independence
Creation of support networks
More opportunities for clients
and marginally:
Community benefits;
Client satisfaction; Increased queries;
Political lobbying (agency)
18. 18 What evidence do you have that you are reaching these outcomes? Anecdotes
Verbal reports of success from clients
Verbal reports of success from employers
Observations
Observed changes in client skills
Observed changes in client attitudes
19. 19 What difficulties do you face when trying to collect evidence and/or measure the impact of your services? Complexity
Difficulty of determining service outcomes
Belief that some outcomes are unmeasurable
Lack of importance placed on evaluation
Lack of resources (finances and time) allotted
Lack of appreciation for importance of evidence-based practice
Lack of training in evaluation methods
Difficulty in obtaining client feedback
Loss of client contact after services provided; clients unwilling to provide feedback
20. 20 What difficulties (cont.) Lack of uniformity/agreement about outcomes across agencies and funders
Disconnect between service provision and evaluation (agencies)
Absence of evaluation protocols and/or formal processes for conducting efficacy assessments; Lack of experience, and limited access to models of best practice (practitioners)
21. 21 General observations
Agencies and practitioners agree:
Evaluation is important
Need guidelines for efficacy assessment
Profile of career services needs raising
Current evaluation priorities need revising
22. 22 Evaluation is difficult Complexity of determining and measuring outcomes
Evaluation is required by funders but there is a lack of resources and training provided
Difficult to follow-up with clients
Absence of standardized evaluation protocols and outcome definitions
23. 23 Focus Group Results Similar results for French and English language groups
Agreed that the survey results represented the state of practice of career services impact assessment in Canada.
24. 24 Focus Groups Emphasized The need to demonstrate the long term impact of services and organizational performance
The importance of unreported outcomes such as client health and well-being, economic impact
Barriers to conducting evaluations such as the lack of resources (funding, training, etc.)
25. 25 Phone Interviews: Policy Makers evaluate by: Financial and activity outcomes
Feedback from employers and teachers
Client portfolios
Observed client outcomes
Surveys of practitioners, agencies and clients
Qualitative data
Use of program by other agencies
26. 26 Policy Makers Want From the services they fund, policy makers want:
Client outcomes
External indicators of client outcomes
Added value, e.g., economic benefits
Longitudinal evidence.
Information on how services are provided
27. 27 Need to improve evaluation Policy Makers agree on a need to improve evaluation procedures
19 suggestions re: how to improve evaluations, including:
Better understanding of core concepts
Demonstrating cause and effect
Ability to compare outcomes between service providers
Improved outcome measurement procedures
Measurement of competencies of practitioners.
28. 28 Phone Interview: Employers want From career services, employers want:
Skilled, committed, motivated employees
Increased productivity
Reduced turnover
Improved employee mobility
Feedback about services
29. 29 Need to Improve Evaluation Employers made 26 suggestions as to how evaluations can provide the information they need, including:
Better identification of outcome indicators
Employees' commitment to the career program and their progress
Company exit evaluations
Quantitative and longitudinal data
30. 30 Sound Familiar?
Employers, agencies, practitioners, and policy makers made similar suggestions for improving the evaluation of career services
31. 31 What Did We Learn? Agencies, practitioners, policy makers and employers agree:
Impact assessment is important
Current evaluation practices are inadequate
Important outcomes are not measured and reported
All want more sophisticated evaluation procedures
The importance of evaluation is related to the type of organization providing services
32. 32 What Else Did We Learn? Employers want evaluation measures relevant to the workplace
Employee turnover
Improved internal employee mobility
Increased productivity
33. 33 Implications Demand for information and training regarding:
Effective evaluation procedures
Sophisticated evaluation procedures, e.g., differential, longitudinal, and cumulative impact of interventions.
Evaluation needs to define and demonstrate a variety of outcomes at a number of levels:
Individual
Organizational
Societal
34. 34 Implications (Cont'd) Increased system support:
Training support for practitioners and agencies
Financial support for the development of processes and the collection of efficacy data
Communication between stakeholders
35. 35 Implications (Cont'd)
Need for ongoing research in this area
Need clear definitions to allow for comparison of results across studies
Need improved processes and procedures
Career development services in Canada have perceived value, but
there is a need to demonstrate these outcomes.
36. 36 To Demonstrate Outcomes, We Need to Develop
Evaluation tools and methods
To capture outcomes
Culture of evaluation
Identification of outcomes is an integrated part of providing services
Measuring and reporting outcomes is integrated into practice
Outcome assessment is a prominent part of counsellor education
Reporting outcomes is a policy priority
This needs to be a priority in all sectors
37. 37 CRWG Future Plans Create an ongoing agenda of research and development in the area of career services outcome assessment
Secure a stable funding base
38. 38 Possibilities
Continue stakeholder dialogue
Invite client input
Develop comprehensive models for impact assessment
Develop clear, valid, and reliable tools
Create a means for disseminating impact assessment information nationally and internationally
39. 39 Draft Framework for Evaluation Intended for
Discussion
Feedback
Would this work for you?
How would this fit in your work place?
40. 40 General Approach to Evidence-Based Practice (Draft for Discussion): Input → Process → Outcome
41. 41 General Approach to Evidence-Based Practice (Draft for Discussion): Input → Process → Outcome
42. 42 General Approach to Evidence-Based Practice: Input → Process → Outcome
43. 43 General Approach to Evidence-Based Practice: Input → Process → Outcome
44. 44 General Approach to Evidence-Based Practice: Input → Process → Outcome
45. 45 General Approach to Evidence-Based Practice (Draft for Discussion): Input → Process → Outcome
46. 46 Measuring the Impact of Career Services: Current and Desired Practices CRWG
Vivian Lalande, Bryan Hiebert, Lynne Bezanson, Kris Magnusson, Robert Baudoin, Bill Borgen, Liette Goyer, Guylaine Michaud, Céline Renald, Michel Turcotte