How do you solve a problem like Impact? Summary of survey findings Paul Redmond Head of the Careers & Employability Service, University of Liverpool
Overview • 46 completers (82% heads of service, with between 4 and 7 years' service). • Majority (85%) located in a student services-related department. • 11% in a teaching-related department • 3.8% in an external-relations department • Spread of universities (post-1992, Russell Group, etc.)
Reporting structure • 70% of heads report to the Head of Student Services • 20% report to a PVC • 6.8% report to academic deans or heads of Teaching & Learning
Defining your Service’s purpose • Providing services to students – nearly all cite ‘employability’ services to students as the primary goal, e.g. • “Improving the employability of students and graduates and providing a central point of contact to support CEIAG across the campus as well as delivering centrally …”
How do you or your HEI measure the responsiveness, profile and visibility of your service with the above groups? • Mixture of internal and external feedback measures • Student numbers / record of users • Referrals from academics • Profile with senior managers • DLHE – institutional comparators (ePI) • Student satisfaction / user surveys (various) • League tables • Involvement of CES in key projects /initiatives • Access to ‘prized’ networks • Employer evaluations • Hits on website • ‘Profile and visibility’ … • Matrix
How do you record and track usage of your Service? • Central recording systems • Online systems (e.g. Interfase) • Records of numbers attending events • Mixture of qualitative and quantitative research – from headcounts to 'mystery shoppers' • We don't! • Basic headcounts • Monthly MIS surveys / reports • Interview stats • Hits on website • Twitter traffic …
Other ways in which you measure your impact on employers • None (several) • Numbers of vacancies filled / quantity and quality of applications • Employer advisory board • Repeat business • Direct feedback from employers in relation to vacancies and internships
Do you or your institution measure the impact of your service on your INSTITUTION'S REPUTATION AND SUCCESS?
How your careers service impacts on your institution's reputation and success • Provides comparative data on graduate employment (shows where the HEI stands in league tables) • Success in generating funded projects • Income generation • League tables / DLHE – "Seen as our most important function by senior management, sadly." • KPIs in reports to management • Head produces a full-cost recovery study to gauge VfM • Contribution to student retention • "Return on Investment … not sure about how this might be specifically measured …" • "Nothing too explicit …" … "None."
What KPIs and / or targets does your Service have? • DLHE – top listed KPI • Employability PI • Numbers of finalists / graduates seen • Increased numbers using the Service • Student satisfaction surveys • Listed in 'Top 10' surveys • Personal recommendation • Number of placements achieved • Matrix • Mixture of externally imposed and internally verified
Final comments … • “I am still at a loss as to how to establish meaningful and real impact measures for most of the work we are engaged in, despite working at the problem for many years! Fortunately, I am rarely required to justify our impact.” • “Be careful of using DLHE as a KPI or measure of impact.” • “Our work does not necessarily have immediate effect – it may take many weeks, months or years for a client to really take on advice. How do we measure the impact of guidance?” • “Collective approach from AGCAS in addressing this is very much welcomed.”
The Impact Paradox: By overwhelmingly prioritising services to students, which produce intangible outcomes, careers services find it difficult to measure impact. Those services which can be measured are often not those viewed as key priorities by services. • Location, Location, Location: How crucial is institutional positioning? • Across the sector, wide variations exist (and varying levels of measurement sophistication). • But impact measurement is becoming a widely accepted issue of importance for most careers services. Nevertheless, methodologies remain limited.