Use of the Canadian Graduate & Professional Student Satisfaction Survey: A Local Approach • Joan Norris, Keith Flysak & Michael Bittle • Faculty of Graduate & Postdoctoral Studies
CGPSS • Intent: to investigate sources and levels of satisfaction among enrolled graduate students • In both research-intensive and professional programs
Why measure satisfaction? • HEQCO perspective (Spence, 2009; Zhao, 2012): • Better understanding of graduate-level education processes; • Comparative analyses; • Provincial & national portraits of graduate education, with insights into funding, completion, institutional infrastructure & other areas of improvement; • Promoting relevant changes & appropriate adaptations to maintain a competitive international edge.
But keep in mind the limitations of the CGPSS: • Survey development was not systematic: • Many sources cited (an informal group of graduate deans from Rutgers, Duke and Stanford; adopted and revised by MIT, Western & the G13). • Despite its origins, not often used in the U.S. • Decision rules for category and question choice unclear (“anointed correct”) • Reliability and validity unknown (although factor analyses have been carried out). • Different versions of the survey administered, so a true cross-sectional analysis is difficult.
And be cautious: • Respondents’ answers to any measure of “satisfaction” may be influenced by: • affective state, current context, future expectations, past events and social comparisons • Findings will also be affected by: • sample size restrictions, bias, missing data, and rewards and incentives to participants
Our goals at Laurier: • Examine the stability of positive findings regarding faculty mentoring and teaching strength; • Seek evidence of improvement in areas identified by the first two administrations; • Identify opportunities & challenges in individual programs; • Provide information for cyclical reviews, the integrated budgeting and planning exercise, and strategic enrolment management; • Benchmark across similarly sized institutions.
Measures and Indices (HEQCO): • General Assessment • General Satisfaction • Benchmarks of Satisfaction
General Assessment: • How would you rate the quality of: • your academic experience at this university? • your student life experience at this university? • your graduate/professional program at this university? • your overall experience at this university?
General Satisfaction: • If starting over, select same university? • If starting over, same field of study? • Would you recommend this university to someone considering your program? • Would you recommend this university to someone in another field? • If starting over, select same faculty supervisor?
Benchmarks of Satisfaction: • (items selected from factor analyses by G13) • Quality of Teaching (3 items) • Opportunities to Present and Publish (5 items) • Research Training and Career Orientation (9 items) • Supportive Dissertation Advisor (12 items)
Our analyses have included: • Frequencies (provided by Mosaic) • Snapshots of each administration • Development of unique indices • Modelling • Cross-sectional analyses of indices • Program profiles and scorecards
Cross-sectional analyses: • Variables: • Composite general assessment index • Composite satisfaction index • Four benchmark indices: • Quality of Teaching • Opportunities to Present and Publish • Research Training and Career Orientation • Supportive Dissertation Advisor
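As a sketch of how a composite or benchmark index like these can be formed, the snippet below averages a respondent's answered items. The item names, response values, and the simple item-mean scoring rule are illustrative assumptions, not the actual CGPSS scoring specification:

```python
from statistics import mean

# Hypothetical item-level responses (1-5 scale) for the three
# Quality of Teaching items; names and values are illustrative only.
respondents = [
    {"teach_q1": 4, "teach_q2": 4, "teach_q3": 5},
    {"teach_q1": 5, "teach_q2": 4, "teach_q3": 4},
    {"teach_q1": 3, "teach_q2": None, "teach_q3": 3},  # one skipped item
]

def benchmark_index(resp, items):
    """Mean of the answered items; None marks a skipped question."""
    answered = [resp[i] for i in items if resp[i] is not None]
    return mean(answered) if answered else None

items = ["teach_q1", "teach_q2", "teach_q3"]
scores = [benchmark_index(r, items) for r in respondents]
print(scores)
```

Averaging only the answered items is one common way to handle item-level missingness; listwise deletion or imputation are alternatives with different trade-offs.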
Cross-sectional analyses (2007, 2010, 2013) with comparisons to mid-size and consortium groups (one-way ANOVAs with post-hoc comparisons) • Response rates: approx. 40% for research-intensive programs & 25% for professional programs at each administration • Conducted separately for master’s and doctoral students: • Could not separate master’s streams because of changes to the survey: • 2007: with/without thesis • 2010: regular/professional • 2013: research & coursework streams/professional
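The one-way ANOVA behind such across-administration comparisons can be sketched in a few lines of pure Python; the index scores below are invented for illustration and are not actual CGPSS results:

```python
from statistics import mean

# Hypothetical composite-index scores (1-5 scale) by survey year;
# the real analysis used the 2007, 2010 and 2013 administrations.
groups = {
    2007: [3.8, 4.0, 3.5, 4.2, 3.9],
    2010: [4.1, 4.3, 3.9, 4.4, 4.0],
    2013: [4.0, 4.2, 4.1, 4.5, 4.3],
}

all_scores = [x for g in groups.values() for x in g]
grand_mean = mean(all_scores)

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())
ss_within = sum((x - mean(g)) ** 2 for g in groups.values() for x in g)

df_between = len(groups) - 1                # k - 1
df_within = len(all_scores) - len(groups)   # N - k

# F ratio: between-group variance over within-group variance.
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {f_stat:.2f}")
```

A large F (relative to the F distribution with these degrees of freedom) indicates that mean satisfaction differs across years; post-hoc pairwise comparisons, as the slide notes, then locate which administrations differ.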
Snapshot results: • Areas of strength include the benefits of a small institution: high-quality faculty mentoring and teaching • Implications for expansion • Areas of need include extracurricular training opportunities • Development of a professionalization suite of workshops, seminars, courses (ASPIRE) • Co-curricular record
Cross-Sectional Results: • Satisfaction ratings consistent with same-sized universities in the consortium • Overall high quality maintained in the context of rapid expansion (doubling of programs)
Differences in “general” measures often difficult to detect • Persona and program profiles and scorecards may be more useful: • Contribute to Strategic Enrolment Management project • Developed persona groups: professional master’s, research-intensive master’s, doctoral • Individual program results provide insights into quality enhancement.
Supplemented satisfaction scores with: • Student demographics (e.g., age, citizenship/visa status, gender, Canadian geographic area (KW, rest of ON, QC, East, West, North)) • Non-enrolment survey results • Admissions conversion scores: efficiency rates
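An admissions conversion or efficiency rate can be illustrated with a small sketch. The program names, counts, and the definition used here (registrants as a share of offers made) are all assumptions for illustration, not actual Laurier figures or the official metric:

```python
# Hypothetical admissions-funnel counts; defining "efficiency" as
# registrants per offer is an assumption for illustration only.
programs = {
    "MA Psychology": {"offers": 40, "registered": 22},
    "MSc Biology": {"offers": 30, "registered": 18},
}

# Conversion rate: what fraction of admitted applicants actually enrolled.
rates = {name: p["registered"] / p["offers"] for name, p in programs.items()}

for name, rate in rates.items():
    print(f"{name}: {rate:.0%} of offers converted to registrations")
```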
Analyses of Variance in Persona Groups • Using Satisfaction Indices
Research master’s persona group: • Supervision satisfaction strengthening; publishing/presenting opportunities need more attention
Doctoral persona group: • Significant improvement in Student Life & Quality; • Supervision strong; presentation/publication opportunities need attention.
Final thoughts: • Pressure to assess student views of their graduate experience will remain; • Satisfaction surveys provide one useful, but limited, means of assessment; • CGPSS will continue to develop as an assessment method; • Benchmarking may be helpful, but within-institution scorecards more likely to lead to quality improvement.
Some things do improve over time! • They look pretty satisfied…