Survey Participation: A Study of Student Experiences and Response Tendencies. Allison M. Ohme, IR Analyst Heather Kelly Isaacs, Assistant Director Dale W. Trusheim, Associate Director Office of Institutional Research & Planning University of Delaware June 1, 2005 AIR 2005 ~ San Diego, CA.
Background • University of Delaware • Fall 2004 Enrollment • Undergraduate: 16,548 • Graduate: 3,395 • Professional & Continuing Studies: 1,295 • TOTAL: 21,238 • Doctoral/Research – Extensive
Background (cont.) • High-ability student body – student ability has been increasing over the past 5 years.
Our Past Surveys • IR typically surveys undergraduates each spring, alternating among the ACT Student Opinion Survey, NSSE, and a homegrown survey. • Examples of response rates: • Student Opinion – 1995: 30%; 1998: 26%; 2002: 21% • Career Plans – 1996: 46%; 1999: 43%; 2001: 37%
A Survey about Surveys??? • Declining response rates… • Incentives? • Paper vs. web survey? • Timing of administration? • Develop a systematic study to examine these issues and their relation to poor response rates.
Research Objectives • Use focus groups and telephone interviews to discover: • How many survey requests typically reach an undergraduate? • What factors make students likely (or unlikely) to respond to a survey? • Then use this information to improve student response rates on future surveys.
Methodology – Survey Questions (see Appendix A) • Thinking back to the previous full academic year (2002-2003), how many surveys from any sources were you asked to complete at the University? What was the source of the survey(s)? • How many surveys did you complete and return? • What were the reasons that helped you decide to complete and return the survey(s)?
Methodology – Survey Questions (cont.) • What were the reasons that made you decide not to complete and return a survey? • How do you feel when you receive an unsolicited survey? What kind of impact do they have on you? • What suggestions do you have for increasing student response rates at UD?
Methodology – Initial Research Design • Random sample of: • Full-time undergraduate students • Continuing from the previous academic year (2002-2003) • Contact students via telephone and ask the screening question: Have you received at least one unsolicited survey from the University in the past academic year? • If "yes", the student was invited to participate in one of five focus groups (filling ten students/group).
Methodology – Initial Research Design (cont.) • If unable to attend a focus group, the student was given the opportunity to answer the same research questions as part of our telephone survey group. • Once 50 students answered the telephone survey, this portion of the methodology was closed. • Incentive: two drawings for $100 gift certificates to use in downtown Newark.
Methodology – Adjusting the Research Design After only slight success in filling the focus groups: • Opened the study to students answering “no” to the screening question. • Drew additional sample of students who had been sent an Economic Impact Survey in Fall 2003.
Methodology – Need for an Additional Method • Low focus group attendance (even after confirmations with the participants) yielded 8 students over three groups. • Added third method: in-person interviews of students in the UD Student Center’s Food Court. • Students answered the same questions, and were given a $5 coupon redeemable in campus Food Courts.
Total Sample • Total sample across 3 methods (n=108): • Focus groups (n=8) • Telephone interviews (n=50) • In-person interviews (n=50) • See complete demographic breakdown in Appendix B.
Findings • In academic year 2002-03: • 26% of respondents received no unsolicited surveys. • 48% received 2 or more surveys. • Survey sources: academic departments, Honors Program, Dining Services, graduate students, etc.
Findings – (cont.) • How many surveys did students complete and return? • 66% of the 80 students who received surveys completed/returned all surveys. • 24% completed/returned some of the surveys. • 10% did not complete/return any of the surveys. ~ Remember these are the reported response rates of students who volunteered to participate in this study. It is no surprise that they are higher than typical survey response rates.
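The completion figures above are simple percentages of the 80 students who reported receiving at least one survey. As a quick illustrative sketch (not part of the study's methodology), the approximate headcounts behind those rounded percentages can be recovered like this:

```python
# Back-of-the-envelope check: convert the reported (rounded) percentages
# into approximate headcounts among the 80 survey recipients.
received = 80
breakdown = {"all": 0.66, "some": 0.24, "none": 0.10}

# Round each share to a whole number of students.
counts = {group: round(share * received) for group, share in breakdown.items()}

print(counts)                 # → {'all': 53, 'some': 19, 'none': 8}
print(sum(counts.values()))   # → 80 (the rounded counts happen to total exactly)
```

Because the slide percentages are themselves rounded, these headcounts are approximations; here they happen to sum back to the full 80 respondents.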
Findings – (cont.) • Reasons for completing and returning surveys: • Desire to help UD. • Survey related to students' interest(s), or results could affect their personal experience. • Students completed both email and paper surveys when they had "free time" and the survey required minimal effort. • When approached in person, students found it difficult to refuse, especially when offered an instant incentive.
Findings – (cont.) • Desirable incentives: • T-shirt • Free meal • Schoolbooks & supplies • Candy • Money • Coupon redeemable for any of the above • Any incentive students can accept immediately
Findings – (cont.) • Reasons for not completing and returning surveys: • Survey not of interest to the student. • Annoyed by receiving so many and/or repeated survey requests. • Survey seemed too complicated or required too much time/effort to complete. • Impact on students? • Most students understand that surveys are a normal part of any university or organization. • However, students become frustrated when they see no changes and receive no follow-up after completing past surveys.
Findings – (cont.) • Suggestions for increasing response rates: • Use the incentives mentioned above. • Tailor survey descriptions with explicit impact statements. • Offer follow-up to announce results and impact. • Keep surveys short, requiring little effort to understand and complete. • Best time to survey = mid-semester. ~ Survey method preference (email, paper, in-person) varies by student.
Challenges in Practice • Survey administration is decentralized across campus. • Using multiple methods (paper/web-based) for one study requires additional coordination. • Students already feel "over-surveyed". • Large volume of spam in students' UD inboxes.
Improving Response Rates • Entering Student Needs Assessment • 2001 = 21% • 2003 = 15% • 2004 ACT Survey = 69% • A 69% response rate – how did we do it?
Another Example… • Career Plans Survey • 2002= 48% • Random sample of 25% of baccalaureate recipients • 2003= 41% • Random sample of 50% of baccalaureate recipients • 2004= 50% • Sampled entire class of baccalaureate recipients
Thank you! Allison M. Ohme aohme@udel.edu Heather Kelly Isaacs hkelly@udel.edu Dale W. Trusheim trusheim@udel.edu www.udel.edu/IR