Off-the-Shelf or Homegrown? Selecting the Appropriate Type of Survey for Your Assessment Needs
Jennifer R. Keup, Director
National Resource Center for The First-Year Experience and Students in Transition
keupj@mailbox.sc.edu
Institutional data are meaningless without a comparison group.
The main outcome of interest on my campus is student development.
Goals for Today
• Introduce and discuss the Ory (1994) model for comparing and contrasting local vs. commercially developed instruments
• Identify elements of your institutional culture and structure that would influence the decision
• Discuss myths with respect to survey administration
• Share examples of:
  • the most prominent national surveys for first-year assessment
  • software and services available to facilitate institutional assessment
What Do We Mean?
Off-the-Shelf
• Often commercially developed
• Scope includes multiple institutions
• Primarily pre-set content
• Examples: CIRP, NSSE, EBI
Homegrown
• Developed locally
• Focused on the institution
• Content developed and adapted by the campus/unit
• Examples: program review; utilization/satisfaction surveys for specific programs
These two types sit on a continuum rather than forming a strict dichotomy.
Questions to Ask
• Who needs to see these data?
• What are my analytical capabilities?
• What is my budget?
• Who needs to make decisions with these data?
• How will this fit with my other responsibilities?
• What is my timeline?
Ory (1994) Model for comparing and contrasting local vs. commercially-developed instruments
Six-Factor Comparison
• Purpose
• Match
• Logistics*
• Institutional Acceptance
• Quality
• Respondent Motivation to Return the Instrument
Purpose
Why are we doing this study, and how will the results be used?
Off-the-Shelf
• Allows comparison to a national norm group
• Examples: comparison to a peer or aspirant group; benchmarking; contextualizing a broad higher education issue or mandate
Homegrown
• Allows thorough diagnostic coverage of local goals and interests
• Examples: satisfaction with a campus program; achievement of departmental goals; program review
Match
• What are the program/institutional goals, outcomes, and areas of interest?
• Does an existing instrument meet my needs?
• Does the survey address my purpose?
• Can the existing instrument be adapted to meet my needs?
Off-the-Shelf
• May provide incomplete coverage of local goals and content
• Many national instruments allow a set of locally written questions to be appended
Homegrown
• Tailored to local goals and content
Institutional Acceptance
• How will the results be received by the intended audience?
• Who needs to make decisions with these data?
• What is the assessment culture? (Consider campus politics and IRB requirements.)
Off-the-Shelf
• Professional quality and national use may enhance acceptance
• Failure to completely cover local goals and content may inhibit acceptance
Homegrown
• Local development can encourage local ownership and acceptance
• Concerns about quality may interfere with acceptance
Quality
• What is the instrument's track record of quality?
• What is the psychometric soundness of the instrument?
Off-the-Shelf
• Tend to have better-established psychometrics
• Professional quality may compensate for incomplete coverage of local goals and objectives
Homegrown
• Must fully test psychometric properties (e.g., reliability and validity)
• Must create a professional appearance
• Lack of professional quality may affect results and institutional acceptance
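Testing the psychometric properties of a homegrown instrument can start with a basic reliability check. As an illustrative sketch, with entirely hypothetical Likert-scale responses, internal consistency for a multi-item scale is commonly summarized with Cronbach's alpha:

```python
# Sketch: estimating internal-consistency reliability (Cronbach's alpha)
# for a homegrown survey scale. All scores below are hypothetical.

def cronbach_alpha(items):
    """items: list of per-item score lists, one entry per respondent.
    Returns Cronbach's alpha for the scale."""
    k = len(items)            # number of items in the scale
    n = len(items[0])         # number of respondents

    def variance(xs):         # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(variance(item) for item in items)
    totals = [sum(items[i][r] for i in range(k)) for r in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Hypothetical 4-item scale answered by 5 respondents (1-5 Likert)
scale = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [3, 5, 4, 4, 4],
    [4, 4, 3, 4, 5],
]
alpha = cronbach_alpha(scale)
```

A conventional rule of thumb treats alpha of roughly 0.7 or higher as acceptable for a scale, though the appropriate threshold depends on how the results will be used.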
Respondent Motivation to Return the Instrument
What will yield the highest response rate? (Consider incentives.)
Off-the-Shelf
• Can create instant credibility
• Sometimes provide institutional or individual incentives
Homegrown
• Local specificity may yield greater respondent "buy-in"
• Local instruments may not "impress" respondents
• Can create a student perception of immediate impact
Logistics (10 considerations)
• Availability
• Preparation time
• Expertise
• Cost
• Scoring
• Testing time
• Test and question types
• Ease of administration
• Availability of norms
• Reporting
The devil is in the details!
Logistics (continued)
Availability
• Off-the-shelf: Does a survey currently exist for our needs? If you can afford it, the survey is available.
• Homegrown: "If you build it, they (i.e., data) will come," but it takes time and resources to develop.
Preparation time
• Off-the-shelf: short.
• Homegrown: can take considerable time.
• Either way: What is the survey timeline? Is it feasible? Have you considered administration planning?
Logistics (continued)
Expertise
• Off-the-shelf: a fully developed protocol allows one to administer after reading the manual.
• Homegrown: takes content, measurement, and administrative expertise, especially psychometrics!
Scoring (related to expertise)
• Off-the-shelf: can be delayed if scoring occurs off campus; must adhere to the administration cycle.
• Homegrown: can be immediate.
Logistics (continued)
Testing time
• Off-the-shelf: fixed, based on content and administration protocol.
• Homegrown: flexible, as long as the survey meets institutional and programmatic needs.
• If administering in class, do you have faculty buy-in?
Test and question types
• Off-the-shelf: type of test and questions are predetermined.
• Homegrown: allows flexibility in type of test (objective or open-ended) and type of question (multiple choice, rank ordering, etc.).
Logistics (continued)
Ease of administration
• Off-the-shelf: requires standardized administration and special training for testers.
• Homegrown: allows greater flexibility.
• Remember IRB requirements in either case.
Norms
• Off-the-shelf: national and inter-institutional comparison.
• Homegrown: intra-institutional comparison.
Reporting
• Off-the-shelf: standard formats that don't always relate to the institution.
• Homegrown: institutional tailoring of results and reporting.
Logistics (continued)
Cost
Off-the-shelf
• Primary costs associated with the purchase price
• Other costs: scoring, data, specialized reporting, human resources to coordinate campus administration
• Recurring cost
Homegrown
• Primary costs associated with development: instrument development, ensuring psychometric properties, scoring and recording data, reporting findings
• Other costs: software/hardware, training
• Primarily a one-time investment
Summary of the six factors: Purpose, Match, Logistics, Acceptance, Response, Quality.
Off-the-Shelf vs. Homegrown Myths
• You can only gather comparison data from national (off-the-shelf) surveys
• It is cheaper to develop and administer a homegrown survey
• Off-the-shelf surveys don't require any work
• Homegrown surveys are hard
• You don't need IRB approval for local assessment
• Off-the-shelf surveys study all the important topics
FYE Assessment Examples
Off-the-Shelf
• CIRP: Freshman Survey; Your First College Year (YFCY) Survey
• NSSE
• Educational Benchmarking Incorporated (EBI)
Homegrown
• Services: Eduventures, Student Voice
• Software: Zoomerang, Survey Monkey
Continuum of Assessment
These tools span the continuum from off-the-shelf to homegrown: CIRP, NSSE, and EBI sit toward the off-the-shelf end; Eduventures and Student Voice fall in between; Zoomerang and Survey Monkey support fully homegrown instruments.