Developing Surveys for the Outcomes Assessment Process Kim Anderson Course Evaluation Subcommittee Chair Summer 2009
What is a Survey? NOUN: pl. sur·veys • A detailed inspection or investigation. • A general or comprehensive view. • A gathering of a sample of data or opinions considered to be representative of a whole. • The process of surveying. (From the American Heritage Dictionary) In assessment, a survey is an instrument that measures a characteristic or attitude ranging across a continuum of values, or that identifies a value or belief on a rating scale. Surveys typically use a sample of data to represent the whole being studied.
“Surveying for Surveying’s Sake” Is Problematic A survey needs a distinct and practical purpose. • If the purpose is not distinct, the survey will change frequently • If it is not practical, it will produce fatigue for all involved and waste resources • Survey fatigue • Bureaucratic fatigue • Assessment fatigue • Audience fatigue • Low quality • Indirect vs. direct assessment issues • “One-shot” assessments are less valuable than continuous assessments If any of these issues arise or persist: do not use a survey
Purpose: Preliminary Planning • You are confronted with a need for information (the questions the survey should answer) • Be as specific, clear-cut, and unambiguous as possible about the information needed (focus) • Determine the best possible way to ascertain the desired information • Write as few questions as possible to obtain that information • Trade-offs exist
Survey Development • Step 1: Decide to whom and how the survey will be administered. • Step 2: Determine the content and wording of each question. • Step 3: Determine the structure of response to each question. • Step 4: Establish the sequence of questions. • Step 5: Design the format and appearance.
Step 1: Decide to whom and how the survey will be administered. • Sample Size • General population/all • Sample = portion of a population of interest (chosen scientifically or at random so that it gives a reliable projection of the whole) • Collection of Data • In person, mail, e-mail, phone (paper surveys assume literacy and are time-consuming to manage) • Optical mark reader, e.g. a No. 2 pencil form (requires a machine to read answer sheets) • Web-based, e.g. Survey Gizmo (online surveys are convenient, but often assume respondents have access to a computer, are technologically literate, and feel comfortable responding in an electronic format) • Professional and financial resources available
Step 2: Determine the content and wording of each question. • Appropriateness based on purpose • Eliminate unnecessary questions • Is this a double-barreled question that should be split or eliminated? • Can the respondent answer this question? (e.g., the event was too long ago to recall, or the question is worded in a way that might sway the answer) • Will the respondent answer this question? (too personal) • Appropriate wording • Not too vague or confusing • Avoid double negatives • Avoid unfamiliar terminology (lingo) • Avoid loaded terms (sensitive/controversial questions)
Step 3: Determine the structure of response to each question. • Open-ended: no one definite answer; respondents answer in their own words; requires time and effort to answer; yields quotable material; difficult to analyze; factor in time and effort for data compilation • Closed-ended: a finite set of answers to choose from; easy to standardize; the data gathered lend themselves to analysis; more difficult to write (choices must be designed to include all possible answers)
Closed-Ended Types • Likert scale: How closely feelings match the statement on a rating scale • Multiple choice: pick the best answer(s) from the finite options • Ordinal: Rank ordered for all possible answers; rate in relationship to others • Categorical: Possible answers are in categories; respondent must fall into exactly one • Numerical: Answer must be a real number
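One advantage of closed-ended (e.g., Likert-scale) questions noted above is that responses standardize easily. As a minimal sketch of what that tabulation might look like, here is a short Python example using Python's standard library; the labels and response data are entirely hypothetical:

```python
from collections import Counter

# Hypothetical 5-point Likert responses to one survey question
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

counts = Counter(responses)          # frequency of each scale point
total = len(responses)
labels = {1: "Strongly disagree", 2: "Disagree", 3: "Neutral",
          4: "Agree", 5: "Strongly agree"}

# Print a simple frequency table with percentages
for point in range(1, 6):
    n = counts.get(point, 0)
    print(f"{labels[point]:<18} {n:>2}  ({n / total:.0%})")
```

Because closed-ended answers reduce to counts like these, the same few lines work for any question on the instrument, which is part of why closed-ended items are easier to analyze at scale than open-ended ones.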
Responses to Questions: General Suggestions • Scale Point Proliferation: too many points on a rating scale (more than 5) is confusing and invites hairsplitting • Order of Categories: better to list a progression from a lower level to a higher one • Category Proliferation: minor distinctions among categories are not useful; favor brevity • “Other”: with a few exceptions, avoid this option
Step 4: Establish the sequence of questions. • First Part = easier questions (gains cooperation) • Middle Series = most important topics • End of the survey = demographic and other classification questions • Conclude with a thank you.
Step 5: Design the format and appearance. • Attractive, clearly printed, and well laid out • Appealing and simple to complete • Quality engenders better response • Representing program and the college
No Survey is Perfect • Fallacy of Perfection • Ask for feedback in each step of the development process • Ask colleagues both in and out of the program or discipline for reactions and suggestions • Beta test • Many GREAT surveys have “crashed and burned” in prior revisions; just be patient • Administration • Cover Letter or script to provide consistency • Address protection of confidentiality
Surveys and Outcomes Assessment • Survey creation is only the beginning of the process • Consider the analysis requirements (statistical or otherwise) during survey development Types of statistical analysis: ✓ Descriptive statistics (means, medians, etc.) ✓ Correlation analysis ✓ Regression and logistic regression ✓ Graphs: bar charts, boxplots, etc. • Keep it simple: present basic, descriptive data regularly; more nuanced analysis is possible if there is a need to ✓ demonstrate differences (ANOVA, t-test) ✓ demonstrate correlation (e.g., Spearman correlation) ✓ explain causation (regression) • Resources needed and available • Responder anonymity and data confidentiality • Key findings = a presentation plan to improve service and student learning
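The descriptive statistics and rank correlation mentioned above can be sketched with Python's standard library alone. In this illustration the paired ratings are invented, and `spearman_rho` is a hand-rolled helper (Spearman's rho is just the Pearson correlation of the ranks), not a library function:

```python
import statistics

def rank(values):
    """Average 1-based ranks; tied values share the mean of their ranks."""
    sorted_vals = sorted(values)
    ranks = []
    for v in values:
        first = sorted_vals.index(v) + 1      # rank of first occurrence
        count = sorted_vals.count(v)          # number of ties
        ranks.append(first + (count - 1) / 2)
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = statistics.mean(rx), statistics.mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired 5-point ratings from the same respondents:
# overall course satisfaction vs. self-reported learning
satisfaction = [5, 3, 4, 2, 5, 4, 1, 3]
learning = [4, 3, 5, 2, 5, 3, 2, 3]

print("mean satisfaction:", statistics.mean(satisfaction))
print("median satisfaction:", statistics.median(satisfaction))
print("Spearman's rho: %.2f" % spearman_rho(satisfaction, learning))
```

In practice a statistics package would be used for anything beyond this, but the sketch shows why the slide advises planning the analysis during survey development: rank correlation only makes sense if the two items were asked of the same respondents on comparable scales.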
Surveys and SLOs • Survey assessments are considered indirect assessments; it is therefore best to compare findings with direct assessments of student learning. • Survey assessments can be very useful for observing ✓ what students believe they are learning, ✓ what alumni feel they have learned, ✓ how well employers feel graduates have been prepared. • Survey assessments yield very useful findings if a program is concerned about the quality of student preparation (e.g., employer, mentor, or work-experience surveys). • Closed-ended questions are derived from the content knowledge being assessed. • Open-ended questions lead to qualitative analysis that can be compared with closed-ended responses. • Open-ended survey responses can also be analyzed systematically to detect trends or concerns.
Final Thoughts • Well-crafted surveys are methods of describing opinions, or even of describing changes in perceptions and attitudes • Creating surveys and managing the survey process involves more work than is usually anticipated • No survey is perfect; it is often best to combine a survey-based assessment with a direct assessment of process or performance • That said, survey information can be useful • Questions? Thank you.