Assessment Strategies for Vision 2020 grants Gwynn Mettetal
Goals for this workshop: • Discuss different ways to assess outcomes • Help you decide which methods would be best for your project
Your projects? • Who are you? • What sort of Vision 2020 project are you planning? (the two-sentence version)
Definitions: • Assessment—evidence that your project is making a difference • You MUST assess the effectiveness of your Vision 2020 grant to get continued funding! • Lots of strategies possible • Depends on your goals • Depends on your situation
Two major types of data • Quantitative (numbers) • Grades, attendance, ratings on a scale, retention rate • Qualitative (words) • Interviews, essays, open-ended survey questions • Both are fine, just different
Some sources of data • Existing data (easiest, already there) • Student records • Archival data • Student work in course • Conventional sources (easy, but must generate) • Behavioral data—journals, library usage • Perceptual data—surveys, focus groups, interviews • Inventive sources (difficult) • Products or performances
Ethics • Must treat students respectfully • Must protect privacy • Must “do no harm” • Collecting new data (not coursework) from your own students? • Have someone else collect and hold until grades are in • Can’t force them to participate • Can’t take up too much instruction time • Institutional Review Board (IRB) • If planning to publish
Tips • Add power--compare groups! • Before and after • Different course units • This semester and last • Two sections with different methods • Your class to that of another instructor • Be realistic--start small
Definitions: • Validity—does your evidence (data) mean what you think it means? • Example: • Test scores = deep learning? • What if just rote memory? • What if students cheated?
Definitions: • Reliability—would you get the same evidence if you collected it again? Or was this just a fluke? • Example: • Test scores = deep learning? • What if you gave again next week and scores were very different?
Dilemma • In general, it's hard to have both validity and reliability. • Real life is messy (valid, not as reliable) • Experiments are controlled (reliable, not as valid) • Solution is . . .
Triangulate! • Get several different types of data • Different sources: • Instructors, students, advisors, records • Different methods: • Surveys, observations, student work samples • Different times: • Start and end of semester, two different classes, two different semesters
See if data all point to the same conclusion • Course evaluations • Final project rubric • Comparison to last semester's class
Brainstorming What data could YOU collect?
Analyze data—What did you find? • Qualitative analyses: look for themes in words and behaviors • Example: Theme 1: Students understood more abstract concepts after group discussion. (Follow with quotes from student exams and other evidence.)
What did you find? • Quantitative analyses: simple graphs, tables • Simple statistics: means, correlations, t-tests
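If it helps to see that analysis step concretely, here is a minimal sketch in Python of the simple statistics named above: group means, a correlation, and a t-test. All scores and attendance counts below are hypothetical examples, not data from any real project.

```python
# Minimal sketch of the simple statistics mentioned above (means,
# correlation, t-test). All data below are hypothetical examples.
from scipy import stats

# Hypothetical final-exam scores for two course sections
section_a = [72, 85, 78, 90, 66, 81, 77]   # e.g., taught with the new strategy
section_b = [70, 74, 69, 82, 65, 73, 71]   # e.g., taught the old way

# Group means
mean_a = sum(section_a) / len(section_a)
mean_b = sum(section_b) / len(section_b)
print(f"Section A mean: {mean_a:.1f}, Section B mean: {mean_b:.1f}")

# Correlation between attendance and scores (hypothetical attendance counts)
attendance = [14, 15, 13, 15, 10, 14, 12]
r, r_p = stats.pearsonr(attendance, section_a)
print(f"Attendance-score correlation: r = {r:.2f} (p = {r_p:.3f})")

# Independent-samples t-test comparing the two sections
t, p = stats.ttest_ind(section_a, section_b)
print(f"t-test: t = {t:.2f}, p = {p:.3f}")
```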
What did you find? Focus on practical significance more than statistical significance
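As a rough illustration of practical versus statistical significance, one common gauge is an effect size such as Cohen's d. The sketch below reuses the same kind of hypothetical scores as above; the size thresholds in the comment are conventional rules of thumb, not anything specific to Vision 2020.

```python
# Rough sketch: Cohen's d as one gauge of practical significance.
# A tiny p-value can accompany a difference too small to matter in
# the classroom; an effect size keeps the focus on magnitude.
from statistics import mean, stdev

group_1 = [72, 85, 78, 90, 66, 81, 77]   # hypothetical scores, new strategy
group_2 = [70, 74, 69, 82, 65, 73, 71]   # hypothetical scores, old strategy

# Pooled standard deviation across the two groups
n1, n2 = len(group_1), len(group_2)
pooled_var = ((n1 - 1) * stdev(group_1) ** 2
              + (n2 - 1) * stdev(group_2) ** 2) / (n1 + n2 - 2)
d = (mean(group_1) - mean(group_2)) / pooled_var ** 0.5

# Conventional rules of thumb: ~0.2 small, ~0.5 medium, ~0.8 large
print(f"Cohen's d = {d:.2f}")
```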
Brainstorming: What would convince YOU?
Take action based on findings • If evidence was good, keep your old strategy • If evidence was weak, tinker to improve your strategy • Plan to assess again, after working with a new group of students • You will need to show how you used your data to get continued Vision 2020 funding!