Assessment Research and Tools: when, why, how?
Diane Ebert-May, Department of Plant Biology, Michigan State University
ebertmay@msu.edu | http://first2.org
The trouble with our times is that the future is not what it used to be. -Paul Valéry, The Art of Poetry
Please write your responses on a card. Q1. What is scientific teaching? Q2. What is assessment?
Q1. What is scientific teaching?
• Active: participation to learn (accomplish goals)
• Assessment: evidence
• Diversity: science for all...
Q2. What is assessment?
• Data collection with the purpose of answering questions about…
  • students' understanding
  • students' attitudes
  • students' skills
  • instructional design and implementation
  • curricular reform (at multiple grain sizes)
Q3. Why do assessment?
• Improve student learning and development.
• Provide students and faculty substantive feedback about student understanding.
• Take up the challenge of using disciplinary research strategies to assess learning.
So what are the issues?
Claim: Faculty need to change their teaching.
Why: Data indicate students are not learning science/math.
Therefore: If faculty implement effective scientific teaching, data will show learning gains by all.
Will faculty change?
True or False? Assessing student learning in science parallels what scientists do as researchers.
Parallel: ask questions
• Description: What is happening?
• Cause: Does 'x' (teaching strategy) affect 'y' (understanding)?
• Process or mechanism: Why or how does 'x' cause 'y'?
Parallel: collect data
• We collect data to find out what our students know.
• Data help us understand student thinking about concepts and content.
• We use data to guide decisions about courses, curricula, and innovative instruction.
Parallel: analyze data
• Quantitative data: statistical analysis
• Qualitative data:
  • break into manageable units and define coding categories
  • search for patterns, quantify
  • interpret and synthesize
• Valid and repeatable measures
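For the quantitative branch above, a minimal sketch of such an analysis in Python follows. It computes a normalized learning gain and a paired t-test on pre/post assessment scores; the file name, column names, and 100-point score scale are illustrative assumptions, not part of the original slides.

# Minimal sketch: quantitative analysis of pre/post assessment scores.
# The CSV file, column names, and 100-point scale are hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("assessment_scores.csv")  # columns: student_id, pre, post

# Normalized learning gain per student: (post - pre) / (100 - pre)
scores["gain"] = (scores["post"] - scores["pre"]) / (100 - scores["pre"])

# Paired t-test: do post-test scores differ from pre-test scores?
t_stat, p_value = stats.ttest_rel(scores["post"], scores["pre"])

print(f"Mean normalized gain: {scores['gain'].mean():.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")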
Parallel: peer review
• Ideas and results are peer reviewed, formally and/or informally.
Guidelines for thinking about research...
• What did students learn? (assessment data)
• Why did students respond a particular way? (research)
• Is the question significant? What are the working hypotheses? Relevant theory...
• What has already been done? The literature says...
• How and why were the methods selected? Direct investigation...
• How are the data analyzed and interpreted? What do the results mean? Coherent reasoning...
• Are the findings replicable and generalizable? Critique by peers...
System Model
IRD Team at MSU
• Janet Batzli - Plant Biology [U of Wisconsin]
• Doug Luckie - Physiology
• Scott Harrison - Microbiology (grad student)
• Tammy Long - Plant Biology
• Deb Linton - Plant Biology (postdoc)
• Rett Weber - Plant Biology
• Heejun Lim - Chemistry Education
• Duncan Sibley - Geology
• Rob Pennock - Philosophy
• Charles Ofria - Engineering
• Rich Lenski - Microbiology
*National Science Foundation