Effective methods for the use, creation, analysis, and interpretation of short-answer student conceptual evaluations.

Presentation Transcript


  1. Effective methods for the use, creation, analysis, and interpretation of short-answer student conceptual evaluations. Ronald Thornton Professor of Physics and Education Center for Science & Math Teaching Tufts University

  2. What was I thinking? • I’ll paint your house and walk your dog as well.

  3. In Defense of Thoughtful Multiple Choice Conceptual Assessment Ronald Thornton Professor of Physics and Education Center for Science & Math Teaching Tufts University

  4. Modest Suggestions from a Chemically Illiterate Physicist. Ronald Thornton Professor of Physics and Education Center for Science & Math Teaching Tufts University

  5. Center for Science and Math Teaching, Tufts University • Curriculum Development • Educational Research • Computer Tool Development • Teacher & Professor Education

  6. Funding • NSF: National Science Foundation • FIPSE: Fund for the Improvement of Postsecondary Education • US Department of Education

  7. Wouldn’t it be nice if teachers could understand what students know from a simple conceptual evaluation, and knew what to do to help the student learn?

  8. What use might this talk be? • If you intend to develop a chemistry concept inventory, these suggestions may help you make it more useful. • If you intend to use a chemistry concept inventory, these ideas should help you pick a useful one.

  9. We have spent years • Creating effective learning environments for introductory science (physics) courses (curricula, tools, pedagogical methods, group structures) • And developing methods of conceptual evaluation to measure student learning and guide our progress.

  10. Why Multiple Choice? • More easily administered to large numbers of students. • Evaluation takes less time. • Student responses can be reliably evaluated even by the inexperienced. • Can be designed to guide instruction. • With proper construction, student views can be evaluated from the pattern of answers, changes over time can be seen, and the frequency of student views can be measured. • Multiple choice combined with open response can help the teacher/researcher explicate the student’s response.

  11. Why not? • Every “good” educator knows multiple choice questions are no good. • Badly constructed multiple choice can give misleading results. • Unless very carefully constructed, multiple choice will not identify student thinking. • The choices may be inappropriate when used with different audiences.

  12. First steps • Why do you want to make (use) a conceptual evaluation? • In what conceptual area do you want to know how students think?

  13. Why? • There are prerequisite areas of conceptual knowledge that students need in order to actually understand chemistry.

  14. What? Three modest suggestions. • Explore student beliefs in the atomic nature of matter. (Students may say atoms exist, but few believe it in any functional manner.) • Explore student beliefs about the dynamic nature of equilibrium. (Most students seem to have a static model.) • Explore student beliefs about the difference between heat energy and temperature. (Most students do not clearly make this distinction.)

  15. Our research has shown: • Student conceptual responses can be context dependent. • Student domains of applicability can be different from those of a scientist. • Students (and scientists) can hold apparently inconsistent views simultaneously (and it doesn’t mean they are stupid). • Conceptual transitions are not instantaneous. • There is statistical evidence of a hierarchy of student conceptual views. • You can do more with large-scale conceptual evaluation than just generating a single number.

  16. Good Practice for the Construction of Conceptual Multiple Choice • All answers, "right or wrong," should help evaluate student views. • Derive the choices in the questions from student answers to free-response questions and from student interviews. • Check to see that students almost always find an answer they are satisfied with; random answers should be few. • Ask similar questions in different representations. • Check results with different student populations. (more)

  17. Good Practice (continued) • Look at correlations among questions and use patterns to understand student thinking (a sketch of this check follows below). • Understand the implications of “correct” and “incorrect” answers for students’ performance on other tasks. • Check for gender differences. • Identify circumstances that produce “false positive” answers. • If at all possible, construct the evaluation so it is useful to guide instruction.
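The correlation check in the first bullet above is straightforward to automate. Here is a minimal sketch in Python, assuming each student's answers have been coded 1 (correct) or 0 (incorrect), with one student per row and one question per column; the file name, the coding, and the 0.7 cutoff are illustrative assumptions, not part of any published analysis.

    # Minimal sketch: inter-question correlations from coded responses.
    # Assumes responses.csv (hypothetical) has one student per row, one
    # question per column, entries coded 1 (correct) / 0 (incorrect).
    import pandas as pd

    responses = pd.read_csv("responses.csv")
    corr = responses.corr()  # Pearson on 0/1 data is the phi coefficient

    # Question pairs that pattern together strongly are candidates for
    # probing the same underlying student model.
    questions = list(corr.columns)
    for i, q1 in enumerate(questions):
        for q2 in questions[i + 1:]:
            if corr.loc[q1, q2] > 0.7:  # illustrative cutoff
                print(f"{q1} and {q2}: phi = {corr.loc[q1, q2]:.2f}")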

  18. Multiple Choice Conceptual Evaluation • A conceptual evaluation for kinematics (description of motion) and dynamics (force and motion, which is well characterized by Newton’s Laws): the Force & Motion Conceptual Evaluation (FMCE). • A conceptual evaluation for heat energy and temperature: the Heat and Temperature Conceptual Evaluation (HTCE). Both were developed by the Center for Science and Math Teaching at Tufts.

  19. Using the FMCE as an example • Student answers correlate well (well above 90%) with written short answers in which students explain the reasons for their choices. • Almost all students pick choices that we can associate with a relatively small number of student models. (Conceptual Dynamics, R. K. Thornton, in the ICUPE proceedings, edited by Redish.) • Testing with smaller student samples shows that those who can pick the “correct” graph under these circumstances are almost equally successful at drawing the graph correctly without being presented with choices.

  20. FMCE as example • Because we can statistically identify most student views from the pattern of answers (and because there are very few random answers), we are also able to identify students with less common beliefs about motion and follow up with opportunities for interviews or open-ended responses to help us understand their thinking (a pattern-matching sketch follows below). • The use of an easily administered and robust multiple choice test has also allowed us and others to track changes in student views of dynamics and to separate the effects of various curricular changes on student learning.
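To make the pattern-of-answers idea concrete, the sketch below shows one way model identification can work: each candidate student view is written down as the answer pattern it predicts on a cluster of related questions, and a student is assigned the view whose predictions they match best. The question labels, answer letters, model names, and match threshold here are all invented for illustration; they are not the actual FMCE clusters or scoring scheme.

    # Hypothetical sketch of pattern-based model identification.
    # Each "signature" lists the answers a given student view predicts
    # on a cluster of related questions (labels and letters invented).
    MODEL_SIGNATURES = {
        "newtonian": {"q1": "A", "q2": "C", "q3": "B"},
        "motion implies force": {"q1": "B", "q2": "B", "q3": "D"},
    }

    def classify(answers: dict) -> str:
        """Return the model whose predicted answers best match the student's."""
        best_model, best_hits = "unclassified", 0
        for model, signature in MODEL_SIGNATURES.items():
            hits = sum(answers.get(q) == a for q, a in signature.items())
            if hits > best_hits:
                best_model, best_hits = model, hits
        # Students matching no signature well are the interesting cases
        # to follow up with interviews or open-ended responses.
        return best_model if best_hits >= 2 else "unclassified"

    print(classify({"q1": "B", "q2": "B", "q3": "A"}))  # -> motion implies force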

  21. FMCE as example • Use multiple representations. • The Force Graph questions require explicit knowledge of coordinate systems and graphs but require little reading. • The Force Sled questions use natural language and make no explicit reference to a coordinate system or graphs.

  22. Comparison with short answer • As with all the questions on the test, students who answered correctly were also able to describe in words why they picked the answers they did. • Statistically, one of the last questions to be answered in a Newtonian manner is the force on a cart rolling up a ramp as it reverses direction at the top (question 9).

  23. Back to best practices. Consider: • All answers, "right or wrong," should help evaluate student views. • Derive the choices in the questions from student answers to free-response questions and from student interviews. • Check to see that students almost always find an answer they are satisfied with; random answers should be few. • Look at correlations among questions and use patterns to understand student thinking.

  24. An example from the H&T Conceptual Evaluation • Distinguishes different student models for the relationship between heat and temperature.

  25. Results by category

  26. What about one-number results? • Not my favorite, but useful in some situations. • Let’s compare the performance of 350 RPI students in the beginning physics course on the FMCE and the FCI.

  27. Still one number • Let’s compare the performance of 350 RPI students in the beginning physics course on the FMCE and the FCI

  28. Correlation Coefficient: 0.791

  29. Correlation Coefficient: 0.8309

  30. Are the evaluations the same? • Yes? Very high correlations (about 0.8 pre and post with different instructional methods). • Yes? A high score on one implies a high score on the other. • No? FCI fractional scores are almost always higher than FMCE scores. • No? The evaluations are measuring different things. • No? A low score on the FMCE (a non-Newtonian student) does not imply a low score on the FCI. • Let’s look at a group of non-Newtonian students (a sketch of this comparison follows below).
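The two one-number comparisons above, and the asymmetry in the last bullet, can be checked with a few lines of Python. This sketch assumes paired fractional scores (0 to 1) for the same students on both tests; the file name, column names, and the 0.3 "non-Newtonian" cutoff are assumptions for illustration.

    # Sketch: compare paired FMCE and FCI fractional scores.
    # Assumes fmce_fci.csv (hypothetical) has columns "fmce" and "fci" in [0, 1].
    import pandas as pd

    scores = pd.read_csv("fmce_fci.csv")
    r = scores["fmce"].corr(scores["fci"])  # Pearson r (about 0.8 in the RPI data)
    print(f"correlation: {r:.3f}")

    # The asymmetry: a low FMCE score need not mean a low FCI score.
    low_fmce = scores[scores["fmce"] < 0.3]  # illustrative "non-Newtonian" cut
    print("FCI scores of low-FMCE students:")
    print(low_fmce["fci"].describe())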

  31. The conceptual threshold effect (looking at pre-post correlations)

  32. Pre/Post Evaluation: The Threshold Effect. [Scatter plot of FMCE scores after instruction vs. before instruction for Tufts University calculus-based physics (N=181), both axes running 0.00 to 1.00, with separate series for Spring 1994 (N=48), Spring 1995 (N=37), Spring 1997 (N=43), and Spring 1998 (N=53).]
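A plot like the one on this slide can be reproduced with matplotlib. The sketch below assumes a per-student table of pre- and post-instruction FMCE fractions tagged by term; the file and column names are assumptions.

    # Sketch: post- vs. pre-instruction scatter plot (threshold effect).
    # Assumes fmce_pre_post.csv (hypothetical) has columns "pre" and "post"
    # in [0, 1] and a "term" label such as "Spring 1994".
    import pandas as pd
    import matplotlib.pyplot as plt

    data = pd.read_csv("fmce_pre_post.csv")
    fig, ax = plt.subplots()
    for term, group in data.groupby("term"):
        ax.scatter(group["pre"], group["post"], label=term, alpha=0.7)
    ax.plot([0, 1], [0, 1], linestyle="--", color="gray")  # no-gain line
    ax.set_xlabel("Before Instruction (fraction correct)")
    ax.set_ylabel("After Instruction (fraction correct)")
    ax.set_xlim(0, 1)
    ax.set_ylim(0, 1)
    ax.legend()
    plt.show()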

  33. Average College and University Results Before Instruction. [Bar chart: percentage of students (0 to 100%) understanding the velocity, acceleration, and force concepts in university physics courses before instruction.]

  34. Average College and University Results After Traditional Instruction. [Bar chart: percentage of students (0 to 100%) understanding the velocity, acceleration, and force concepts in university physics courses, before instruction and after traditional instruction.]

  35. Physics & Science Courses Using New Methods We have evidence of substantial, persistent learning of such physical concepts by a large number of students in varied contexts in courses and laboratories that use methods I am about to describe. Such methods also work for students who have traditionally had less success in physics and science courses: women and girls, minority students, and those who are badly prepared.

  36. Average College and University Results After New Methods. [Bar chart: percentage of students (0 to 100%) understanding the velocity, acceleration, and force concepts, comparing before instruction, after traditional instruction, and after new methods.]

  37. Our Instructional and Assessment Philosophy “I still don’t have all of the answers, but I’m beginning to ask the right questions.”
