This study focuses on rewriting the midterm examination with higher-level questions to improve Science instruction. The methods involved drawing on PDE and NAEP resources, emphasizing Chemistry content and science process skills, and analyzing student responses. Results show positive feedback and areas needing greater curriculum emphasis, such as experimental design and interpreting graphs. Conclusions highlight student misunderstandings and the need for improved science process skills.
Catalytic Activity: Rewriting the Midterm Examination for Physical Science and Using Item Analysis to Detect Student Misconceptions and Improve Instruction
Teaching is a complex profession. It requires constant learning and continual reflection. (Lin, H., J. Chem. Educ. 2005, 82 (10), 1569.) • GOALS • To rewrite the midterm examination to include higher-level questions that more closely resemble the expected PSSA Science assessment • To distribute the test to other grade-level Science teachers for their use • To analyze student responses to questions in order to improve future instruction
METHODS • Resource material for constructing test questions for my midterm came from two main sources. • Some questions were used verbatim; more often, the style and type of question were emulated. • PDE Website – PSSA sample questions: 2006-2007Gr11ScienceItemSampler.pdf • NAEP Website – National Assessment of Educational Progress, US Department of Education: NAEPpdf.pdf
METHODS • The test was constructed to assess the Chemistry content taught in the first half of the school year • Emphasis on eligible content • Emphasis on science process skills • Greater use of graphs, tables, and descriptions of experiments to strengthen the link between the Nature of Science and science content
RESULTS • Midterm (Answer Key) • Positive feedback from other grade-level teachers using the new midterm • Evoked discussion on content that we need to emphasize, especially the Nature of Science • Informed teachers of inadequacies in student understanding of some topics
RESULTS • Each question was evaluated to determine the percent of students who answered correctly (Sample Data) • Questions that more than half of all students answered incorrectly were noted as areas requiring greater curriculum emphasis, as in the sketch below
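To make the item-analysis step concrete, here is a minimal sketch in Python, assuming hypothetical question labels and made-up response data rather than the actual midterm results. It computes the percent correct for each question and flags items that fewer than half of the students answered correctly.

```python
# Minimal item-analysis sketch (hypothetical data, not the actual midterm results).
# Each list holds one question's responses across students; True = answered correctly.

responses = {
    "Q2":  [False, True, False, False, True],   # hypothetical example values
    "Q27": [False, True, False, False, False],
    "Q36": [True, True, True, False, True],
}

THRESHOLD = 50.0  # flag questions answered correctly by fewer than half of the students

for question, answers in responses.items():
    percent_correct = 100.0 * sum(answers) / len(answers)
    flag = "  <-- needs greater curriculum emphasis" if percent_correct < THRESHOLD else ""
    print(f"{question}: {percent_correct:.0f}% correct{flag}")
```

The same tally could be done in a spreadsheet; the only decision encoded here is the 50% cutoff described above.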
CONCLUSIONS • Major student misunderstandings • Experimental design (Q# 2-4) • Mixed-up phase-of-matter descriptions (Q# 22 & 25) (Decoding problem?) • Interpreting the solubility graph of a gas (Q# 27) • Reason ice is less dense than liquid water (Q# 36) • Interpreting graphs of phase changes (Q# 35) – temperature of water while boiling
CONCLUSIONS • Students need more practice with interpreting graphs. My hypothesis is that the students have a basic understanding of the curricular content but are not able to evaluate data summarized in graphical form. The students are not making the connection between what they know and a symbolic representation of that knowledge.
CONCLUSIONS • There should be more overall emphasis on science process skills in the curriculum. Students lack understanding of experimental design. Inquiry investigations would be beneficial.