Assessing student learning
Ross Nehm, Associate Professor
College of Education and Human Ecology, The Ohio State University
Assessment basics • Types of assessment • Methods of assessment • Assessment data • Rubrics and data processing • Limitations • Questions
Assessment basics • As Susan has discussed, learning goals should be closely tied to assessment strategies. • As you design your I3Us, ask yourself what types of assessment strategies would best evaluate whether or not your learning goals have been met.
Assessment categories • Three commonly used types of assessments include: • Knowledge assessments • Attitude and belief (and affective) assessments • Performance assessments
Knowledge assessments • Attempt to measure what students know at varying levels of complexity: • Are students able to recall a particular piece of knowledge? • Are students able to solve a higher-order problem in a particular knowledge domain? • It is useful to measure student abilities at different Bloom levels.
Bloom’s taxonomy
• Knowledge: list, define, tell, describe, identify, show, label, collect, examine, tabulate, quote, name, who, when, where, etc.
• Comprehension: summarize, describe, interpret, contrast, predict, associate, distinguish, estimate, differentiate, discuss, extend
• Application: apply, demonstrate, calculate, complete, illustrate, show, solve, examine, modify, relate, change, classify, experiment, discover
• Analysis: analyze, separate, order, explain, connect, classify, arrange, divide, compare, select, infer
• Synthesis: combine, integrate, modify, rearrange, substitute, plan, create, design, invent, what if?, compose, formulate, prepare, generalize, rewrite
• Evaluation: assess, decide, rank, grade, test, measure, recommend, convince, select, judge, explain, discriminate, support, conclude, compare, summarize
Topics and Bloom Levels of Questions (figure). Source: Dexter Perkins (University of North Dakota), Karl Wirth (Macalester College), and Ed Nuhfer (Idaho State University), presentation at the Student Learning: Observing and Assessing Workshop.
Attitude and belief assessments • Attitudes and beliefs play an important role in science learning, and a goal of your modules may be to alter attitudes. • Do students believe that learning about genomics is important? • Do students like to perform “experiments” using bioinformatics databases?
Performance assessments • Performance assessments determine whether students are able to perform a particular task. • Are students able to download a particular sequence from NCBI in FASTA format? • Are students able to use Biology Workbench to perform a multiple sequence alignment? • Are students able to work collaboratively to solve a problem?
What assessments should I use? • Different learning goals will undoubtedly require different (or multiple) types of assessments. • As you develop each learning goal, think carefully about what type of assessment would best measure what you are trying to achieve.
Assessment methods • Each type of assessment (knowledge, attitude, performance) may require a different assessment method. • Methods may include: • Oral interviews • Classroom observations • Concept mapping • Predict-Observe-Explain • Paper and pencil tests
Method strengths and weaknesses • Different assessment methods have different strengths and weaknesses. • Oral interviews with students tend to be very informative, but they take large amounts of time to perform and analyze. • Concept maps are excellent tools for exploring the extent of student knowledge integration, but they are difficult to score. • Paper and pencil tests are useful for assessing student learning in large classes, but often they do not provide a rich picture of what students really know.
Examples of methods • Concept mapping
Examples of methods • Oral interview
Interviewer: A number of mosquito populations no longer die when DDT, which is a chemical used to kill insects, is sprayed on them, but many years ago DDT killed most mosquitoes. Could you explain why many mosquitoes don’t die anymore when DDT is sprayed on them?
Participant R: …Well, if at first the DDT killed most mosquitoes and now it’s not killing them [any] more, then a possible explanation would be that when they first started exposing the mosquitoes to the DDT they didn’t have any…their immune system was not that strong to fight the DDT. As time went on they developed some kind of resistance to the DDT…they passed this kind of, um, newly evolved resistance on to the next generation so…passing on this trait from generation to generation…it will start becoming stronger and if the DDT is used on them it wouldn’t kill them...
Interviewer: Can you tell me a little bit more about how that [resistance] would happen, in general terms?
Participant R: …I was watching the discovery channel and there was a man who said he could develop resistance to the venom of a snake…so he started to gradually use little bits of this venom and started injecting venom into his system and from time to time he would increase the amount of venom he took into his system…he got bitten by the snake and to the surprise of the doctors this man actually had some kind of resistance to that venom, in comparison to a normal person who would just die…my guess would be that at first the mosquitoes…from time to time they kept exposing them to this kind of chemical…those will develop some kind of resistance to this kind of chemical for them to survive.
Examples of methods • Likert-scale questions
Each of the statements below expresses a feeling toward biology. Please rate each statement on the extent to which you agree with it.
1. Biology is very interesting to me.
2. I don’t like biology, and it scares me to have to take it.
3. I am always under a terrible strain in a biology class.
4. Biology is fascinating and fun.
5. Biology makes me feel secure, and at the same time is stimulating.
6. Biology makes me feel uncomfortable, restless, irritable, and impatient.
http://www.flaguide.org/tools/attitude/biology_attitude_scale.php
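A minimal sketch of how responses to items like these might be turned into a single attitude score. It assumes a five-point agreement scale (1 = strongly disagree to 5 = strongly agree) and that the negatively worded items (2, 3, and 6) are reverse-scored; neither detail appears on the slide, so treat both as assumptions.

```python
# Hypothetical scoring sketch for the six attitude items above.
# Assumes ratings 1-5 (strongly disagree .. strongly agree) and that
# negatively worded items (2, 3, 6) are reverse-scored.

NEGATIVE_ITEMS = {2, 3, 6}  # items expressing a negative feeling toward biology

def attitude_score(responses):
    """responses: dict mapping item number (1-6) to a rating from 1 to 5.
    Returns a total score where higher means a more positive attitude."""
    total = 0
    for item, rating in responses.items():
        if not 1 <= rating <= 5:
            raise ValueError(f"item {item}: rating {rating} outside 1-5 scale")
        # Reverse-score negative items so that a high number always means "positive".
        total += (6 - rating) if item in NEGATIVE_ITEMS else rating
    return total

# Example: a student who strongly agrees biology is interesting (item 1)
# and strongly disagrees that it scares them (item 2).
print(attitude_score({1: 5, 2: 1, 3: 2, 4: 4, 5: 4, 6: 1}))  # -> 27 out of a possible 30
```

Reverse-scoring the negative items keeps "higher score = more positive attitude" consistent across all six statements.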
Methods data scoring • Different assessment methods produce different types of data. • It is often necessary to process the data such that they can be used in analyses of student performance. • For example, a transcribed oral interview is not in itself very helpful. We need to process and interpret the interview. • Likewise, a concept map needs to be scored in some way.
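As an illustration of what "scoring a concept map in some way" might look like, here is a hypothetical sketch that represents a map as a set of concept-link-concept propositions and counts how many match an expert reference map. The representation, the example propositions, and the one-point-per-proposition weighting are all assumptions for illustration, not a scheme prescribed in the talk.

```python
# Hypothetical concept-map scoring sketch: each map is a set of
# (concept, linking phrase, concept) propositions; the score is the number of
# expert propositions the student map also contains.

def proposition_score(student_map, expert_map):
    """Both arguments are sets of (concept, link, concept) tuples.
    Returns (matched expert propositions, total expert propositions)."""
    matched = student_map & expert_map
    return len(matched), len(expert_map)

# Invented example maps about natural selection.
expert = {
    ("variation", "exists within", "population"),
    ("trait", "is heritable across", "generations"),
    ("selection", "acts on", "variation"),
}
student = {
    ("variation", "exists within", "population"),
    ("selection", "acts on", "variation"),
    ("mosquito", "develops", "immunity"),   # not in the expert map; earns no credit here
}
print(proposition_score(student, expert))  # -> (2, 3)
```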
Rubrics • Rubrics are used to define the universe of student performance, knowledge, belief, etc. and map actual student performance within this universe. • Rubrics can be constructed in many different ways: • Binary (Right/wrong) • Gradational (multistate) categories • Likert scales
Simple rubric (applied to oral interview data)
• -1 = clear evidence of a faulty mental model of natural selection, lacking key concepts and containing numerous misconceptions
• 0 = ambiguous evidence: some correct key concepts present, some misconceptions present, but unclear whether an accurate mental model of natural selection is being employed
• +1 = clear, unambiguous evidence of an accurate working model of natural selection, lacking misconceptions and employing at least three key concepts
Example: • Learning goal: decrease student use of misconceptions of natural selection in evolutionary explanations • Type of assessment: knowledge • Method: oral interview • Data: transcribed interviews • Rubric developed to code the data • Oral interview scoring categories: -1, 0, +1 • Statistical analysis of distribution frequencies • Interpretation of pre-post course gains
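A hypothetical sketch of the last two steps of this workflow: tallying the -1/0/+1 interview codes from the rubric above into pre- and post-course frequency distributions and comparing them. The coded data are invented, and the choice of a chi-square test is an assumption; the talk does not specify which statistical analysis was used.

```python
# Hypothetical sketch: frequency distributions of rubric codes and a
# chi-square comparison of pre- vs. post-course distributions.
from collections import Counter
from scipy.stats import chi2_contingency

CODES = (-1, 0, 1)

def code_frequencies(codes):
    """Return counts of each rubric code, in the fixed order (-1, 0, +1)."""
    counts = Counter(codes)
    return [counts.get(c, 0) for c in CODES]

# Invented codes assigned to ten pre-course and ten post-course interviews.
pre  = [-1, -1, 0, -1, 0, 1, -1, 0, -1, 1]
post = [0, 1, 1, 0, 1, 1, -1, 1, 0, 1]

table = [code_frequencies(pre), code_frequencies(post)]
chi2, p, dof, _ = chi2_contingency(table)
print(table)                              # [[5, 3, 2], [1, 3, 6]]
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # is the shift in code frequencies significant?
```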
Limitations • Developing an assessment instrument (or “test”) that produces meaningful results is much more complicated than most people think. • The bottom line is that most of your assessment results can only be treated as “suggestive” rather than “conclusive”. Why?
Assessment limitations • Good assessment relies on: (1) an appropriate experimental design and (2) appropriately chosen and validated instruments. • Most scientists can develop an appropriate experimental design, but are not prepared to develop and validate an instrument. • Without valid and reliable instruments, it is difficult, if not impossible, to produce a robust conclusion.
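As one concrete example of what checking an instrument's reliability can involve, here is a minimal sketch of Cronbach's alpha, a widely used internal-consistency statistic for multi-item instruments such as the Likert scale shown earlier. The data are made up, and the talk does not prescribe this particular statistic.

```python
# Hypothetical sketch: Cronbach's alpha for a multi-item instrument.
# alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array-like, rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # sample variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Five respondents answering four items on a 1-5 scale (invented data).
data = [[4, 5, 4, 5],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
        [2, 3, 2, 2],
        [4, 4, 5, 4]]
print(round(cronbach_alpha(data), 2))  # values above ~0.7 are often treated as acceptable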
Suggestions as you proceed • Define your I3U learning goals first • Consider the appropriateness of different types of assessments (e.g., knowledge, attitude) relative to these learning goals • Evaluate the appropriateness of possible assessment methods (e.g., interview, concept maps) • Plan how you will score and evaluate the data you collect using each method • Plan to analyze the scores statistically • Use assessments to determine if your learning goals were met. • Keep in mind the limitations of the assessment approaches you are using.