Michigan Assessment Consortium Common Assessment Development Series Module 6 – The Test Blueprint
Developed and Narrated by Bruce R. Fay, PhD, Assessment Consultant, Wayne RESA
Support The Michigan Assessment Consortium professional development series in common assessment development is funded in part by the Michigan Association of Intermediate School Administrators in cooperation with …
In Module 6 you will learn about • Test blueprints…what they are and why you need them • The components of a test blueprint • Criteria for a good test blueprint • Test blueprint example
“If you don't know where you're going, any road will take you there.” George Harrison (1943 - 2001) "Any Road", Brainwashed, 2002
Assessment with a Purpose Educational assessment is not something incidental to teaching and learning. It is an equal partner with curriculum and instruction. It is the critical “3rd leg” through which both students and teachers receive feedback about the effectiveness of the teaching and learning process in achieving desired learning outcomes. Assessment closes the loop.
Purposeful Assessment • Requires thoughtful alignment – ensuring that the items on a test fairly represent the… • Intended (curriculum) and actual (instructional) learning targets • Relative importance of those targets • Level of cognitive complexity associated with those targets
Useful Feedback • Also requires tests that are… • Reliable (consistent; actually measure something) • Fair (free from bias or distortion) • Valid (contextually meaningful and interpretable; can reasonably support the decisions we make based on them)
Test Blueprints, Big Picture Are a simple but essential tool test developers use to design tests that can meet the preceding requirements Define the acceptable evidence to infer mastery of the targets
Test Blueprints, Details Explicitly “map” test items to: Learning Targets Levels of Complexity Importance
Decision-making without data… is just guessing.
But confidently using unsubstantiated data invalidly… is even worse.
Learning Targets: GLCEs (Grade Level Content Expectations) and HSCEs (High School Content Expectations)
Learning Target Details Structured Hierarchical Framework
Taxonomies For Cognitive Complexity
Taxonomies compared: Bloom’s Cognitive Domain (Revised 2001), Marzano’s Dimensions of Thinking (1989), and Webb’s Depth of Knowledge (1997)
Bloom • Creating • Evaluating • Analyzing • Applying • Understanding • Remembering
Marzano • Evaluating • Integrating • Generating • Analyzing • Organizing • Gathering
Webb • Extended Thinking • Strategic Thinking • Skill/Concept Use/Application • Recall
Putting it all together…A Basic Test Blueprint Table (matrix) format (spreadsheet) Rows = learning targets (one for each GLCE, HSCE, etc.) Columns = levels of cognitive complexity (one column for each level to match the taxonomy you chose to use) Cells = number of items and points possible
Summary Information • Number of items and points possible: • Row Margins = for that target • Column Margins = for that level of complexity • Lower Right Corner = for the test
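The matrix and margin sums above can be sketched in a few lines of Python. The target names, complexity levels, and item counts below are hypothetical placeholders, not values from the module:

```python
# A minimal sketch of a test blueprint as a matrix: rows are learning
# targets, columns are levels of cognitive complexity (Webb's DOK used
# here), and each cell holds the number of items planned. All target
# names and counts are hypothetical.

LEVELS = ["Recall", "Skill/Concept", "Strategic Thinking", "Extended Thinking"]

blueprint = {
    "Target 1": {"Recall": 2, "Skill/Concept": 3, "Strategic Thinking": 1, "Extended Thinking": 0},
    "Target 2": {"Recall": 1, "Skill/Concept": 3, "Strategic Thinking": 2, "Extended Thinking": 0},
    "Target 3": {"Recall": 1, "Skill/Concept": 2, "Strategic Thinking": 0, "Extended Thinking": 0},
}

# Row margins: total items for each learning target
row_totals = {t: sum(cells.values()) for t, cells in blueprint.items()}

# Column margins: total items at each level of complexity
col_totals = {lvl: sum(cells[lvl] for cells in blueprint.values()) for lvl in LEVELS}

# Lower-right corner: total items on the test
test_total = sum(row_totals.values())

print(row_totals)   # {'Target 1': 6, 'Target 2': 6, 'Target 3': 3}
print(col_totals)
print(test_total)   # 15
```

The same arithmetic works for points possible; a real blueprint would typically carry a (count, points) pair in each cell.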
Is this reasonable? Rule-of-Thumb Criteria… At least 3 items per target (5 is better) for reliability. Appropriate distribution of items over targets (1 & 2 appear to be more important than 3 & 5). Levels of complexity are appropriate for targets and instruction. Appropriate distribution of items over levels of complexity (all items are NOT at the lowest or highest level).
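The rule-of-thumb criteria above are easy to automate. Here is a sketch of such a check, run against a hypothetical blueprint; the function name, thresholds, and sample data are assumptions for illustration, not part of the module:

```python
# Sketch of the rule-of-thumb checks: at least 3 items per target, and
# items spread across more than one complexity level. Cells hold item
# counts; targets and counts below are hypothetical.

def check_blueprint(blueprint, min_items_per_target=3):
    """Return a list of warnings for criteria the blueprint violates."""
    warnings = []
    for target, cells in blueprint.items():
        total = sum(cells.values())
        if total < min_items_per_target:
            warnings.append(f"{target}: only {total} items (want >= {min_items_per_target})")
    # All items at a single complexity level suggests a poor distribution
    levels_used = {lvl for cells in blueprint.values() for lvl, n in cells.items() if n > 0}
    if len(levels_used) <= 1:
        warnings.append("All items sit at a single complexity level")
    return warnings

blueprint = {
    "Target 1": {"Recall": 2, "Skill/Concept": 3},
    "Target 2": {"Recall": 1, "Skill/Concept": 1},  # too few items
}
print(check_blueprint(blueprint))
# ['Target 2: only 2 items (want >= 3)']
```

A fuller version might also flag targets whose share of items is out of line with their stated importance.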
Limitations… Shows total points for each target/level combination, but not how those points apply to each item Doesn’t show item types Doesn’t indicate if partial credit scoring can/will be used (but may be implied) But…it was easy to construct, is still a useful blueprint, and is much better than not making one!
Some added sophistication… • It is also useful to keep track of item types / formats to ensure: • Appropriate match to learning targets and associated levels of complexity • Balanced use within tests and across tests over time • Track on same or separate spreadsheet
Common item types include… • Selected-response (multiple-choice) • (Module 7) • Constructed-response • Brief (fill-in-the-blank, short answer, etc.) • Extended (outline, essay, etc.) • (Module 9) • Performance • (Module 8)
Other item types include… Matching Sort/arrange a list in order Projects
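One way to track item types on the same spreadsheet, as suggested above, is to let each cell hold a list of item records rather than a bare count. This is a sketch under that assumption; the cell contents and point values are hypothetical:

```python
# Each blueprint cell, keyed by (target, complexity level), holds a list
# of (item_type, points) records so item-type balance can be tallied.
# All names and values are hypothetical.
from collections import Counter

blueprint = {
    ("Target 1", "Recall"): [("multiple-choice", 1), ("multiple-choice", 1)],
    ("Target 1", "Skill/Concept"): [("short answer", 2)],
    ("Target 2", "Skill/Concept"): [("essay", 4), ("multiple-choice", 1)],
}

# Tally item-type usage across the whole test to check for balance
type_counts = Counter(item_type for items in blueprint.values()
                      for item_type, _ in items)
total_points = sum(points for items in blueprint.values()
                   for _, points in items)

print(type_counts)
print(total_points)  # 9
```

Keeping the tally per test also makes it easy to compare item-type balance across tests over time, as the module recommends.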
Complexity vs. Utility Your test blueprint can get complicated if you try to account for too much in one spreadsheet. Make sure your test blueprint covers the basics, is not a burden to create, and is useful to you. The following example is slightly more sophisticated, but still workable.
Conclusions • Destination (purpose) • Road Map (test blueprint) • Alignment of items & item types to… • learning targets (curriculum/content) • size (complexity) of targets • cognitive level of targets • relative importance of targets
Next Module Modules 1 – 6: Intro and Overview Modules 7 –11: The Nitty Gritty Modules 12 – 18: Making Sure it Works and is Useful