Develop & implement criterion-referenced tests to measure explicit performance objectives & instructional effectiveness. Learn to design tests for different learning domains & determine mastery levels for various tasks. Understand the importance of writing clear and appropriate test items while evaluating test items for reliability and validity.
Developing Assessment Instruments (Dick & Carey, Chap. 7)
Criterion-Referenced Tests • Designed to measure explicit behavioral objectives • Used to evaluate: • learner performance • effectiveness of the instruction
Criterion-Referenced • Also called objective-referenced • Refers directly to an explicit “criterion,” or specified performance • A criterion-referenced test must: • match each test item to its performance objective • stipulate the degree of mastery of the skill
Types of Criterion Tests Pretest: • 1. Consists of items that: • measure entry behavior skills • test skills to be taught • draw from skills below the entry behavior line • 2. Helps determine the appropriateness of required entry skills. • 3. Used during the formative evaluation process; may be discarded in the final version of the instruction.
Types of Criterion Tests Posttest: • 1. Assesses all the objectives, focusing on terminal objectives • 2. Helps identify ineffective instructional segments (see the comparison sketch below) • 3. Used during the design process; may eventually be modified to measure only terminal objectives
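The per-objective comparison that surfaces ineffective segments can be kept very simple. Below is a minimal sketch, not taken from Dick & Carey, that contrasts pretest and posttest mastery rates for each objective; the objective IDs, scores, and the 80% cutoff are hypothetical.

```python
# Illustrative sketch: compare per-objective pretest and posttest mastery rates
# to flag instructional segments that may need revision.
# Objective IDs, learner scores, and the 80% cutoff are hypothetical.

PRETEST = {   # objective -> list of learner results (1 = correct, 0 = incorrect)
    "obj_1": [0, 0, 1, 0, 0],
    "obj_2": [1, 1, 0, 1, 1],
}
POSTTEST = {
    "obj_1": [1, 1, 1, 0, 1],
    "obj_2": [1, 1, 0, 1, 0],
}
MASTERY_CUTOFF = 0.80  # assumed criterion level

def mastery_rate(scores):
    """Proportion of learners answering the objective's items correctly."""
    return sum(scores) / len(scores)

for objective in PRETEST:
    pre = mastery_rate(PRETEST[objective])
    post = mastery_rate(POSTTEST[objective])
    flag = "review instruction" if post < MASTERY_CUTOFF else "ok"
    print(f"{objective}: pretest {pre:.0%} -> posttest {post:.0%} ({flag})")
```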
Designing Tests for Learning Domains • Intellectual & Verbal Information • paper & pencil • Attitudinal • state a preference or choose an option • Psychomotor • performance quantified on checklist • subordinate skills tested in paper-and-pencil format
Determining Mastery Levels • Approach #1 • mastery defined as the level of performance normally expected from the best learners • arbitrary (norm-referenced) • Approach #2 • defined in statistical terms, beyond mere chance (see the guessing-probability sketch below) • mastery varies with the critical nature of the task • example: nuclear work vs. painting a house
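To make "beyond mere chance" concrete, here is a small illustrative calculation, not part of the chapter, of how likely a learner is to reach a mastery cutoff purely by guessing on four-option multiple-choice items; the item counts and cutoffs are assumptions.

```python
# Sketch of Approach #2 ("beyond mere chance") using the binomial distribution:
# probability of reaching a cutoff by guessing alone on four-option items.
# The 10-item test and the 8-of-10 cutoff are hypothetical.
from math import comb

def p_reach_cutoff_by_guessing(n_items, cutoff, p_guess=0.25):
    """Probability of getting at least `cutoff` items right by guessing alone."""
    return sum(
        comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
        for k in range(cutoff, n_items + 1)
    )

# An 8-of-10 mastery level is nearly impossible to reach by chance; a stricter
# cutoff suits a more critical task (nuclear work vs. painting a house).
print(p_reach_cutoff_by_guessing(10, 8))   # ~0.0004
print(p_reach_cutoff_by_guessing(10, 5))   # ~0.078
```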
Writing Test Items What should test items do? • Match the behavior of the objective • Use the correct “verb” to specify the behavior • Match the conditions of the objective
Writing Test Items How many test items do you need? • Determined by the learning domain • Intellectual skills require three or more items per objective • For a wide range of possible items, use a random sample (see the sampling sketch below)
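As a quick illustration of drawing a random sample from a larger item pool, the sketch below assembles one test form with a fixed number of items per objective. The pool, objective names, and the three-item minimum are hypothetical.

```python
# Sketch: randomly sample items per objective from an item pool to build a test form.
# The pool contents and objective names are made up for illustration.
import random

ITEM_POOL = {
    "define_terms": ["item_01", "item_02", "item_03", "item_04", "item_05"],
    "classify_examples": ["item_06", "item_07", "item_08", "item_09"],
}
ITEMS_PER_OBJECTIVE = 3  # intellectual skills: three or more items per objective

def build_test_form(pool, n_items, seed=None):
    """Randomly sample n_items per objective to assemble one test form."""
    rng = random.Random(seed)
    return {obj: rng.sample(items, n_items) for obj, items in pool.items()}

print(build_test_form(ITEM_POOL, ITEMS_PER_OBJECTIVE, seed=42))
```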
Writing Items (continued) What item types (true / false, multiple choice, etc.) should you use? • clues are provided by the behavior listed in the objective • review “Types of Test Items” in this chapter (p. 148)
Writing Items (continued) Item types tempered by: • amount of testing time • ease of scoring • amount of time to grade • probability of guessing • ease of cheating, etc. • availability of simulations
Writing Items (continued) What item types are inappropriate? • true / false for a definition objective • it tests discrimination, not stating the definition • Acceptable alternatives when the “best possible” format is impractical • e.g., for simulations, have learners list the steps
Constructing Test Items Consider: • vocabulary • setting of the test item (familiar vs. unfamiliar) • clarity • all necessary information • avoiding trick questions • double negatives, misleading information, etc.
Sequencing Items • Consider clustering items by objective
Test Directions • Clear and concise • General • Section-specific
Evaluating Tests / Test Items • Other factors to consider
Measuring Performance, Products, & Attitudes • Write directions to guide learner activities
Evaluating Performance, Products, & Attitudes • Construct an instrument to evaluate these activities • a product, performance, or attitude • Sometimes includes both a process and a product • For example: TRDEV 518
Test Directions for Performance, Products, & Attitudes • Determine: • the amount of guidance needed • special conditions • time limits, special steps, etc. • the nature of the task (i.e., its complexity) • the sophistication level of the audience
Assessment Instruments for Performance, Products, & Attitudes • Identify what elements are to be evaluated • cleanliness, finish, tolerance of item, etc. • Paraphrase each element • Sequence items on the instrument • Select the type of judgment for rater • Determine instrument scoring
Formats for Assessments of Performance, Products, & Attitudes • Checklist • Rating scale • Frequency counts • Etc. (scoring sketches for the first two follow below)
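As a rough illustration of how a checklist and a rating scale can be turned into instrument scores, the sketch below uses made-up elements and an assumed 1-to-5 scale; it is not a prescribed scoring method.

```python
# Illustrative data structures for two of the formats above: a checklist
# (yes/no per element) and a rating scale (1-5 per element), each scored.
# Element names and the scale range are hypothetical.

CHECKLIST = {            # element -> observed? (True = yes)
    "work area clean": True,
    "finish free of defects": False,
    "dimensions within tolerance": True,
}

RATING_SCALE = {         # element -> rater judgment, 1 (poor) to 5 (excellent)
    "work area clean": 4,
    "finish free of defects": 2,
    "dimensions within tolerance": 5,
}

def score_checklist(checklist):
    """Count of elements checked 'yes'."""
    return sum(checklist.values())

def score_rating_scale(ratings, max_rating=5):
    """Total rating expressed as a percentage of the maximum possible."""
    return sum(ratings.values()) / (len(ratings) * max_rating)

print(score_checklist(CHECKLIST), "of", len(CHECKLIST), "elements observed")
print(f"rating scale score: {score_rating_scale(RATING_SCALE):.0%}")
```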
Performance, Products, & Attitudes: Scoring Guidelines?
Evaluating Congruency • Skills, objectives, and assessments should refer to the same behaviors • To check for congruency • Construct a Congruency Evaluation Chart (a sketch of such a chart follows below) • include: Subskills, Behavioral Objectives, & Test Items
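One lightweight way to build such a chart is sketched below: each row pairs a subskill with its behavioral objective and the test items written for it, so missing or mismatched items stand out. The subskills, objectives, and item IDs are invented for illustration.

```python
# Sketch of a congruency evaluation chart: subskill | objective | test items | status.
# All entries are hypothetical; rows with no items are flagged for the designer.

CHART = [
    # (subskill, behavioral objective, test items written for it)
    ("identify parts", "Given a diagram, label all parts", ["item_01", "item_02"]),
    ("state function", "Without references, state each part's function", []),
]

for subskill, objective, items in CHART:
    status = "OK" if items else "MISSING TEST ITEMS"
    print(f"{subskill:<15} | {objective:<45} | {', '.join(items) or '-':<18} | {status}")
```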