
Developing Assessment Instruments

Develop & implement criterion-referenced tests to measure explicit performance objectives & instructional effectiveness. Learn to design tests for different learning domains & determine mastery levels for various tasks. Understand the importance of writing clear and appropriate test items while evaluating test items for reliability and validity.


Presentation Transcript


  1. Developing Assessment Instruments (Dick & Carey, Chap. 7)

  2. Criterion-Referenced Tests
  • Designed to measure explicit behavioral objectives
  • Used to evaluate:
    • learner performance
    • the effectiveness of the instruction

  3. Criterion-Referenced
  • Also called objective-referenced
  • Refers directly to an explicit “criterion” or specified performance
  • A criterion-referenced test must:
    • match each test item to its performance objective
    • stipulate the degree of mastery of the skill

  4. Types of Criterion Tests: Pretest
  1. Consists of items that:
     • measure entry behavior skills
     • test skills to be taught
     • draw from skills below the entry behavior line
  2. Helps determine the appropriateness of required entry skills
  3. Used during the formative evaluation process; may be discarded in the final version of the instruction

  5. Types of Criterion Tests: Posttest
  1. Assesses all the objectives, focusing on terminal objectives
  2. Helps identify ineffective instructional segments
  3. Used during the design process; may eventually be modified to measure only terminal objectives

  6. Designing Tests for Learning Domains
  • Intellectual skills & verbal information: paper-and-pencil tests
  • Attitudes: learner states a preference or chooses an option
  • Psychomotor skills:
    • performance quantified on a checklist
    • subordinate skills tested in paper-and-pencil format

  7. Determining Mastery Levels
  • Approach #1
    • mastery defined as the level of performance normally expected from the best learners
    • arbitrary (norm-referenced)
  • Approach #2
    • defined in statistical terms: performance beyond mere chance (see the sketch below)
    • mastery varies with the critical nature of the task
    • example: nuclear work vs. painting a house
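
The slide leaves "beyond mere chance" undefined. One common statistical reading, sketched below in Python, is to set the cutoff score high enough that a pure guesser would rarely reach it. The function name, the 4-option assumption, and the 5% threshold are choices made for this example, not from Dick & Carey.

```python
from math import comb

def chance_cutoff(n_items, n_options=4, alpha=0.05):
    """Smallest score a pure guesser reaches with probability < alpha.

    Assumes independent multiple-choice items, each guessed correctly
    with probability 1/n_options (illustrative, not from the chapter).
    """
    p = 1.0 / n_options
    for k in range(n_items + 1):
        # P(X >= k) for X ~ Binomial(n_items, p)
        tail = sum(comb(n_items, i) * p**i * (1 - p) ** (n_items - i)
                   for i in range(k, n_items + 1))
        if tail < alpha:
            return k
    return n_items

# On a 20-item, 4-option test, scores of 9+ are unlikely by guessing alone,
# so a defensible mastery level sits at or above 9 (and higher for critical
# tasks such as nuclear work).
print(chance_cutoff(20))  # -> 9
```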

  8. Writing Test Items
  What should test items do?
  • Match the behavior of the objective
  • Use the correct “verb” to specify the behavior
  • Match the conditions of the objective

  9. Writing Test Items
  How many test items do you need?
  • Determined by the learning domain
  • Intellectual skills require three or more items
  • For a wide range of content, use a random sample of items (sketched below)
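
As an illustration of the random-sample idea, here is a minimal Python sketch that draws up to a fixed number of items per objective; the `item_pool` layout and all the names in it are invented for the example.

```python
import random

# Hypothetical pool: objective IDs mapped to the items written for each one.
item_pool = {
    "obj_1": ["item_1a", "item_1b", "item_1c", "item_1d"],
    "obj_2": ["item_2a", "item_2b", "item_2c"],
}

def sample_test(pool, per_objective=3, seed=None):
    """Randomly sample up to per_objective items for every objective."""
    rng = random.Random(seed)  # a seed makes the draw reproducible
    test = []
    for objective, items in pool.items():
        test.extend(rng.sample(items, min(per_objective, len(items))))
    return test

print(sample_test(item_pool, per_objective=3, seed=42))
```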

  10. Writing Items (continued)
  What types (true/false, multiple choice, etc.) should you use?
  • Clues are provided by the behavior listed in the objective
  • Review “Types of Test Items” in this chapter, p. 148

  11. Writing Items (continued)
  Item types are tempered by:
  • the amount of testing time
  • ease of scoring
  • the amount of time to grade
  • the probability of guessing
  • the ease of cheating, etc.
  • the availability of simulations

  12. Writing Items (continued)
  What types are inappropriate?
  • true/false for definitions: such items test discrimination, not the definition
  Acceptable alternatives:
  • select from “best possible” options, e.g., for simulations
  • list the steps

  13. Constructing Test Items
  Consider:
  • vocabulary
  • the setting of the test item (familiar vs. unfamiliar)
  • clarity
  • including all necessary information
  • avoiding trick questions (double negatives, misleading information, etc.)

  14. Sequencing Items, Test Directions, & Evaluating Tests
  • Sequencing items: consider clustering them by objective (sketched below)
  • Test directions: clear and concise, both general and section-specific
  • Evaluating tests / test items: weigh the other factors above
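
A minimal sketch of the clustering idea, assuming each item has been tagged with the objective it measures; the pairs below are invented for the example.

```python
from collections import defaultdict

# Hypothetical flat list of (objective, item) pairs, in writing order.
items = [("obj_2", "item_7"), ("obj_1", "item_3"),
         ("obj_2", "item_9"), ("obj_1", "item_4")]

# Cluster the items so those measuring the same objective sit together.
clusters = defaultdict(list)
for objective, item in items:
    clusters[objective].append(item)

for objective in sorted(clusters):
    print(objective, clusters[objective])
# obj_1 ['item_3', 'item_4']
# obj_2 ['item_7', 'item_9']
```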

  15. Measuring Performance, Products, & Attitudes
  • Write directions to guide learner activities, and
  • construct an instrument to evaluate them (next slide)

  16. Evaluating Performance, Products, & Attitudes
  • Construct an instrument to evaluate these activities:
    • a product, performance, or attitude
  • Sometimes includes both a process and a product
    • for example, TRDEV 518

  17. Test Directions for Performance, Products, & Attitudes
  Determine:
  • the amount of guidance
  • special conditions (time limits, special steps, etc.)
  • the nature of the task (i.e., its complexity)
  • the sophistication level of the audience

  18. Assessment Instruments for Performance, Products, & Attitudes
  • Identify the elements to be evaluated (cleanliness, finish, tolerance of the item, etc.)
  • Paraphrase each element
  • Sequence the items on the instrument
  • Select the type of judgment the rater will make
  • Determine how the instrument is scored

  19. Formats for Assessments of Performance, Products, & Attitudes
  • Checklist
  • Rating Scale
  • Frequency Counts
  • etc.
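
To make the first two formats concrete, the sketch below scores the same invented elements first as a yes/no checklist and then as a 1-to-5 rating scale. The element names and the 0-to-1 normalization are assumptions for this example, not prescriptions from the chapter.

```python
# Hypothetical elements (cf. slide 18), judged two different ways.
checklist = {                 # checklist: a yes/no judgment per element
    "surface is clean": True,
    "finish is even": False,
    "within tolerance": True,
}
rating_scale = {              # rating scale: a graded 1-5 judgment instead
    "surface is clean": 4,
    "finish is even": 2,
    "within tolerance": 5,
}

def checklist_score(judgments):
    """Fraction of elements the rater checked 'yes'."""
    return sum(judgments.values()) / len(judgments)

def rating_score(ratings, scale_max=5):
    """Mean rating, normalized to 0-1 for comparison with the checklist."""
    return sum(ratings.values()) / (len(ratings) * scale_max)

print(f"checklist: {checklist_score(checklist):.2f}")  # 0.67
print(f"rating:    {rating_score(rating_scale):.2f}")  # 0.73
```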

  20. Performance, Products, & Attitudes -- Scoring Guidelines?

  21. Evaluating Congruency
  • Skills, objectives, & assessments should refer to the same behaviors
  • To check for congruency, construct a Congruency Evaluation Chart (sketched below)
    • include: subskills, behavioral objectives, & test items
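
One way to lay out such a chart is a row per subskill, listing the behavioral objective written for it and the test items meant to measure it, plus a quick completeness check. The subskills, objective IDs, and item IDs below are invented; whether each item's behavior truly matches its objective remains a human judgment.

```python
# Hypothetical congruency evaluation chart: subskill -> objective -> items.
chart = [
    {"subskill": "5.1 identify the parts", "objective": "obj_5.1",
     "items": ["item_12", "item_13"]},
    {"subskill": "5.2 state the rule", "objective": "obj_5.2",
     "items": ["item_14"]},
    {"subskill": "5.3 apply the rule", "objective": "obj_5.3",
     "items": []},
]

# Flag rows where the subskill -> objective -> item chain is broken.
for row in chart:
    if not row["items"]:
        print(f"No test items for {row['objective']} ({row['subskill']})")
# -> No test items for obj_5.3 (5.3 apply the rule)
```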
