
Review: Performance-Based Assessments

Learn techniques to create performance-based assessments in real-life settings, set fair grading criteria, and develop cognitive assessments. Understand test planning, question types, and ethical grading practices. Enhance your ability to construct high-quality tests and analyze test items for reliability and validity. Improve your test design skills and ensure student learning objectives are assessed accurately.



Presentation Transcript


  1. Review: Performance-Based Assessments • Performance-based assessment • Real-life setting • H.O.T.S. (higher-order thinking skills) • Techniques: • Observation • Individual or Group Projects • Portfolios • Performances • Student Logs or Journals

  2. Developing performance-based assessments • Determining the purpose of assessment • Deciding what constitutes student learning • Selecting the appropriate assessment task • Setting performance criteria

  3. Review: Grading • Grading process: Objectives of instruction → Test selection and administration → Results compared to standards → Final grades • Making grading fair, reliable, and valid: • Determine defensible objectives • Ability-group students • Construct tests which reflect objectivity • No test is perfectly reliable • Grades should reflect status, not improvement • Do not use grades to reward good effort • Consider grades as measurements, not evaluations
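
As a minimal sketch of the "results compared to standards" step, criterion-referenced grading measures status against fixed standards rather than improvement or effort. The cut-offs below are hypothetical; none are given in the slides:

```python
# Hypothetical criterion-referenced grading: scores are compared to fixed,
# defensible standards (status), not to improvement or effort.
STANDARDS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]  # assumed cut-offs

def assign_grade(percent_score: float) -> str:
    """Return the letter grade whose cut-off the score meets or exceeds."""
    for cutoff, letter in STANDARDS:
        if percent_score >= cutoff:
            return letter
    return "F"

print(assign_grade(84.5))  # -> B
```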

  4. Cognitive Assessments: Physical Fitness Knowledge • HPHE 3150 • Dr. Ayers

  5. Test Planning • Types: • Mastery (e.g., driver's license): meet minimum requirements • Achievement (e.g., mid-term): discriminate among levels of accomplishment
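
A rough illustration of the two test types, with an assumed cut-off and made-up scores (not from the slides): a mastery test only checks a minimum requirement, while an achievement test ranks examinees to discriminate among levels:

```python
# Mastery test: pass/fail against a minimum requirement (like a driver's test)
def mastery_result(score: float, minimum: float = 75.0) -> str:
    return "pass" if score >= minimum else "fail"

# Achievement test: discriminate among levels by ranking examinees
def achievement_ranks(scores: list[float]) -> list[int]:
    """Rank scores from highest (rank 1) downward."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    ranks = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

print(mastery_result(82))               # -> pass
print(achievement_ranks([88, 72, 95]))  # -> [2, 3, 1]
```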

  6. Table of Specifications (content-related validity) • Content Objectives: history, values, equipment, etiquette, safety, rules, strategy, techniques of play • Educational Objectives (Bloom's taxonomy, 1956): knowledge, comprehension, application, analysis, synthesis, evaluation

  7. Table of Specifications for a 33-Item Exercise Physiology Concepts Test (Ask-PE; Ayers, 2003) T of SPECS-E.doc
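
The following sketch shows how a table of specifications allocates items across content areas and cognitive levels for a 33-item test; the cells and counts are illustrative assumptions, not the actual Ask-PE specification in the linked document:

```python
# Hypothetical table of specifications: cells cross content areas with
# cognitive levels; each cell holds the number of items planned.
# (Illustrative counts only; the real Ask-PE table is in T of SPECS-E.doc.)
spec = {
    ("rules",      "knowledge"):     5,
    ("rules",      "application"):   4,
    ("strategy",   "comprehension"): 7,
    ("strategy",   "analysis"):      5,
    ("techniques", "knowledge"):     6,
    ("techniques", "application"):   6,
}
total = sum(spec.values())
assert total == 33  # cell counts must match the planned test length

for (content, level), n in spec.items():
    print(f"{content:10s} x {level:13s}: {n:2d} items ({100 * n / total:.0f}%)")
```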

  8. Test Characteristics • When to test • Often enough for reliability, but not so often that testing becomes useless • How many questions (p. 145-6 guidelines) • More items yield greater reliability (see the sketch below) • Format to use (p. 147 guidelines) • Oral (NO), group (NO), written (YES) • Open book/note, take-home • Advantages: ↓ anxiety, can ask more application questions • Disadvantages: ↓ incentive to prepare, uncertainty about who does the work
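
The bullet "more items yield greater reliability" is conventionally quantified with the Spearman-Brown prophecy formula; a quick sketch with illustrative numbers:

```python
def spearman_brown(r_current: float, length_factor: float) -> float:
    """Predicted reliability when test length is multiplied by length_factor."""
    return (length_factor * r_current) / (1 + (length_factor - 1) * r_current)

# Doubling a test whose current reliability is .70 (illustrative numbers):
print(round(spearman_brown(0.70, 2.0), 2))  # -> 0.82
```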

  9. Test Characteristics • Question types • Semi-objective: • short-answer • completion • mathematical • Objective: • true/false • matching • multiple-choice • classification • Essay

  10. Semi-objective Questions • Short-answer, completion, mathematical • When to use (factual & recall material) • Weaknesses • Construction Recommendations (p. 151) • Scoring Recommendations (p. 152)

  11. Objective Questions • True/False, matching, multiple-choice • When to use (M-C: MOST IDEAL) • FORM7 (B,E).doc • p. 160-3: M-C guidelines • Construction Recommendations (p. 158-60) • Scoring Recommendations (p. 163-4)

  12. Figure 8.1 The difference between extrinsic and intrinsic ambiguity (A is correct) • Intrinsic ambiguity: all foils look appealing • Extrinsic ambiguity: weaker students miss the item • Too easy

  13. Cognitive Assessments I • Explain one thing that you learned today to a classmate

  14. Review: Cognitive Assessments I • Test types • Mastery, Achievement • Table of Specifications • Identify content, assign cognitive demands, weight areas • Provides support for what type of validity? • Question types • Semi-objective: short-answer, completion, mathematical • Objective: true/false, matching, multiple-choice • Which is desirable: intrinsic or extrinsic ambiguity?

  15. Essay Questions • When to use (definitions, interpretations, comparisons) • Weaknesses • Scoring • Objectivity • Construction & Scoring recommendations (p. 167-9)

  16. Characteristics of “Good” Tests • Reliable • Valid • Average difficulty • Discriminate: answered correctly by more knowledgeable students, missed by less knowledgeable students • Time-consuming to write

  17. Quality of the Test • Reliability • Role of error in an observed score • Error sources in written tests • Inadequate sampling • Examinee’s mental/physical condition • Environmental conditions • Guessing • Changes in the field (dynamic variable being measured)
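
In classical test theory (a standard framing, not spelled out on the slide), an observed score is a true score plus error from sources like those above, and reliability is the share of observed-score variance that is true-score variance:

```latex
X = T + E, \qquad \sigma_X^2 = \sigma_T^2 + \sigma_E^2, \qquad
r_{XX} = \frac{\sigma_T^2}{\sigma_X^2} = 1 - \frac{\sigma_E^2}{\sigma_X^2}
```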

  18. Quality of the Test • Validity • CONTENT is key for written tests • Is critical information assessed by the test? • T of Specs helps support validity • Overall Test Quality • Based on individual item quality (steps 1-8, p. 175-80)

  19. Item Analysis • Used to determine the quality of individual test items • Item difficulty: percent answering correctly • Item discrimination: how well the item “functions”; also how “valid” the item is, based on the total-test-score criterion

  20. Item Difficulty • Range: 0 (nobody answered correctly) to 100 (everybody answered correctly) • Goal = 50%, which allows for maximum item discrimination

  21. Item Discrimination • < 20% or negative: poor • 20-40%: acceptable • Goal: > 40% • Positive discrimination increases reliability; negative discrimination decreases it
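
A minimal item-analysis sketch, assuming a 0/1 response matrix (rows = students, columns = items). Difficulty follows the percent-correct definition above; discrimination uses the common upper-minus-lower-group index, one of several accepted methods:

```python
def item_difficulty(responses: list[list[int]], item: int) -> float:
    """Percent of students answering the item correctly (0-100)."""
    correct = sum(row[item] for row in responses)
    return 100.0 * correct / len(responses)

def item_discrimination(responses: list[list[int]], item: int) -> float:
    """Upper-minus-lower index: proportion correct among top total scorers
    minus proportion correct among bottom scorers (here, top/bottom 27%)."""
    ranked = sorted(responses, key=sum, reverse=True)
    n = max(1, round(0.27 * len(ranked)))
    upper = sum(row[item] for row in ranked[:n]) / n
    lower = sum(row[item] for row in ranked[-n:]) / n
    return upper - lower  # positive values increase test reliability

# Toy data: 5 students x 3 items
data = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 0, 0]]
print(item_difficulty(data, 0))      # -> 60.0
print(item_discrimination(data, 0))  # -> 1.0
```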

  22. Figure 8.4 The relationship between item discrimination and difficulty • Moderate difficulty maximizes discrimination

  23. Sources of Written Tests • Professionally Constructed Tests (FitSmart, Ask-PE) • Textbooks (McGee & Farrow, 1987) • Periodicals, Theses, and Dissertations

  24. Questionnaires • Determine the objectives • Delimit the sample • Construct the questionnaire • Conduct a pilot study • Write a cover letter • Send the questionnaire • Follow-up with non-respondents • Analyze the results and prepare the report

  25. Constructing Open-Ended Questions • Advantages: • Allow for creative answers • Allow the respondent to detail answers • Can be used when the set of possible responses is large • Probably better when complex questions are involved • Disadvantages: • Analysis is difficult because of non-standard responses • Require more respondent time to complete • Can be ambiguous • Can result in irrelevant data

  26. Constructing Closed-Ended Questions • Advantages: • Easy to code • Result in standard responses • Usually less ambiguous • Ease of response relates to increased response rate • Disadvantages: • Frustration if the correct category is not present • Respondent may choose an inappropriate category • May require many categories to capture ALL responses • Subject to possible recording errors

  27. Factors Affecting the Questionnaire Response • Cover Letter: be brief and informative • Ease of Return: you DO want it back! • Neatness and Length: be professional and brief • Inducements: money and flattery • Timing and Deadlines: time of year and sufficient time to complete • Follow-up: at least once (two follow-ups yield about the best response rate you will get)

  28. The BIG Issues in Questionnaire Development • Reliability: consistency of measurement • Stability reliability: 2-4 weeks between administrations • Validity: truthfulness of response • Good items, expert review, pilot testing, confidentiality/anonymity • Representativeness of the sample: to whom can you generalize?
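
Stability reliability is typically estimated by correlating the two administrations given 2-4 weeks apart; a small sketch with made-up scores:

```python
from statistics import correlation  # Python 3.10+

# Hypothetical scores from the same respondents, 2-4 weeks apart
first  = [14, 18, 11, 20, 16, 13]
second = [15, 17, 12, 19, 15, 14]

# Pearson r between administrations estimates stability reliability
print(round(correlation(first, second), 2))
```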

  29. Cognitive Assessments II • Ask for clarity on something that challenged you today
