
Setting Up Learning Objectives and Measurement for Game Design


Presentation Transcript


  1. Setting Up Learning Objectives and Measurement for Game Design Girlie C. Delacruz and Ayesha L. Madni Serious Play Conference Los Angeles, CA – July 21, 2012

  2. Overview • Assessment Validity • Components of Assessment Architecture • Create an assessment architecture (Your Example)

  3. What is so hard?

  4. What are some of your challenges?

  5. Passed the Game

  6. Gameplay Log data Domain

  7. Challenges We Have • Translating objectives into assessment outcomes • Purpose of assessment information • Communication between designers and educators • Game is already developed; need to assess its effectiveness • Cannot change the code; can only add wraparound measures

  8. How can we meet the challenge?

  9. Front-end Efforts Support Effectiveness • Instructional requirements • Assessment requirements • Technology requirements

  10. Model-Based Engineering Design Communication Collaboration

  11. Model-Based Engineering Design

  12. Part One Assessment Validity

  13. What Is Assessment? Assessment (noun) = Test

  14. Assessment As A Verb Process of drawing reasonable inferences about what a person knows by evaluating what they say or do in a given situation. = ASSESSMENT

  15. Games As Formative Assessment Formative Assessment: Use and interpretation of task performance information with the intent to adapt learning, such as by providing feedback. (Baker, 1974; Scriven, 1967)

  16. Games As Formative Assessment Games as Formative Assessment: Use and interpretation of game performance information with the intent to adapt learning, such as by providing feedback.

  17. What is Validity?

  18. Assessment Validity as a Quality Judgment Legal Judgment Critical Analysis Scientific Process

  19. Assessment Validity Bringing evidence and analysis to evaluate the propositions of the interpretive argument. (Linn, 2010) = ASSESSMENT VALIDITY

  20. How Does This Relate to Design? • Identification of the inferences to be made. • What do you want to be able to say? • Specificity about the expected uses and users of the learning system. • Define the boundaries of the training system • Determine the need for supplemental resources • Translate into game mechanics • Empirical analysis of judgments of performance within the context of these assumptions.

  21. What do you want to be able to say about the gameplayer(s)? Player mastered the concepts. How do you know? Because they did x, y, z (player history) Because they can do a, b, c (future events)

  22. Identify Key Outcomes: Defining Success Metrics • Quantitative Criteria (Generalizable) • % of successful levels/quests/actions • Progress into the game • Changes in performance • Errors • Time spent on similar levels • Correct moves • Qualitative Criteria (Game-specific) • Patterns of gameplay • Specific actions
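
The quantitative, generalizable criteria listed above can be computed directly from a gameplay log. Below is a minimal sketch, assuming a flat per-level log; the field and function names (LevelAttempt, success_rate, error_trend) are hypothetical and not taken from the slides.

```python
# Minimal sketch (not from the slides) of generalizable success metrics
# computed over a flat gameplay log. All field names are hypothetical.
from dataclasses import dataclass

@dataclass
class LevelAttempt:
    level_id: str
    success: bool
    duration_s: float   # time spent on the level, in seconds
    errors: int
    correct_moves: int

def success_rate(log: list[LevelAttempt]) -> float:
    """% of successful levels/quests/actions."""
    return 100.0 * sum(a.success for a in log) / len(log) if log else 0.0

def progress(log: list[LevelAttempt], total_levels: int) -> float:
    """Progress into the game: fraction of distinct levels attempted."""
    return len({a.level_id for a in log}) / total_levels

def error_trend(log: list[LevelAttempt]) -> float:
    """Change in performance: mean errors in the second half of play
    minus mean errors in the first half (negative = improvement)."""
    half = len(log) // 2
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean([a.errors for a in log[half:]]) - mean([a.errors for a in log[:half]])
```

The qualitative, game-specific criteria (patterns of gameplay, specific actions) still require detectors written for the particular game; the sketch only covers the generalizable side.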

  23. Measurement model layers (diagram) • BACKGROUND LAYER: prior knowledge, game experience, age, sex, language proficiency • CONSTRUCT LAYER: construct, subordinate constructs, and their inter-dependencies • INDICATOR LAYER: behavioral evidence of the construct (e.g., motion, duration, direction, speed) • FUNCTION LAYER: fn(e1, e2, e3, ...; s1, s2, s3, ...) computes an indicator value given raw events and game states • EVENT LAYER: player behavior and game states (e1, e2, e3, ...; s1, s2, s3, ...)

  24. General Approach • Derive structure of measurement model from ontology structure • Define “layers” • Background: Demographic and other variables that may moderate learning and game performance • Construct: Structure of knowledge dependencies • Indicator: Input data (evidence) of construct • Function: Set of functions that operate over the raw event stream to compute an indicator value • Event: Atomic in-game player behaviors and game states • Assumptions • Chain of reasoning among the layers is accurate
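
To make the event → function → indicator chain concrete, here is an illustrative sketch; the event name ("move") and the indicator ("speed") are assumptions invented for the example, not part of the presentation.

```python
# Illustrative sketch of the event -> function -> indicator chain described
# above. Event names and the chosen indicator are assumptions.
from dataclasses import dataclass

@dataclass
class GameEvent:
    t: float        # timestamp (event layer: raw player behavior / game state)
    name: str       # e.g. "move"
    payload: dict   # game state attached to the event

def speed_indicator(events: list[GameEvent]) -> float:
    """Function layer: fn(e1, e2, ...; s1, s2, ...) -> indicator value.
    Here the indicator is average distance covered per second."""
    moves = [e for e in events if e.name == "move"]
    if len(moves) < 2:
        return 0.0
    distance = sum(e.payload.get("distance", 0.0) for e in moves)
    elapsed = moves[-1].t - moves[0].t
    return distance / elapsed if elapsed > 0 else 0.0

# Indicator layer: the returned value is behavioral evidence that feeds the
# construct layer (e.g., a mastery estimate), moderated by background-layer
# variables such as prior knowledge or game experience.
```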

  25. Part Two Assessment Architecture

  26. Components of Assessment Architecture • DOMAIN REPRESENTATION • instantiates domain-specific information and practices • guides development • allows for external review • COGNITIVE DEMANDS • defines targeted knowledge, skills, abilities, practices • domain-independent descriptions of learning • TASK SPECIFICATIONS • defines what the students do (tasks/scenarios, materials, actions) • defines rules and constraints • defines scoring

  27. Cognitive Demands • What kind of thinking do you want to capture? • Adaptive, complex problem solving • Conceptual, procedural, and systemic learning of content • Transfer • Situation awareness and risk assessment • Decision making • Self-regulation • Teamwork • Communication

  28. Domain Representation • External representation(s) of domain-specific models • Defines universe (or boundaries) of what is to be learned and tested

  29. Example: Math Ontologies Knowledge specifications Item specifications

  30. Task Specifications • Operational statement of content and behavior for a task • Content = stimulus/scenario (what will the users see?) • Behavior = what the student is expected to do / response (what will the users do?) • Content limits • Rules for generating the stimulus/scenario posed to the student • Permits systematic generation of scenarios with similar attributes • Response descriptions • Maps user interactions to cognitive requirements
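
One way to read "content limits" is as generation rules that hold task attributes constant. The sketch below is a hedged illustration: the domain (single-digit addition, echoing the math-ontology example above), the function name, and all parameters are assumptions, not anything prescribed by the slides.

```python
# Hypothetical sketch: content limits expressed as a generation rule, so every
# scenario drawn from the rule shares the same attributes (here, difficulty).
import random

def generate_addition_scenarios(n_items: int, max_operand: int = 9,
                                seed: int | None = None) -> list[dict]:
    """Content limit: operands are bounded by max_operand, so generated
    scenarios are systematically similar in difficulty."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n_items):
        a, b = rng.randint(0, max_operand), rng.randint(0, max_operand)
        scenarios.append({
            "stimulus": f"{a} + {b} = ?",             # content: what the user sees
            "expected_response": a + b,               # behavior: what the user does
            "cognitive_requirement": "procedural: single-digit addition",
        })
    return scenarios
```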

  31. Components of Computational Model

  32. Components of Decision Model Courses of Action • Do nothing: move on, end task • Get more evidence or information: repeat same task, perform similar task, ask a question • Intervene (instructional remediation): give elaborated feedback, worked example or add scaffolding, more supporting information • Intervene (task modification): new task (reduced or increased difficulty), new task (qualitatively different)

  33. Components of Decision Model Decision Factors • Confidence of diagnosis: How certain are we about the hypothesized causal relation? • Consequence of misdiagnosis: What happens if we get it wrong? What are the implications of ignoring other possible states or causal relations? • Effectiveness of intervention: How effective is the intervention we will give after diagnosis? • Constraints: Do we have efficiency concerns with respect to time or resource constraints?
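
A minimal sketch of how these decision factors might gate the courses of action from the previous slide is shown below; the 0–1 scales, thresholds, and the function name are illustrative assumptions, not a prescribed policy.

```python
# Sketch only: scales and thresholds are assumptions chosen for illustration.
def choose_course_of_action(diagnosis_confidence: float,        # 0..1
                            misdiagnosis_cost: float,           # 0..1
                            intervention_effectiveness: float,  # 0..1
                            seconds_remaining: float) -> str:
    if seconds_remaining < 30:
        return "do nothing: move on"                        # constraints dominate
    if diagnosis_confidence < 0.5:
        return "get more evidence: perform a similar task"  # uncertain diagnosis
    if misdiagnosis_cost > 0.8 and diagnosis_confidence < 0.9:
        return "get more evidence: ask a question"          # costly to get wrong
    if intervention_effectiveness >= 0.5:
        return "intervene (remediation): give elaborated feedback"
    return "intervene (task modification): new task with reduced difficulty"
```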

  34. Part Three Assessment Architecture (Your Example)

  35. Assessment Architecture (diagram) Fixed variables, with assumptions and design rationale: task characteristics + context (test, simulation, game) + person (prior knowledge and experience)

  36. Assessment Architecture (diagram) Fixed variables → observed event(s), the performance to be assessed. What happened? (Raw data, scored information?)

  37. Assessment Architecture (diagram) Fixed variables → observed event(s) → translation → judgment of performance. What happened? What does this mean?

  38. Assessment Architecture (diagram) Fixed variables → observed event(s) → translation → inferences (assessment validation). What are the potential causes of the observed events? Characteristics of the task? Context? Lack of knowledge? Not sure?
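
The chain in slides 35–38 (fixed variables plus observed events, translated into a judgment, then into candidate inferences) can be written down as a small pipeline. The sketch below is an assumption-laden illustration: the scoring rule, field names, and cause labels are invented for the example.

```python
# Sketch of slides 35-38: what happened? -> what does it mean? -> what are the
# potential causes? All names and rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class FixedVariables:
    task_characteristics: dict   # e.g. {"difficulty": "high"}
    context: str                 # "test", "simulation", or "game"
    person: dict                 # prior knowledge, game experience, ...

def translate(raw_events: list[dict]) -> float:
    """What happened? Score the raw event stream (here: fraction correct)."""
    if not raw_events:
        return 0.0
    return sum(bool(e.get("correct")) for e in raw_events) / len(raw_events)

def infer_causes(score: float, fixed: FixedVariables) -> list[str]:
    """What are the potential causes of the observed events?"""
    if score >= 0.8:
        return ["evidence consistent with mastery of the targeted construct"]
    causes = ["lack of knowledge of the targeted construct"]
    if fixed.task_characteristics.get("difficulty") == "high":
        causes.append("characteristics of the task, not the player's knowledge")
    if fixed.person.get("game_experience") == "low":
        causes.append("unfamiliarity with the game context, not the content")
    causes.append("not sure: collect more evidence before intervening")
    return causes
```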

  39. Potential Courses of Action • No intervention: move on, end task • Get more evidence or information: repeat same trial, perform similar task, ask a question • Intervene (instructional remediation): add scaffolding, give elaborated feedback, more information, worked example • Intervene (task modification): new task with reduced difficulty
