
KPIs for WP3


Presentation Transcript


  1. KPIs for WP3

  2. Outlook to Year 4 • Validation and update of contributions to the GALA roadmap • Final Thematic Workshops: disseminate SIG outcomes; validate recommendations; feedback from domain experts • Consolidation: finalization of ongoing user studies; evaluate SGs for the VRE; transition to the SGS • Scientific dissemination

  3. Related to the final thematic workshops • # participants • New relations established, new involved partners • … • Related to publications, special issues, invitations • Related to the upcoming evaluation methodology???

  4. Comprehensive SG evaluation framework in WP3. Used for the user studies in Year 3.

  5. A comprehensive model was devised, considering the outcomes that emerged from the reports and following the analysis performed in previous years of the strategies implemented and the data collected. A long-term assessment is included by means of post-game interviews and post-game learning communities. Granularity: e.g. specific levels in Bloom's taxonomy, not only the cognitive domain. For each game, a subset of the evaluation steps has been used.

  6. Learning Impact Evaluation • What to measure? The educational content (domain-specific) and the educational goal (new skills / new knowledge / competence acquisition / behavioural change / raising awareness, …), in order to verify the achievement of the expected outcomes • How to measure? In order to verify whether the new skills/knowledge have been acquired, the behavioural change took place, etc.

  7. Evaluation steps • Learning evaluation as an impact project: preliminary work on the serious game's fitness in the educational program, in cooperation with educators, facilitators and relevant stakeholders (e.g. HRs and CLOs in organisations) • Identification of the learning goals: raising awareness, training skills, supporting motivation, experience of collaborative learning, … • Identification of the domain-specific expected learning outcomes according to the SG Description Template (refers to Bloom's revised taxonomy) • Identification of the kind of data that can be collected, based on game deployment, on privacy and ethical settings and requirements, and on available data analysis tools / instruments • Identification of the learning metrics, based on matching the learning goals with the variety of data available (see the sketch below) • Set-up of the Post Game Learning Community by populating it with preliminary material for raising awareness or providing insights about the subject matter
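These planning steps can be captured in a simple artefact. Below is a minimal, hypothetical Python sketch (the names EvaluationPlan and check_plan are ours, not part of any WP3 tooling) that records the output of each step and flags learning goals left without a metric:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """Planning artefact for one serious game, mirroring the steps above."""
    game: str
    setting: str                       # e.g. "formal curriculum", "informal"
    learning_goals: list[str]          # raising awareness, training skills, ...
    expected_outcomes: dict[str, str]  # Bloom domain/level -> concrete outcome
    collectable_data: list[str]        # constrained by deployment, privacy, ethics
    learning_metrics: dict[str, str]   # learning goal -> metric computable from the data

def check_plan(plan: EvaluationPlan) -> list[str]:
    """Return the goals no metric covers, so gaps surface before the study runs."""
    return [g for g in plan.learning_goals if g not in plan.learning_metrics]

# Illustrative values, loosely based on the Icura study described later:
icura = EvaluationPlan(
    game="Icura",
    setting="informal",
    learning_goals=["raising awareness", "new knowledge"],
    expected_outcomes={
        "cognitive / remembering": "recall the Japanese words for hello, goodbye, ...",
        "affective / valuing": "value the principles of Japanese culture",
    },
    collectable_data=["demographics", "play time", "achievements", "exploration"],
    learning_metrics={"new knowledge": "pre/post vocabulary test score delta"},
)
print(check_plan(icura))  # -> ['raising awareness']: goal still without a metric
```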

  8. Evaluation steps • Training the trainers • Pre-game test (questionnaires / interviews) (perceived & objective) • (If facilitated workshop) Briefing • Game session • Direct observation • (If facilitated workshop) De-briefing • Post-game test (questionnaires / interviews) (objective) • Just after the game session (perceived): • Enjoyment of the game (Kirkpatrick's Level 1) • What you learnt & to which extent, performance perception (Kirkpatrick's Level 2) • Some time (e.g. up to one year) after the game session (Kirkpatrick's Level 2) • Technology Acceptance Model (Davis, 1986) • Testing "on the job" / direct observations (Kirkpatrick's Level 3) • Post Game Learning Community (Kirkpatrick's Level 3)
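The pre-/post-test pair in this protocol is what makes a Kirkpatrick Level 2 claim testable. A minimal sketch of how the paired scores could be compared, assuming numeric questionnaire scores per participant (the data and the choice of a paired t-test are illustrative, not prescribed by the framework):

```python
# Paired comparison of pre- and post-game test scores (Kirkpatrick Level 2).
from scipy import stats

pre  = [4, 6, 5, 3, 7, 5, 4, 6]   # hypothetical pre-game knowledge scores
post = [6, 7, 7, 5, 8, 6, 6, 7]   # same participants, after the game session

t, p = stats.ttest_rel(post, pre)          # paired t-test on the score deltas
gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {gain:.2f}, t = {t:.2f}, p = {p:.4f}")
```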

  9. User Studies

  10. Example – the Icura user study

  11. Evaluation steps • Learning evaluation as an impact project: preliminary work on the serious game's fitness in the educational program, in cooperation with educators, facilitators and relevant stakeholders (e.g. HRs and CLOs in organisations): NO, INFORMAL SETTING • Learning goals (raising awareness, training skills, supporting motivation, experience of collaborative learning, …): RAISING AWARENESS, NEW KNOWLEDGE • Identification of the relevant domain-specific expected outcomes according to the SG Description Template (refers to Bloom's revised taxonomy): COGNITIVE: remembering the Japanese words for hello, goodbye, …; applying the correct salutation; AFFECTIVE: valuing the principles of Japanese culture, … • Identification of the kind of data that can be collected, based on game deployment, on privacy and ethical settings and requirements, and on available data analysis tools / instruments: various users; no in-game tools; user data = demographics, game competence, Japanese familiarity; game data = time, achievements, exploration; data about the gaming experience = perceived usability, engagement, …; data about the learning experience = perceived effectiveness, feedback, motivation, … (SG Description Template) • Identification of the learning metrics, based on matching the learning goals with the variety of data available: see Table
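Since Icura offers no in-game assessment tools, the learning metrics have to be derived from the logged game data listed above. A hypothetical sketch of that derivation (all field names are invented; the actual log format is not specified in the slides):

```python
# Derive Icura-style metrics from one player's session log (invented fields).
def icura_metrics(session: dict) -> dict:
    return {
        # cognitive / remembering: share of salutation quiz items answered correctly
        "salutation_recall": session["quiz_correct"] / session["quiz_total"],
        # exploration as a proxy for engagement with the cultural content
        "exploration_ratio": len(session["areas_visited"]) / session["areas_total"],
        "achievements": len(session["achievements"]),
        "play_time_min": session["play_seconds"] / 60,
    }

print(icura_metrics({
    "quiz_correct": 7, "quiz_total": 10,
    "areas_visited": ["dojo", "market", "temple"], "areas_total": 5,
    "achievements": ["first_bow"], "play_seconds": 1500,
}))
```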

  12. Evaluation steps • Set-up of the Post Game Learning Community by populating it with preliminary material for raising awareness or providing insights about the subject matter • Training the trainers – just game play • Pre-game questionnaires / interviews • Facilitated workshop • Briefing – just an introduction • Game session • Direct observation • De-briefing • Post-game questionnaire / interviews – taking care to avoid selective attention bias • Just after the game session: • Enjoyment of the game (Kirkpatrick's Level 1) • What you learnt & to which extent, performance perception (Kirkpatrick's Level 2) • Some time (e.g. up to one year) after the game session (Kirkpatrick's Level 2) • Technology Acceptance Model (Davis, 1986) • Testing "on the job" / direct observations (Kirkpatrick's Level 3) – repetition of the post-test • Post Game Learning Community (Kirkpatrick's Level 3)

  13. VRGB training for rehabilitation

  14. Evaluation steps • Learning evaluation as an impact project: preliminary work on the serious game's fitness in the educational program, in cooperation with educators, facilitators and relevant stakeholders (e.g. HRs and CLOs in organisations) • Learning goals beyond the domain-specific objectives (raising awareness, training skills, supporting motivation, experience of collaborative learning, …): TRAINING MOTOR SKILLS, DEVELOPING SELF-CONFIDENCE • Identification of the relevant learning goals / outcomes according to the SG Description Template (refers to Bloom's revised taxonomy) • Identification of the kind of data that can be collected, based on game deployment, on privacy and ethical settings and requirements, and on available data analysis tools / instruments: users = students; Nintendo Wii controller; dominant-hand performance as the participant's own control condition; # strokes on target • Identification of the learning metrics, based on matching the learning goals with the variety of data available: # strokes on target; perceived skill and self-efficacy improvement • Set-up of the Post Game Learning Community by populating it with preliminary material for raising awareness or providing insights about the subject matter

  15. Evaluation steps • Training the trainers • Pre-game questionnaires / interviews: self-efficacy rating questionnaire; Edinburgh Handedness Inventory; baseline assessment in the real world: forehand and backhand shots with the dominant and non-dominant hand • Facilitated workshop • Briefing: introductory video; familiarization session: target shooting on the game console • Game session: 3 sessions of 30 minutes each; with both hands; fixed and moving targets • Direct observation • De-briefing • Post-game questionnaire / interviews: repeated assessment in the real world and self-efficacy questionnaire • Just after the game session: • Enjoyment of the game (Kirkpatrick's Level 1) • What you learnt & to which extent, performance perception (Kirkpatrick's Level 2) • Some time (e.g. up to one year) after the game session (Kirkpatrick's Level 2) • Technology Acceptance Model (Davis, 1986) • Testing "on the job" / direct observations (Kirkpatrick's Level 3) • Post Game Learning Community (Kirkpatrick's Level 3)
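The design above uses the dominant hand as each participant's own control, so the analysis reduces to comparing the strokes-on-target trend between hands across the three sessions. A sketch with invented numbers:

```python
# Strokes on target per session, per hand (hypothetical data for one participant).
strokes_on_target = {
    "dominant":     [18, 19, 20],   # control condition: little room to improve
    "non_dominant": [ 9, 13, 16],   # trained condition
}

def improvement(series: list[int]) -> int:
    """Gain from the first to the last session."""
    return series[-1] - series[0]

for hand, series in strokes_on_target.items():
    print(f"{hand}: {series} -> improvement {improvement(series)}")
# A larger gain for the non-dominant hand than for the dominant (control) hand
# is the pattern this study design is set up to detect.
```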

  16. Reviewers’ comment With regard to WP3, the priority should be on analysing what has been learned to date. A framework for evaluating SGs would be an important contribution. It might be worthwhile to apply that framework to SGs evaluated by others to see how the outcomes of the GaLA evaluation framework compare with what has already been reported in the literature. Conducting such an analysis would help to frame how the GaLA evaluation framework is different from what is being done by others, and what the GaLA evaluation framework contributes to the larger community.

  17. Framework for designing the evaluation experiment for a SG (aim: evaluating its learning impact). A tool helping developers to set up the user study to evaluate their game.

  18. Learning Impact Evaluation • What to measure? The educational content (domain-specific) and the educational goal (new skills / new knowledge / competence acquisition / behavioural change / raising awareness, …), in order to verify the achievement of the expected outcomes • How to measure? In order to verify whether the new skills/knowledge have been acquired, the behavioural change took place, etc.

  19. Reference Models • What to measure? To verify the achievement of the expected (domain-specific) outcomes: • Educational content (domain-specific): a specific model of the domain (e.g. a taxonomy) • Educational goal (new skills / new knowledge / competence acquisition / behavioural change / raising awareness, …): a validated model of learning outcomes (e.g. Bloom's taxonomy, Kirkpatrick's model, …) and a validated list of the possible educational goals of a SG: ???? • How to measure? To verify whether the new skills/knowledge have been acquired, the behavioural change took place, etc.: validated "rules" to match expected outcomes to the right way to measure them: ??? (e.g. Kirkpatrick's examples)
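The missing "validated rules" slot could start life as a plain lookup from educational goal to suggested measurement method. A hypothetical sketch; the entries only paraphrase the examples used elsewhere in these slides and are not a validated model:

```python
# Draft lookup: educational goal -> suggested way to measure it (to be validated).
HOW_TO_MEASURE = {
    "raise awareness":    "post-game questionnaire + Post Game Learning Community",
    "new knowledge":      "pre/post objective test (Kirkpatrick Level 2)",
    "new skills":         "in-game performance metrics + testing on the job (Level 3)",
    "behavioural change": "direct observation over time (Level 3)",
}

def suggest(goal: str) -> str:
    return HOW_TO_MEASURE.get(goal, "no validated rule yet: contribution needed")

print(suggest("new knowledge"))
print(suggest("competence acquisition"))  # -> gap, exactly what the ??? marks
```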

  20. [Slide shows two diagrams: Bloom's taxonomy and Kirkpatrick's Four Levels of Learning Evaluation Model]
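For reference, a small sketch enumerating Kirkpatrick's four levels and the instruments the preceding slides assign to each; the level names are the standard ones, and the mapping simply restates slides 8, 12 and 15 (the user studies stop at Level 3):

```python
from enum import IntEnum

class Kirkpatrick(IntEnum):
    """Kirkpatrick's Four Levels of Learning Evaluation."""
    REACTION  = 1   # did participants enjoy the game?
    LEARNING  = 2   # what did they learn, and to what extent?
    BEHAVIOUR = 3   # do they apply it on the job / over time?
    RESULTS   = 4   # wider organisational impact (not targeted in these studies)

instruments = {
    Kirkpatrick.REACTION:  ["enjoyment questionnaire"],
    Kirkpatrick.LEARNING:  ["pre/post tests", "performance perception"],
    Kirkpatrick.BEHAVIOUR: ["testing on the job", "Post Game Learning Community"],
}
print(instruments[Kirkpatrick.LEARNING])
```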

  21. Suggestion • Try to match our steps & Igor's framework into an abstract, comprehensive flow • For each step/phase in the flow, provide a set of validated models and suggestions for use • Need contributions!!!

  22. WP3 activities in Y4 • Eval methodology • Final thematic workshops • Input to the roadmap & dissemination • SG descriptions • Monitoring the field & updating web pages • NEED PRIORITIES!!!

  23. Best practices in cooperation

  24. Best practices in cooperation in WP3 • Papers • Associate partners (SG descriptions & papers) • Liaisons (SIG3.3 & games4health, SIG3.6 & games4change) • Final workshops • User studies: CEDEP & ESADE, BIBA & POLIMI, CMRE & MAN, CNR & Serious Games Interactive, CNR & ORT & RWTH, …
