13th International Conference on Knowledge Management and Knowledge Technologies (i-KNOW), 4-6 September 2013, Graz, Austria, EU-Day Synergy Workshop. Immersive Reflective Experience-based Adaptive Learning. Evaluation in ImREAL: Lessons Learned from a Project on Augmenting Experiential Training Simulators. Christina M. Steiner, Gudrun Wesiak, & Dietrich Albert. Cognitive Science Section, Knowledge Management Institute, Graz University of Technology
Overview • Intro • Evaluation in ImREAL • Challenges and Solutions • Special Types of Evaluation • Wrap Up
Intro • Evaluation is a natural part of TEL projects • Defining evaluation mechanisms and metrics at proposal stage • Interests of the scientific and industrial community • Evaluation goals: • Demonstrating the added value of the developed learning technologies • Collecting information for advancing the technologies • Testing and evaluating in a reproducible way • Allowing comparison of research results
Evaluation in ImREAL • Project overview: Simulated Environment for Learning • Affective Meta-cognitive Scaffolding • Pedagogy – Use Cases • Evaluation – User Trials • Augmented User Modelling • Making Sense of Digital Traces • Integration Framework
Evaluation in ImREAL Overall Goal: Designing and carrying out methodologically sound evaluations of the project outcomes and the project itself Main Objectives: • Definition of a scientifically sound evaluation methodology • User trial evaluations
Evaluation in ImREAL • Evaluation phases aligned with the project and development phases (Years 1-3: first, second, and final versions of services and demonstrators) • Baseline Evaluation: collecting benchmark data, informing design • Phase 1 Evaluation – Formative Evaluation: initial and potential benefits, informing next development cycle • Phase 2 Evaluation – Formative and Summative Evaluation: quality of second version of ImREAL technology, informing final refinements in development
Evaluation in ImREAL • Simulator Maturation Stages • Simulator creation • Creating a new simulation learning scenario • Simulator extension • Extending an existing simulation to support learning • Simulator integration • Integrating a simulator in a training environment
Evaluation in ImREAL • Test beds • Two use cases and simulators with different focus • Medical interview training • Mature simulation • Integrated in a concrete training context • Focus on simulator extension (Phase 1), broadening to simulation creation (Phase 2) • Business simulation and buddy program • Creation of new simulation scenarios • Focus on simulator creation in Phase 1, on simulator extension in Phase 2
Evaluation in ImREAL • Evaluation dimensions • Aligning evaluation methodology with • Goals of evaluation phase • Stakeholders of maturation stage • Research focus of use case • [Diagram: evaluation dimensions spanning Simulator Maturation Process (creation, extension, integration), Use Case (use case A, use case B), and Evaluation Phase (baseline, phase 1, phase 2 evaluation)]
Challenges & Solutions • Evaluation of ImREAL services, not the simulators • Formulation of evaluation question - analyses of expected benefits of services • Combination of evaluation studies • Specifically targeted on isolated services • Integrated in test beds and simulators • Incorporation of baseline evaluations • Sound investigation of service benefits
Challenges & Solutions • Evaluation methodology vs. evaluation practice • Evaluation plans before development is finished • Feasibility • Timing • Involvement of target and external stakeholders • Small numbers of experts • Academia vs. industry
Challenges & Solutions • Evaluation of user model augmentation • Users’ acceptance of exploiting their social digital traces • Evaluation of learning effects • Difficulty to demonstrate knowledge gain • Considering learning effects and experience in a more comprehensive way • Study learning activity rather than purely subjective reports • Analysing learners’ real activities with the simulators
Special Types of Evaluation • Cost-benefit analysis • Costs of integrating the ImREAL services • How can these costs be balanced by the benefit/added value? • ‘Perspective of the market’ • Internal project evaluation • Formative review of research questions and formative evaluation of the project • Evaluation approach: • Interdisciplinary dialogue • Questionnaire
Wrap-up • Evaluation should be an inherent part of the iterative design process • Evaluation of sub-components, as well as integration in real-world settings • Mind the gap: Realistic evaluation planning • Mixed-method approach combining objective data with subjective self-reports (see the sketch below) • Evaluation as dissemination activity • Cost-effectiveness and readiness for the market • Evaluation as an instrument to stimulate reflection on the project
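To illustrate the mixed-method point above, the following sketch relates an objective, log-derived metric (e.g. time on task) to a subjective self-report rating using a rank correlation; the numbers and variable names are purely hypothetical and only show how the two data sources could be combined.

from scipy.stats import spearmanr  # rank correlation, suitable for ordinal ratings

# Hypothetical paired data per learner: objective metric from simulator logs
# vs. subjective self-report (e.g. perceived learning on a 1-5 scale).
time_on_task_min = [12.2, 8.5, 15.0, 6.3, 11.1, 9.8]
perceived_learning = [4, 3, 5, 2, 4, 3]

rho, p_value = spearmanr(time_on_task_min, perceived_learning)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

Agreement between the two sources strengthens the evaluation claim; divergence is itself informative, pointing to effects that self-reports alone would miss.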
Highlights • Evaluation as inherent part of iterative design process • Mind the gap: Realistic evaluation planning • Mixed-method approach combining objective data with subjective self-reports