  1. CERES: Cognitive Expertise through Repetition-Enhanced Simulation • HPT&E Technical Review • Decision Making & Expertise Development • Paul J. Reber, Ph.D. • Northwestern University • Feb 20, 2019

  2. Objective • Accelerate the development of expertise in a cognitive skill by training with large numbers of repetitions in a simulation-based protocol • Target domain: Reading and understanding topographic maps

  3. Background • Training typically provides very few repetitions connecting a complex topographic map to the world around the trainee • Rules are explained through classroom instruction • Expertise develops slowly through experience • Research Foundations: implicit learning depends on practice • Eventually produces automatic, habit-like execution of learned skills • Applications: land navigation, decision-making from topographic features

  4. Overall Approach • Develop map training protocol with naïve NU community participants • Underway • Field test with military personnel • Software testing, SME feedback • NU NROTC personnel • Field testing (e.g., Quantico) • Quantify training benefit on orienteering assessment within Land Navigation training
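
The training benefit named above would presumably be quantified as a pre/post gain on the orienteering assessment. A minimal sketch of such an analysis, assuming per-participant accuracy scores from the 10-trial no-feedback assessments; the function and the sample data are illustrative, not the project's analysis code:

```python
# Hedged sketch: paired pre/post comparison of assessment accuracy.
# Scores are assumed to be proportion correct on the 10 no-feedback
# assessment trials; all numbers below are made-up illustrations.
from scipy import stats

def training_benefit(pre_scores, post_scores):
    """Mean per-participant gain plus a paired t-test on the same scores."""
    t, p = stats.ttest_rel(post_scores, pre_scores)
    gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
    return gain, t, p

gain, t, p = training_benefit([0.3, 0.4, 0.2, 0.5], [0.6, 0.8, 0.5, 0.7])
print(f"mean gain = {gain:.2f}, t = {t:.2f}, p = {p:.3f}")
```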

  5. Training Approach • Procedural content • Random world surfaces • Topographic map/video pairings • STATE software (Charles River Analytics, Niehaus) • Training protocol • Identify facing orientation on the map from video • 30 s trials with feedback • Repeated experience with each map until it is understood • Assessment approach • Pre/post training: 10 trials, no feedback
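
To make the trial structure concrete, here is a self-contained sketch of how a facing-orientation response inside the 30 s window might be scored before feedback. The angular tolerance and function names are assumptions for illustration, not the STATE software's actual parameters:

```python
# Sketch of per-trial scoring for "identify facing orientation on the map."
TRIAL_SECONDS = 30     # response window per trial (from the slide)
TOLERANCE_DEG = 22.5   # assumed tolerance: one 16-wind compass sector

def heading_error(response_deg: float, true_deg: float) -> float:
    """Smallest angular difference between reported and true heading."""
    diff = abs(response_deg - true_deg) % 360
    return min(diff, 360 - diff)

def score_trial(response_deg: float, true_deg: float) -> bool:
    """True if the reported facing direction falls within tolerance."""
    return heading_error(response_deg, true_deg) <= TOLERANCE_DEG

print(score_trial(350.0, 10.0))   # True: only 20 degrees off, across north
print(score_trial(90.0, 200.0))   # False: 110 degrees off
```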

  6. Map Training • Explicit Training • Pretest • High-repetition Training • Advance to new map when accuracy is high • Increase difficulty of task • Posttest • Questionnaire
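
The advancement rule on this slide (move to a new map once accuracy is high, then raise difficulty) can be sketched as a moving-window criterion. The window size and threshold below are assumed values, not the protocol's tuned settings:

```python
# Hedged sketch of adaptive map progression during high-repetition training.
from collections import deque

WINDOW = 10              # assumed: judge accuracy over the last 10 trials
ADVANCE_THRESHOLD = 0.8  # assumed: advance at 80% correct in the window

class MapProgression:
    def __init__(self):
        self.recent = deque(maxlen=WINDOW)
        self.map_index = 0
        self.difficulty = 1

    def record(self, correct: bool) -> None:
        self.recent.append(correct)
        if len(self.recent) == WINDOW and sum(self.recent) / WINDOW >= ADVANCE_THRESHOLD:
            self.map_index += 1   # advance to a new map
            self.difficulty += 1  # increase task difficulty
            self.recent.clear()

prog = MapProgression()
for outcome in [True] * 10:
    prog.record(outcome)
print(prog.map_index, prog.difficulty)  # 1 2
```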

  7. Explicit Training Module • Identify terrain features with a topographic map • Modeled after APACTS (SoarTech) • Learning Objective 10

  8. High-Repetition Map Training • Task: Where are you facing on the map? • Screen layout: First-person View | Topographic Map

  9. Response Phase

  10. Feedback Phase

  11. Simultaneous Task Module • Maps and videos on screen at same time • Response and Feedback while terrain in view

  12. Sequential Task Module • Map and video presented separately • Must hold terrain info in working memory • Phases: Study → Compare → Decide → Feedback

  13. Sequential Task Module • Phases: Study → Compare → Decide → Feedback
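
A minimal sketch of the sequential module's trial cycle as labeled on slides 12 and 13. Which content (video vs. map) occupies each phase is an assumption here; the slides state only that map and video appear separately and that terrain info must be held in working memory:

```python
# Hedged sketch: the four-phase cycle of a sequential-task trial.
from enum import Enum

class Phase(Enum):
    STUDY = "study"        # assumed: view the first-person terrain video alone
    COMPARE = "compare"    # assumed: view the topographic map, video removed
    DECIDE = "decide"      # report facing orientation from working memory
    FEEDBACK = "feedback"  # show the correct orientation

NEXT = {Phase.STUDY: Phase.COMPARE,
        Phase.COMPARE: Phase.DECIDE,
        Phase.DECIDE: Phase.FEEDBACK,
        Phase.FEEDBACK: Phase.STUDY}  # wrap around to the next trial

phase = Phase.STUDY
for _ in range(4):
    print(phase.name)
    phase = NEXT[phase]
```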

  14. Project Technical Progress • Pilot testing (NU community) • 4 rounds, tuned difficulty, refined measurement and assessment approach • Military personnel testing • NU NROTC contact • IRB protocol under review • Portable lab testing resources in place • Software development • Identifying features for improvement • Drone-mapping protocol

  15. Training Data • Explicit Training (~8 minutes) • Simultaneous Task Module • 3-4 terrain decisions per minute (~150 total) • ~10 trials / map on average • Sequential Task Module • 1-2 terrain decisions per minute (~81 total) • ~9 trials / map on average
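
The per-module rates above could be derived from trial logs roughly as follows. The log fields (module, map_id, seconds) are a hypothetical format, not the project's actual schema:

```python
# Hedged sketch: decisions per minute and trials per map from a trial log.
from collections import defaultdict

def summarize(trials):
    """trials: dicts with 'module', 'map_id', and 'seconds' keys."""
    by_module = defaultdict(lambda: {"n": 0, "seconds": 0.0, "maps": defaultdict(int)})
    for t in trials:
        m = by_module[t["module"]]
        m["n"] += 1
        m["seconds"] += t["seconds"]
        m["maps"][t["map_id"]] += 1
    return {mod: {"decisions_per_min": 60.0 * s["n"] / s["seconds"],
                  "trials_per_map": s["n"] / len(s["maps"])}
            for mod, s in by_module.items()}

log = [{"module": "simultaneous", "map_id": 1, "seconds": 18.0},
       {"module": "simultaneous", "map_id": 1, "seconds": 15.0},
       {"module": "sequential", "map_id": 2, "seconds": 40.0}]
print(summarize(log))
```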

  16. Training Data

  17. Performance Assessments

  18. Participant Questionnaire • Good • “It feels like I got better” • “The interface was easy to use” • “The video feels quite real” • “Easier when map feature was seen multiple times” • Bad • “Sometimes the video didn’t give enough signals of where I was at” • “Hard to tell distance” • “Difficult to tell hills from valleys”

  19. Bad Maps/Videos • “Sometimes the video didn’t give enough signals of where I was at” • “Difficult to tell hills from valleys”

  20. Issues and Opportunities • Issue: Map quality • Artificial topo surfaces are difficult to understand, not particularly realistic • Course of action: software improvement to enhance procedural content generation, integration of real-world maps via drone • Opportunities for collaboration • Ongoing Land Navigation training development, decision-making based on topographic map features
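
For context on the procedural-content improvement named above, here is a generic sketch of the kind of random surface generation involved: a height field built from summed Gaussian hills and valleys, which contouring would then turn into an artificial topographic map. This illustrates the general technique under assumed parameters, not the project's generator:

```python
# Hedged sketch: random terrain as a sum of Gaussian bumps (numpy).
import numpy as np

def random_surface(size=128, n_hills=12, seed=0):
    rng = np.random.default_rng(seed)
    y, x = np.mgrid[0:size, 0:size]
    z = np.zeros((size, size))
    for _ in range(n_hills):
        cx, cy = rng.uniform(0, size, 2)
        height = rng.uniform(-1.0, 1.0)          # negative height = valley
        width = rng.uniform(size / 12, size / 4)
        z += height * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * width ** 2))
    return z

z = random_surface()
print(z.shape, round(float(z.min()), 2), round(float(z.max()), 2))
```

How such surfaces are shaped and rendered bears directly on the questionnaire complaint that hills were hard to tell from valleys.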

  21. Way Forward • Continuing pilot data collection (NU) • Extending training to multiple hours • Improving all stimuli • Engagement with instructors, trainees • SME feedback • Trainee experience testing • Applied field testing • Pending IRB, protocol stabilization, site selection • Summer 2019

  22. Acknowledgements • Northwestern University • Marcia Grabowecky, Ph.D., Kevin Schmidt, Brooke Feinstein, Ben Reuveni, Catherine Han, Ken Paller, Ph.D., Mark Beeman, Ph.D., Satoru Suzuki, Ph.D. • Captain Adam M. North, USMC • Marine Officer Instructor/Assistant Professor of Naval Science, NROTC Chicago Consortium • Charles River Analytics • James Niehaus, Ph.D., Paul Woodall, William Manning

  23. Top-Level POA&M • [Template schedule slide: timeline and example tasks left as unpopulated template placeholders]

  24. Backup slides

  25. Program Title • [Required quad-chart template slide: Objective, Military Relevance/Operational Impact, Naval/Marine Corps S&T Focus Areas, Technical Approach, schedule, and performer fields left as unpopulated template placeholders]

