
VR-Based Toolsets for Instructors — Dr. Hashim Yaqub, BMT, UK



  1. VR-Based Toolsets for Instructors — Dr. Hashim Yaqub, BMT, UK

  2. Abstract
This paper outlines the development of a VR-based toolset which allows instructors to monitor the activity of students in a virtual-training scenario. It also allows instructors to issue aid and introduce dynamic obstacles or changes in task. The tool makes use of consumer VR technologies and incorporates gaze-aided interaction as well as design principles found in games using an omniscient viewpoint.

  3. ENGAGE
• Rapid Virtual Training Prototyping
• Learning Record Store
• Tools for students and instructors
• QEC, Type 26, Caimen-90 Fast, Sub Projects

  4. Use Case – QEC
• PC Mouse + Keyboard
• Trainees/Students
  ◦ Work individually or collaboratively
  ◦ Practice, assessment or free-roam
• Trainers
  ◦ Curate Scenarios (Procedures and Emergencies)
  ◦ Observe and Interfere in Live Scenario

  5. Instructor Toolset Requirements
• Monitoring multiple virtual students
• PC or VR-based
• Consumer (non-specialist) tools
• Using standard VR kit (controllers and/or integrated hand-tracking)
• Communicating with one or more students
• Interfering in a live scenario, i.e. changing objectives or introducing emergency situations
• Automated and manual guidance (e.g. waypoints and hint beacons)
• Easy search and selection of students, objects and areas

  6. Instructor Toolset Requirements
• Comprehensive Map Interaction
  ◦ Swiftly view relevant areas on the map of the vessel
  ◦ Isolate individual levels/decks
• Selecting and Directing
  ◦ Identifying student avatars
  ◦ Selecting Individual or Group
  ◦ Communicating with Individual or Group
  ◦ Targeting, Waypointing and Direction
• Curating Scenarios
  ◦ Setting up procedures
  ◦ Setting up routes
  ◦ Setting up emergency situations
• Activity and Progress Tracking
  ◦ There should be a comprehensive UI for displaying the progress of each student and/or the task they are performing. Using this same functionality, the instructor can modify the task in real time.

  7. Omniscient VR
• God Games
• RTS/RPG
• The Sims
• XCOM
• Deism

  8. PC vs VR
• PC interface is well established
  ◦ Mouse + Keyboard
  ◦ Gaming
  ◦ Universal System Requirements
Reference: VR-Guide: A Specific User Role for Asymmetric Virtual Reality Setups in Distributed Virtual Reality Applications (2018), Horst, R., Dorner, R. & Peter, M.

  9. PC vs VR
• VR
  ◦ Can be more mobile (Oculus Quest)
  ◦ Makes use of virtual space to visualise large amounts of information
  ◦ Easier to manipulate and view 3D assets using relatively more naturalistic interaction
  ◦ Gaze-aided interaction

  10. Interaction Challenges
• VR Interaction Modalities
  ◦ Motion Controllers
  ◦ Hand-Tracking
  ◦ Gaze-Aided Interaction
  ◦ Distal Pointing
• Scenario Monitoring
  ◦ After-Action Review
  ◦ Student Progress
  ◦ Student Location
Reference: A human motor behavior model for distal pointing tasks (2010), Kopper et al.

  11. Spacetime: Enabling Fluid Individual and Collaborative Editing in Virtual Reality (2018)
Xia, H., Herscher, S., Perlin, K. & Wigdor, D.

  12. Gaze Interaction
• Gaze input
  ◦ Primary method of interaction
  ◦ Using focal point and/or as direct input
  ◦ Blink, timed focus, other novel methods
• Gaze-Aided Interaction
  ◦ Using focal point to predict and augment interaction
  ◦ Typical controller
  ◦ Hand Tracking
References: DualGaze: Addressing the Midas Touch Problem in Gaze Mediated VR Interaction (2018), Mohan et al.; Gaze + pinch interaction in virtual reality (2017), Pfeuffer et al.
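The "timed focus" (dwell) option listed above can be sketched briefly. This is a minimal illustrative sketch in Python, not the toolset's actual implementation, and the class and parameter names are hypothetical; it shows the general idea of committing a selection only after the gaze has rested on a target for a set time, one common way to avoid the Midas-touch problem the DualGaze paper addresses.

```python
class GazeDwellSelector:
    """Timed-focus ("dwell") gaze selection.

    A target is committed only after the gaze has rested on it for
    `dwell_time` seconds, so objects the eye merely passes over are
    not selected (the Midas-touch problem).
    """

    def __init__(self, dwell_time=0.5):
        self.dwell_time = dwell_time
        self._target = None    # target currently being gazed at
        self._elapsed = 0.0    # how long the gaze has rested on it

    def update(self, gazed_target, dt):
        """Call once per frame with the currently gazed-at target
        (or None) and the frame time in seconds. Returns the target
        once the dwell threshold is crossed, otherwise None."""
        if gazed_target != self._target:
            # Gaze moved to a new target (or away): restart the timer.
            self._target = gazed_target
            self._elapsed = 0.0
            return None
        if gazed_target is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_time:
            self._elapsed = 0.0  # re-arm so the selection fires once
            return gazed_target
        return None
```

A blink-to-confirm or gaze-plus-pinch scheme would replace the timer with an explicit confirmation event, trading speed for fewer accidental selections.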

  13. Gaze-Aided Interaction Setup
[Diagram: left/right hand rays and eye ray, each with its own ray hit]
Yellow and green lines indicate the object being made 'selectable' for each hand. The gaze influence area is represented by a mesh sphere around the eye hit point.

  14. Influencing Selectable Targets
[Diagram: left ray hit altered, right ray hit unaltered]
Gaze influence comes into effect when the eye ray hits a selectable target. If a hand ray hit falls within the gaze influence area, that hand's 'selectable target' is altered to the eye target.

  15. Influencing Selectable Targets
[Diagram: gaze influence area increased, right ray hit altered to eye hit]
With the gaze influence area increased, the right hand's ray hit now also falls within it, so the right hand's 'selectable target' is altered to the eye target as well.

  16. Adjust Influence Area for Distance
[Diagram: hand rays, eye ray and gaze influence area]
The gaze influence area is adjusted depending on distance from the player.

  17. Adjust Influence Area for Distance
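The mechanism on slides 13–17 amounts to: snap a hand ray's selectable target to the eye target whenever the hand's hit point falls inside an influence sphere centred on the eye hit point, with the sphere's radius growing with distance from the player. A minimal sketch, assuming simple 3D points; the function names and radius constants are hypothetical, not values from the toolset:

```python
import math

def gaze_influence_radius(eye_hit, player_pos, base_radius=0.15, per_metre=0.05):
    """Distance-scaled influence radius: the sphere around the eye hit
    point grows the further the hit is from the player, so distant
    targets stay easy to snap to. Constants are illustrative."""
    return base_radius + per_metre * math.dist(player_pos, eye_hit)

def effective_target(hand_hit, hand_target, eye_hit, eye_target, radius):
    """If the hand ray's hit point falls inside the gaze influence
    sphere around the eye hit point, the hand's selectable target is
    altered to the eye target; otherwise it is left unaltered."""
    if eye_target is not None and math.dist(hand_hit, eye_hit) <= radius:
        return eye_target
    return hand_target
```

For example, with the eye resting on a valve 5 m away, a hand ray that hits a nearby pipe within the influence sphere would be redirected to the valve, while a hand ray hitting well outside the sphere keeps its own target.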

  18. Experimental Conditions
• Hardware and Modalities
  ◦ PC vs VR
  ◦ Normal Interaction vs Gaze-Aided Interaction
  ◦ Hand Tracking vs Controller
• Task
  ◦ Collocated or Remote
  ◦ Sequential or Free-form
  ◦ Locating Actor (Different Floors, Multiple Actors)
• Selection Methods
  ◦ Direct Selection
  ◦ UI Item

  19. Experimental Task Design
• Checklist of Directives – Instruct students to…
  ◦ Move to location
  ◦ Interact with objects
  ◦ Look at point of interest
• Actors
  ◦ Simple AI
  ◦ Mouse + Keyboard
  ◦ VR
  ◦ Mixture of all?

  20. Working Toolkit [To be finished by final submission]

  21. How it Benefits Instructors
• Enhanced instructor-student interaction
• Designed to facilitate remote training and assessment
• Comprehensive presentation of multiple student stats
• Dynamic and customisable virtual workspace
• Naturalistic and simple navigation of complex scaled virtual spaces
  ◦ Useful for interior environments (i.e. vessels)
• Provides multiple means of communication with students
  ◦ Individual and Group Instructions
  ◦ VOIP
  ◦ Waypointing and Signposting

  22. Thank You! Any Questions?
Dr. Hashim Yaqub
Hashim.Yaqub@bmtglobal.com
01225473484
