
AWE 2019 - Using AR and VR for Brain Synchronization

These slides are from the presentation given by Mark Billinghurst at the AWE 2019 conference in Santa Clara on May 30th, 2019. The presentation covered using AR and VR for brain synchronization.



  1. 1. USING AR AND VR FOR BRAIN SYNCHRONIZATION Mark Billinghurst May 30th 2019 AWE 2019
  2. 2. HMDs with Integrated EEG ● A number of HMDs are available with integrated EEG ● Neurable – 6 electrodes over the visual cortex ● Looxid – 6 electrodes over the frontal lobes [Images: Neurable, Looxid]
  3. 3. AR/VR and Brain Activity • Brain Computer Interfaces (BCI) • Using brain activity to control AR/VR experiences • Emotion/Cognitive Monitoring • Measuring emotion, stress, cognitive load • Enhancing Collaboration • Brain synchronization
  4. 4. AR/VR and BCI • Using brain activity to interact in AR/VR • Control by thought, user trained response • Hands free interaction
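As a rough illustration of the "using brain activity to interact" idea on this slide, the following is a minimal sketch (not from the presentation) of mapping EEG epochs to discrete hands-free commands with band-power features and a linear classifier. The sampling rate, channel count, command labels, and synthetic data are all assumptions.

```python
# Minimal, illustrative BCI-style classifier: alpha/beta band-power features
# per EEG epoch, fed to a linear discriminant. Data here is synthetic.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed sampling rate (Hz)

def band_power(epoch, fs, lo, hi):
    """Mean power in [lo, hi] Hz for each channel of one epoch (channels x samples)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[:, band].mean(axis=1)

def features(epochs):
    """Stack alpha (8-12 Hz) and beta (13-30 Hz) power into one feature vector per epoch."""
    return np.array([np.concatenate([band_power(e, FS, 8, 12),
                                     band_power(e, FS, 13, 30)]) for e in epochs])

# Synthetic training data: 40 epochs, 6 channels, 2 s each, two imagined commands.
rng = np.random.default_rng(0)
train_epochs = rng.standard_normal((40, 6, 2 * FS))
train_labels = np.repeat([0, 1], 20)   # 0 = "select", 1 = "rest" (hypothetical labels)

clf = LinearDiscriminantAnalysis().fit(features(train_epochs), train_labels)

new_epoch = rng.standard_normal((1, 6, 2 * FS))
print("Predicted command:", clf.predict(features(new_epoch))[0])
```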
  5. 5. https://www.youtube.com/watch?v=qFildKVi5dY
  6. 6. Cognitive Monitoring
  7. 7. Virtual Reality Training • Bluedrop Training and Simulation • Helicopter hoist training
  8. 8. Adaptive Training • Training content dynamically changes depending on the learning state of the user • Task performance • Test performance • Subjective measures • None of these measure the user's mental activity or cognitive load during the task
  9. 9. Background ● VR training tools are effective and skills are transferable to the real world ● Earlier use of EEG in VR studies ○ Interaction, measurement; the VR environment did not adapt Zhang et al. [2017] - EEG for cognitive load - VR driving simulator - Autistic children Friedman et al. [2007] - EEG for locomotion - CAVE system
  10. 10. Motivation Making VR training systems adapt in real time to the trainee's cognitive load to induce the best level of performance gain. We are the first to explore real-time adaptive VR training systems using workload calculated from EEG.
  11. 11. Our System • EEG electrodes: Oz, O1, O2, Pz, P3, and P4
  12. 12. Adaptation/Calibration ● Establish baseline (alpha power) ● Two sets of n(1, 2)-back tasks to calibrate the user's own task difficulty parameters ● Measured alpha activity (task load) and calculated the mean of the two tasks ● Mean → Baseline ● In the experimental task ○ load > baseline → decrease level ○ load < baseline → increase level
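The calibration and adaptation rule described on this slide can be sketched as follows. This is only an illustration, not the authors' implementation: the sampling rate, filter settings, and helper names are assumptions; the electrode list comes from slide 11.

```python
# Illustrative sketch of the slide-12 calibration/adaptation rule.
import numpy as np
from scipy.signal import welch

FS = 250                                          # assumed sampling rate (Hz)
CHANNELS = ["Oz", "O1", "O2", "Pz", "P3", "P4"]   # electrodes from slide 11

def alpha_power(window):
    """Mean 8-12 Hz power over all channels for one EEG window (channels x samples)."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    alpha = (freqs >= 8) & (freqs <= 12)
    return psd[:, alpha].mean()

def calibrate(nback1_eeg, nback2_eeg):
    """Baseline = mean alpha power over the two n-back calibration tasks."""
    return np.mean([alpha_power(nback1_eeg), alpha_power(nback2_eeg)])

def adapt_level(level, current_eeg, baseline, min_level=0, max_level=20):
    """Slide-12 rule: load above baseline -> decrease level, below -> increase."""
    load = alpha_power(current_eeg)
    if load > baseline:
        return max(min_level, level - 1)
    return min(max_level, level + 1)
```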
  13. 13. Experimental Task • Target selection • Difficulty varies by number of objects, colors, shapes, and movement • Increasing levels (0–20)
  14. 14. Experimental Task [Screenshots: low-difficulty and high-difficulty conditions]
  15. 15. User Study ● Participants ● 14 subjects (6 women) ● 20 – 41 years old, 28 years average ● No experience with VR ● Measures ○ Response time ○ Brain activity (alpha power) • 5 minutes fixed trial time
  16. 16. Adaptation
  17. 17. Results – Response Time [Plot: response time (sec.) across increasing levels] • No difference between the easiest and hardest levels
  18. 18. Results – Time-Frequency Representation • Task load • Significant alpha synchronisation in the hardest difficulty levels of the task compared to the easiest difficulty levels [Panels: Easiest, Hardest, Difference]
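For readers unfamiliar with the analysis behind this slide, the sketch below shows one common way to track alpha-band power over time with a short-time Fourier transform. The slides do not specify the exact time-frequency method used, so this is purely illustrative; sampling rate and data are placeholders.

```python
# Illustrative time-frequency view of alpha (8-12 Hz) power for one EEG channel.
import numpy as np
from scipy.signal import stft

FS = 250                                                   # assumed sampling rate (Hz)
eeg = np.random.default_rng(1).standard_normal(60 * FS)    # 60 s of one channel (placeholder)

freqs, times, Z = stft(eeg, fs=FS, nperseg=FS, noverlap=FS // 2)
power = np.abs(Z) ** 2                        # spectrogram: frequency x time
alpha = (freqs >= 8) & (freqs <= 12)
alpha_over_time = power[alpha].mean(axis=0)   # mean alpha power per time bin

print(alpha_over_time.shape)
```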
  19. 19. Key Finding Similar reaction time but increased brain activity showing increased cognitive effort at higher levels to sustain performance
  20. 20. Conclusions • Adaptive VR training can increase the user’s cognitive load without affecting task performance • First demo of the use of real-time EEG signals to adapt the complexity of the training stimuli in a target acquisition context • Future Work • Significantly increase task complexity • Can predict user performance based on the cognitive capacity • Using AR display • See real world and more distractors
  21. 21. Enhancing Collaboration
  22. 22. CAN MY THOUGHTS INFLUENCE YOURS?
  23. 23. Hyperscanning • The simultaneous acquisition or recording of neural activity from two or more individuals • Generally concerns the study of how two or more individuals interact in a co-operative or competitive scenario
  24. 24. Hyperscanning Experiment Setup
  25. 25. Brain Synchronization • Phase extracted from the signals • [Figure: time course of the normalized EEG signal, filtered in the alpha-mu frequency band, for channel P8 of both subjects, shown alongside behaviour] * Dumas, G., Nadel, J., Soussignan, R., Martinerie, J., & Garnero, L. (2010). Inter-brain synchronization during social interaction. PLoS ONE, 5(8), e12166.
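A common way to quantify the phase relationship shown here is the phase-locking value (PLV) between the two participants' channels in the alpha-mu band. The sketch below is a standard hyperscanning-style measure rather than the exact computation behind the figure; the sampling rate, filter design, and placeholder data are assumptions.

```python
# Sketch of an inter-brain phase-synchrony measure (phase-locking value) between
# the P8 channels of two participants in the alpha-mu band.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 250  # assumed sampling rate (Hz)

def bandpass(x, lo, hi, fs=FS, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(sig_a, sig_b, lo=8.0, hi=13.0):
    """Phase-locking value between two signals in the [lo, hi] Hz band."""
    phase_a = np.angle(hilbert(bandpass(sig_a, lo, hi)))
    phase_b = np.angle(hilbert(bandpass(sig_b, lo, hi)))
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))

# Example with placeholder data for the two participants' P8 channels:
rng = np.random.default_rng(2)
p8_subject1 = rng.standard_normal(30 * FS)
p8_subject2 = rng.standard_normal(30 * FS)
print("PLV:", plv(p8_subject1, p8_subject2))
```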
  26. 26. Collaborative https://www.youtube.com/watch?v=CJIZkZ2ol5A In Opposition
  27. 27. Previous Research • Active research since 2002 • > 140 research papers • Excellent review • Wang, M. Y., Luan, P., Zhang, J., Xiang, Y. T., Niu, H., & Yuan, Z. (2018). Concurrent mapping of brain activation from multiple subjects during social interaction by hyperscanning: a mini-review. Quantitative imaging in medicine and surgery, 8(8), 819. • No hyperscanning research in AR/VR
  28. 28. Previous Studies (Tasks → Examples) • Imitation tasks: finger tracing, meaningless hand movements • Coordination/joint tasks: rhythmic finger movements, unconsciously synchronised footsteps • Eye contact/gaze tasks: social interaction • Economic games/exchanges: Prisoner's dilemma, Trust Game • Cooperation and competition tasks: visual search task; cooperation between lovers vs. strangers, children and parents vs. strangers, same or different gender • Interactions in natural scenarios: daily life conversations (face to face: higher synchrony)
  29. 29. Example Results • Evidence suggests Hyperscanning is able to measure inter-brain synchrony Tang, H., X. Mai, S. Wang, C. Zhu, F. Krueger and C. Liu (2016). "Interpersonal Brain Synchronization In The Right Temporo-Parietal Junction During Face-To-Face Economic Exchange." Social Cognitive and Affective Neuroscience.
  30. 30. Benefits of Brain Synchronization • Several potential benefits • Improved engagement in learning • Improved feeling of “Flow” • Better collaboration performance • Increased trust • Greater Social Presence • Better group social dynamics
  31. 31. Study 1 – Finger Tracking • Repeating classic study • Users track opposite fingers * Kyongsik Yun, Katsumi Watanabe, and Shinsuke Shimojo. “Interpersonal body and neural synchronization as a marker of implicit social interaction”. In: Sci Rep 2 (2012), pp. 959– 959.
  32. 32. Pre-training (Finger Pointing) Session Start
  33. 33. Pre-training (Finger Pointing) Session End
  34. 34. Post-Training (Finger Pointing) Session End
  35. 35. VR Experiment
  36. 36. Hypothesis • VR can reproduce Face to Face Brain Synchronization • AR/VR cues could be used to enhance Synchronization • Viewpoint sharing • Using shared virtual cues
  37. 37. VR Copy of Real World
  38. 38. Viewpoint Sharing
  39. 39. Brain Synchronization Simulation • Using a computational brain-inspired Spiking Neural Network architecture (NeuCube) • Modelling brain synchronization from finger-tracking EEG data recorded over time • [Pipeline diagram: (a) spatio-temporal input data stream (EEG recording), (b) data encoding (converting EEG signals into sequences of spikes), (c) mapping, learning, pattern visualisation and classification in a 3D brain-inspired SNN system, with output class classification (Class A … Class N) and knowledge extraction]
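The "converting EEG signals into sequences of spikes" step in this pipeline can be illustrated with a simple threshold (delta) encoder, one widely used way to feed continuous signals into a spiking neural network. NeuCube's own encoder may differ; the threshold value and data below are placeholders.

```python
# Illustrative delta/threshold encoding of EEG channels into spike trains,
# the kind of "data encoding" step shown in the pipeline.
import numpy as np

def threshold_encode(signal, threshold):
    """Emit +1/-1 spikes when the signal rises/falls by more than `threshold`
    since the last spike, else 0 (no spike)."""
    spikes = np.zeros(len(signal), dtype=int)
    last = signal[0]
    for i, value in enumerate(signal[1:], start=1):
        if value - last > threshold:
            spikes[i], last = 1, value
        elif last - value > threshold:
            spikes[i], last = -1, value
    return spikes

# One spike train per EEG channel (placeholder data, threshold = 0.5 * std):
rng = np.random.default_rng(3)
eeg = rng.standard_normal((6, 1000))                      # channels x samples
spike_trains = np.array([threshold_encode(ch, 0.5 * ch.std()) for ch in eeg])
print(spike_trains.shape, "spike rate:", np.abs(spike_trains).mean())
```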
  40. 40. Before Tracking [NeuCube model visualisations for the left-finger participant (Participant 1) and the right-finger participant: (a) 3D visualisation, (b) 2D visualisation, (c) input interactions]
  41. 41. After Tracking [NeuCube model visualisations for the left-finger participant (Participant 1) and the right-finger participant (Participant 2): (a) 3D visualisation, (b) 2D visualisation, (c) input interactions]
  42. 42. Classification accuracy of 10 EEG samples (5 samples per class) using the leave-one-out cross-validation method. Experiment A (total accuracy 80%): Class 1 (AL-P1) classified as Class 1: 5, as Class 2: 0 (per-class accuracy 100%); Class 2 (AR-P2) classified as Class 1: 2, as Class 2: 3 (per-class accuracy 60%). Experiment C (total accuracy 60%): Class 1 (CL-P1) classified as Class 1: 3, as Class 2: 2 (60%); Class 2 (CR-P2) classified as Class 1: 2, as Class 2: 3 (60%). The classification accuracy reflects the similarity between the two models: the higher the classification accuracy, the more different the models are.
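Since this slide reports leave-one-out cross-validation over 10 samples (5 per class), the evaluation loop can be sketched as below. The classifier and feature vectors are simple placeholders, not the NeuCube SNN models actually used in the study.

```python
# Sketch of the leave-one-out evaluation: 10 EEG samples, 5 per class,
# each sample held out once. Placeholder features and classifier.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
X = rng.standard_normal((10, 64))   # 10 samples with placeholder feature vectors
y = np.repeat([0, 1], 5)            # class 1 (left finger) vs. class 2 (right finger)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = KNeighborsClassifier(n_neighbors=1).fit(X[train_idx], y[train_idx])
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])

print("LOOCV accuracy:", correct / len(X))
```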
  43. 43. Conclusion
  44. 44. Conclusions • Opportunities for EEG use in AR/VR • Brain Computer Interaction • Cognitive Monitoring • Adaptive Training • Enhancing Collaboration • Brain Synchronization • Many directions for future research
  45. 45. Simulated Synchronization • Simulated brain synchronisation with a virtual character / avatar • Human interaction with a virtual agent, e.g. BabyX • Measure human EEG and simulate the virtual character's EEG
  46. 46. Technology Trends • Advanced displays • Real time space capture • Natural gesture interaction • Robust eye-tracking • Emotion sensing/sharing Empathic Tele-Existence
  47. 47. Empathic Tele-Existence • Move from Observer to Participant • Explicit to Implicit communication • Experiential collaboration – doing together
  48. 48. www.empathiccomputing.org @marknb00 mark.billinghurst@auckland.ac.nz
