Towards BCI-based Interfaces for Augmented Reality: Feasibility, Design and Evaluation
Document type:
Journal article
Title:
Towards BCI-based Interfaces for Augmented Reality: Feasibility, Design and Evaluation
Author(s):
Si-Mohammed, Hakim [Corresponding author]
3D interaction with virtual environments using body and mind [Hybrid]
Petit, Jimmy [Author]
École normale supérieure - Rennes [ENS Rennes]
Jeunet, Camille [Author]
3D interaction with virtual environments using body and mind [Hybrid]
Chair in Brain-Machine Interface [Geneva] [CNBI]
Argelaguet Sanz, Ferran [Author]
3D interaction with virtual environments using body and mind [Hybrid]
Spindler, Fabien [Author]
Sensor-based and interactive robotics [RAINBOW]
Evain, Andéol [Author]
3D interaction with virtual environments using body and mind [Hybrid]
Roussel, Nicolas [Author]
Inria Bordeaux - Sud-Ouest
Casiez, Géry [Author]
Technology and knowledge for interaction [LOKI]
Lécuyer, Anatole [Author]
3D interaction with virtual environments using body and mind [Hybrid]
Journal title:
IEEE Transactions on Visualization and Computer Graphics
Pages:
1608-1621
Publisher:
Institute of Electrical and Electronics Engineers
Publication date:
2018-10
ISSN:
1077-2626
Keywords (English):
Optical see-through
Robot control
Design space
SSVEP
User interface
Augmented reality
Brain-computer interface
Human-Computer Interaction
HAL discipline(s):
Computer Science [cs]/Image Synthesis and Virtual Reality [cs.GR]
Computer Science [cs]/Human-Computer Interaction [cs.HC]
Abstract (English):
Brain-Computer Interfaces (BCIs) enable users to interact with computers without any dedicated movement, bringing new hands-free interaction paradigms. In this paper we study the combination of BCI and Augmented Reality (AR). We first tested the feasibility of using BCI in AR settings based on Optical See-Through Head-Mounted Displays (OST-HMDs). Experimental results showed that a BCI and an OST-HMD (an EEG headset and a HoloLens in our case) are compatible, and that small head movements can be tolerated when using the BCI. Second, we introduced a design space for BCI-based command display strategies in AR that exploit a well-known brain pattern called the Steady-State Visually Evoked Potential (SSVEP). Our design space relies on five dimensions concerning the visual layout of the BCI menu, namely: orientation, frame-of-reference, anchorage, size and explicitness. We implemented various BCI-based display strategies and tested them within the context of mobile robot control in AR. Our findings were finally integrated within an operational prototype in which a real mobile robot is controlled in AR using a BCI and a HoloLens headset. Taken together, our results (four user studies) and our methodology could pave the way to future interaction schemes in Augmented Reality exploiting 3D user interfaces based on brain activity and BCIs.
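The abstract states that AR menu items are selected through SSVEP, i.e., by detecting which flickering target the user attends to from the EEG spectrum. As a purely illustrative aid, and not the authors' actual pipeline, the sketch below shows a minimal power-spectrum-based SSVEP detector in Python; the function name, candidate frequencies, sampling rate, and single-channel setup are all assumptions, and a real system such as the one described in the paper would likely use a more robust classifier and multiple electrodes.

```python
# Minimal, illustrative SSVEP target-detection sketch (assumed setup, not the paper's method).
# Assumes a single occipital EEG channel sampled at `fs` Hz and one flicker
# frequency per command item in the AR menu.
import numpy as np
from scipy.signal import welch

def detect_ssvep_target(eeg, fs, stim_freqs, band=0.5):
    """Return the index of the stimulation frequency with the most spectral power.

    eeg        : 1-D array of EEG samples
    fs         : sampling rate in Hz
    stim_freqs : candidate flicker frequencies in Hz (one per menu command)
    band       : half-width (Hz) of the band summed around each frequency
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * fs))
    scores = []
    for f0 in stim_freqs:
        # Sum power around the fundamental and its first harmonic.
        mask = ((np.abs(freqs - f0) <= band) |
                (np.abs(freqs - 2 * f0) <= band))
        scores.append(psd[mask].sum())
    return int(np.argmax(scores))

if __name__ == "__main__":
    fs = 250                      # assumed sampling rate (Hz)
    t = np.arange(0, 4, 1 / fs)   # 4-second epoch
    # Synthetic EEG: a 12 Hz SSVEP response buried in noise.
    eeg = np.sin(2 * np.pi * 12 * t) + np.random.randn(t.size)
    print(detect_ssvep_target(eeg, fs, stim_freqs=[10, 12, 15]))  # expected: 1
```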
Language:
English
Popular science:
No
Files:
- https://hal.inria.fr/hal-01947344/document
- Open access
- Access the document
- Manuscript.pdf
- Open access
- Access the document