This objective will be achieved by developing a set of solutions for innovative XR interaction. We will develop solutions for gaze-based and gesture-based interaction built on computer vision and wearable sensors, as well as solutions for multimodal, multi-user interaction in XR through haptic interfaces. To provide a dedicated communication pathway for people with different abilities, we will develop bidirectional body-machine interfaces (BBMIs) able to decode motor and communication information from all available interfaces. The BBMIs will also deliver sensory feedback to the user about their movement and their interaction with the avatar in virtual reality.