Extended reality for people with serious mobility and verbal disabilities
Focus Area(s): People with serious disabilities
Some people with motor disabilities, or people recovering from a stroke, have great difficulty communicating with others and even expressing their vital needs. The project takes on the challenge of finding a dedicated communication pathway for these people, introducing the possibility of interacting through specific social cues and transforming them into clear communication or actions. The proposal builds on residual abilities, giving them communicative meaning, supported where needed by avatars in a virtual environment. The objective is to realise low-cost, non-invasive tools based, for instance, on existing biofeedback, facial expressions and other inputs.
The challenge will be to create a solution allowing the design of person-specific extended reality interaction at home, in the workplace or at school, and to develop novel multi-user virtual communication and collaboration solutions that provide coherent multisensory experiences and optimally convey relevant social cues. Successful implementation of the pilot for this scenario will allow people with communication and motor disabilities to interact with friends and relatives, meeting realistically in an extended (physical + virtual) environment. The person with communication and motor disabilities will be represented by an avatar that interacts with other people in a physically augmented environment. Simultaneously, that person will have the illusion of also being in the same physical environment as the others, and will be able to interact through the new interfaces offered by SUN. SUN will, in fact, develop a new generation of non-invasive bidirectional body-machine interfaces (BBMIs), enabling smooth and highly effective interaction with virtual reality and avatars for people with different types of sensory-motor disabilities.
The BBMI will decode motor information using arrays of printed electrodes to record muscular activity, together with inertial sensors. Sensor placement will be customised to the specific motor abilities of each subject: for example, shoulder and elbow movements could be exploited, and muscular activity could even be recorded from the auricular muscles. These sensors will record information during different upper-limb and hand movements. A selection procedure will identify the signals most informative for each task, and a dedicated decoding algorithm will then be implemented. This approach previously yielded a simplified yet very effective scheme for controlling flying drones, and is likely to provide interesting results here as well. A machine learning approach will be implemented to decode the different tasks, relying on human-machine interfaces and, more generally, on decoding information from electrophysiological and biomechanical signals. SUN will also provide sensory feedback to the user about movement and interaction with the avatar in the virtual environment, using transcutaneous electrical stimulation or small vibration actuators.
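The decoding pipeline described above, recording multi-channel muscular activity, extracting informative features per channel, and classifying the intended movement, can be sketched as follows. This is a minimal illustration under assumed parameters: the channel count, window length, root-mean-square features, nearest-centroid classifier and gesture labels ("shoulder", "elbow") are all hypothetical simplifications, not the SUN project's actual design.

```python
import numpy as np

FS = 1000      # assumed sampling rate (Hz)
WINDOW = 200   # assumed 200 ms analysis window

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for one (samples, channels) EMG window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def fit_centroids(windows, labels):
    """Training step: mean feature vector per movement class."""
    feats = np.array([rms_features(w) for w in windows])
    classes = sorted(set(labels))
    return {c: feats[[l == c for l in labels]].mean(axis=0) for c in classes}

def decode(window, centroids):
    """Assign the movement whose class centroid is closest to the window's features."""
    f = rms_features(window)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

# Synthetic demo data: "shoulder" activity dominates channel 0, "elbow" channel 1.
rng = np.random.default_rng(0)

def synth(active_channel):
    w = rng.normal(0, 0.05, (WINDOW, 2))           # baseline noise on both channels
    w[:, active_channel] += rng.normal(0, 1.0, WINDOW)  # strong activity on one channel
    return w

train = [synth(0) for _ in range(20)] + [synth(1) for _ in range(20)]
labels = ["shoulder"] * 20 + ["elbow"] * 20
centroids = fit_centroids(train, labels)
print(decode(synth(0), centroids))  # classifies the window as "shoulder"
```

In practice the per-subject customisation mentioned above would correspond to choosing which channels and features enter the classifier, and a richer model would replace the nearest-centroid step.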
● Real Physical Haptic Perception Interfaces
● Non-invasive bidirectional body-machine interfaces
● Wearable sensors in XR
● Low latency gaze and gesture-based interaction in XR
● 3D acquisition with physical properties
● AI-assisted 3D acquisition of unknown environments with semantic priors
● AI-based convincing XR presentation in limited resources setups
● Hyper-realistic avatar
Results of this scenario will provide a radically new perspective on human capital improvement in R&I, allowing excluded people to communicate their ideas while providing substantial benefits for all users of the technology. The project is expected to deliver usable prototypes that will open the way to rapid industrialisation, improving people's quality of life, work and education.