Extended reality for people with serious mobility and verbal communication disabilities

Focus Area(s): People with serious disabilities

Scenario description:

Pilot 3 of the SUN Project marks a significant advancement in the development of human-centered rehabilitation technologies, specifically designed for individuals with severe motor and verbal communication disabilities. Unlike traditional uses of eXtended Reality (XR) that focus on entertainment or training, this pilot emphasizes emotional connection, social presence, and motivation. The main objective was to demonstrate that, by leveraging intelligent sensing and immersive environments, people with profound physical impairments could regain agency in communicating, making choices, and interacting meaningfully with others.

At the heart of the SUN XR platform is a sophisticated, non-invasive body-machine interface. This interface utilizes electromyography (EMG) to detect even minimal muscle contractions, such as those in the forearm, often the only voluntary movement remaining in patients with tetraplegia or severe stroke. These muscle signals are decoded in real time and translated into virtual actions, like moving an avatar or grasping objects, enabling intuitive and expressive interaction without the need for traditional input devices.
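The details of the SUN decoder are not public; purely as an illustration of this kind of pipeline, the Python sketch below shows a common approach: the raw EMG is rectified, smoothed into an amplitude envelope, and compared against a calibrated threshold to trigger a virtual action. The sampling rate, threshold, and triggered action are illustrative assumptions, not the project's actual parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed EMG sampling rate, in Hz


def emg_envelope(raw: np.ndarray, cutoff_hz: float = 5.0) -> np.ndarray:
    """Rectify the raw EMG and extract a smooth amplitude envelope."""
    rectified = np.abs(raw - np.mean(raw))           # remove offset, full-wave rectify
    b, a = butter(2, cutoff_hz / (FS / 2), btype="low")
    return filtfilt(b, a, rectified)                 # low-pass filter -> envelope


def detect_contraction(envelope: np.ndarray, threshold: float) -> bool:
    """Report a voluntary contraction when the envelope exceeds a calibrated threshold."""
    return bool(np.max(envelope) > threshold)


# Example: a simulated burst of forearm activity is mapped to a virtual action.
rest = np.zeros(100)
burst = 0.5 * np.random.randn(100)                   # stand-in for a brief contraction
window = np.concatenate([rest, burst])
if detect_contraction(emg_envelope(window), threshold=0.05):
    print("trigger avatar action: grasp object")
```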

To enhance the sense of immersion and embodiment, the system provides multisensory feedback. Users receive tactile cues through a vibrotactile armband, signalling when gestures are recognized or actions are completed. Additionally, thermal feedback allows users to feel temperature changes, such as the coolness of a glass or the warmth of a virtual handshake. This combination of visual, tactile, and thermal input makes virtual interactions feel more real and engaging.
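A minimal sketch of how such event-to-cue mapping could look is given below; the event names, cue values, and device calls (vibrate, set_offset) are illustrative assumptions, not the SUN hardware API.

```python
from dataclasses import dataclass


@dataclass
class FeedbackCue:
    vibration_intensity: float  # 0.0-1.0, sent to the vibrotactile armband
    temperature_delta_c: float  # offset applied by the thermal module, in °C
    duration_s: float


# Hypothetical mapping from recognized virtual events to multisensory cues.
EVENT_CUES = {
    "gesture_recognized": FeedbackCue(0.4, 0.0, 0.2),   # short buzz as confirmation
    "grasp_cold_glass":   FeedbackCue(0.6, -5.0, 1.0),  # vibration plus cooling
    "virtual_handshake":  FeedbackCue(0.5, +3.0, 1.5),  # vibration plus gentle warmth
}


def send_feedback(event: str, haptic_device, thermal_device) -> None:
    """Forward a cue to placeholder device drivers (interfaces are assumptions)."""
    cue = EVENT_CUES.get(event)
    if cue is None:
        return
    haptic_device.vibrate(cue.vibration_intensity, cue.duration_s)
    thermal_device.set_offset(cue.temperature_delta_c, cue.duration_s)
```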

Pilot 3 applied this integrated technology to two main clinical scenarios. The first scenario, Immersive Interaction for Individuals with Limited Mobility, was aimed at patients with incomplete spinal cord injuries or post-stroke motor deficits. The virtual environment, built with Unity, allowed users to perform everyday activities, such as approaching objects, opening doors, or expressing basic needs, using EMG control. For non-verbal individuals, a communication module enabled the creation of simple sentences, providing a much-needed avenue for social participation and emotional expression.
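One common way to build such a single-switch communication module is a scanning interface, in which words are highlighted in turn and an EMG-triggered event selects the currently highlighted one. The sketch below illustrates that idea only; the vocabulary and scanning logic are assumptions, not the actual SUN module.

```python
# Illustrative vocabulary; the real module's content and flow are not public.
VOCABULARY = ["I", "want", "water", "help", "to rest", "thank you"]


def build_sentence(select_events, max_words: int = 4) -> str:
    """Assemble a sentence from scanning selections.

    `select_events` yields the index of the word highlighted at the moment
    the user produces the monitored muscle contraction (the "select" event).
    """
    words = []
    for highlighted_index in select_events:
        words.append(VOCABULARY[highlighted_index % len(VOCABULARY)])
        if len(words) >= max_words:
            break
    return " ".join(words)


# Example: three simulated selections spell out a simple request.
print(build_sentence(iter([0, 1, 2])))  # -> "I want water"
```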

The second scenario addressed clinical apathy, a significant barrier in neurorehabilitation. Apathy, commonly seen after stroke or brain injury, reduces motivation to engage in therapy. The SUN team adapted an effort-based decision-making task into a VR format, where patients could earn emotionally relevant rewards, such as interacting with a loved one’s avatar or accessing enjoyable media. This scenario is being further enhanced with non-invasive brain stimulation to target deep reward circuits, aiming to boost motivation and engagement.
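As a rough illustration of how one trial of such an effort-based task might be structured, the sketch below requires the EMG envelope (normalised to the patient's maximum) to stay above a chosen effort level for a set hold time before the reward is delivered; the parameters and reward labels are illustrative assumptions, not the project's protocol.

```python
from dataclasses import dataclass


@dataclass
class Trial:
    reward: str            # e.g. "loved one's avatar waves", "play favourite song"
    effort_level: float    # required EMG envelope, as a fraction of the patient's maximum
    hold_time_s: float     # how long the contraction must be sustained


def run_trial(trial: Trial, envelope_samples, sample_period_s: float = 0.01) -> bool:
    """Return True (reward delivered) if the normalised envelope stays above
    the required effort level for the full hold time."""
    held = 0.0
    for value in envelope_samples:
        held = held + sample_period_s if value >= trial.effort_level else 0.0
        if held >= trial.hold_time_s:
            return True
    return False


# Example: a low-effort trial with a meaningful reward, fed with simulated samples.
trial = Trial(reward="loved one's avatar waves", effort_level=0.3, hold_time_s=0.5)
samples = [0.4] * 60  # 0.6 s of sustained contraction at 40% of maximum
print("reward unlocked" if run_trial(trial, samples) else "try again")
```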

The SUN platform’s modular architecture includes an EMG decoding unit, haptic and thermal feedback modules, emotional-state monitoring sensors, and a central VR application that integrates all data streams in real time. Validation took place in two phases: technical validation with healthy volunteers at EPFL’s Campus Biotech in Geneva, and a clinical feasibility study at the Clinique Romande de Réadaptation in Sion, Switzerland. Participants found the system stable, intuitive, and highly immersive. Patients appreciated the comfort and motivational benefits, and clinicians noted its usability and potential to complement traditional therapy.
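The interfaces between these modules are not described in detail; the sketch below only illustrates, under assumed interfaces, how a central application could poll each module once per frame and merge the readings into a single scene update (handled by the Unity application in the real system).

```python
import time
from typing import Callable, Dict, Protocol


class Module(Protocol):
    """Minimal interface each sensing/feedback module is assumed to expose."""
    def read(self) -> dict: ...


def integration_loop(modules: Dict[str, Module],
                     apply_to_scene: Callable[[dict], None],
                     frame_hz: float = 60.0,
                     frames: int = 120) -> None:
    """Central loop: poll every module, merge the readings into one frame
    state, and hand it to the scene update."""
    frame_period = 1.0 / frame_hz
    for _ in range(frames):
        state = {name: module.read() for name, module in modules.items()}
        apply_to_scene(state)     # e.g. move avatar, trigger haptics, log emotional state
        time.sleep(frame_period)  # placeholder for frame pacing
```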

Additionally, the platform was tested for cybersecurity by simulating GPU-based malware attacks during XR sessions. The intrusion detection system successfully identified threats with no false alarms, demonstrating the feasibility of real-time security monitoring.
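The SUN intrusion detection system itself is not detailed here; purely to illustrate the general idea of real-time monitoring, the sketch below flags GPU-utilisation samples that deviate strongly from a baseline recorded during normal XR sessions. This simple z-score check is a stand-in, not the project's actual detector.

```python
import statistics


def detect_anomaly(baseline_util, current_util: float, z_threshold: float = 3.0) -> bool:
    """Flag a GPU-utilisation sample far outside the baseline distribution."""
    mean = statistics.fmean(baseline_util)
    stdev = statistics.pstdev(baseline_util) or 1e-9   # avoid division by zero
    return abs(current_util - mean) / stdev > z_threshold


# Example: a utilisation spike well outside the baseline is flagged.
baseline = [0.55, 0.60, 0.58, 0.57, 0.62, 0.59]
print(detect_anomaly(baseline, current_util=0.97))  # -> True
```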

On the SUN YouTube channel you can watch a video showing the validation performed at Versilia Hospital.

Technical challenges:

● Real Physical Haptic Perception Interfaces
● Non-invasive bidirectional body-machine interfaces
● Wearable sensors in XR
● Low latency gaze and gesture-based interaction in XR
● 3D acquisition with physical properties
● AI-assisted 3D acquisition of unknown environments with semantic priors
● AI-based convincing XR presentation in limited resources setups
● Hyper-realistic avatar

Impact:

Results of these scenarios will provide a radically new perspective on human capital improvement in R&I, allowing excluded people to communicate their ideas while providing substantial benefits for the people who will use this technology. The project is expected to deliver usable prototypes that will open the way for rapid industrialisation, addressing improvements in people’s quality of life, work, and education.
