
Novel Learning Algorithms for Real-Time Brain-Computer Interfaces and Neuroengineering

A brain-computer interface (BCI) is a human-machine interfacing system that processes users' brain activity and translates it into commands for controlling external assistive devices. BCI thus offers a promising route to connecting the human brain directly to neuroprostheses, en route to a bidirectional, closed-loop interface between amputees and their robotic prostheses. However, a BCI system suited to neuroprosthetic control and human-robot interaction must reliably classify complex movement intentions while users interact with their assistive devices in real time, and, more importantly, must decode how the human brain perceives different external sensory stimuli, ranging from gentle touch to very intense sensations. In this research, we demonstrate that one way to approach this problem is to exploit electroencephalography (EEG) signals alongside other biosignals, such as electrooculography (EOG) and electromyography (EMG). As will be shown, BCI is an intriguing technology for feed-forward control as well as for sensory substitution, and two main control methods were investigated, one on each side of this equation.

From a feed-forward control standpoint, the following approaches were pursued. First, EEG signals alone were used to decode two motor imagery (MI) movements for controlling a robotic arm. We validated several EEG learning models, ranging from deep learning to spiking neural networks implemented on dedicated neuromorphic hardware (SpiNNaker); overall, the validated models outperformed state-of-the-art machine-learning techniques. The successful decoding of these two MI movements spurred the exploration of techniques for decoding more complex arm movement intentions and led to a second, follow-up study, in which EOG+EMG and EEG+EMG were used to accurately classify five and six reach-to-grasp movements, respectively.

On the sensory substitution side, vibrotactile stimulation was investigated: four human subjects controlled four gestures of a prosthetic hand in real time while simultaneously differentiating the stiffness of three grasped objects. Two follow-up studies then used EEG to quantify the brain's perception of electrical and thermal sensory stimulation, and we report novel findings on spatio-temporal brain patterns evoked by different stimuli. In parallel, we investigated the use of such a decoding system to develop a real-time withdrawal system in embodied prostheses, as well as its broader application to real-time human-robot interaction and the development of autonomous robotic systems. Ultimately, this research work envisions contributing to accelerating the development of the next generation of sensory-enabled, intelligent neuroprosthetic devices.
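To make the MI-decoding step concrete, the sketch below illustrates one conventional baseline for binary motor-imagery classification; it is not the authors' model. It assumes epochs of multichannel EEG with two class labels, extracts log band-power features in the mu/beta band, and classifies them with linear discriminant analysis. The sampling rate, array shapes, and the synthetic stand-in data are all assumptions made for illustration.

```python
# Minimal sketch (not the authors' pipeline): binary motor-imagery
# decoding from EEG epochs via band-power features and LDA.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250  # assumed EEG sampling rate in Hz

def bandpower_features(epochs, fs=FS, band=(8, 30)):
    """Log of mean mu/beta-band power per channel for each trial.

    epochs: array of shape (n_trials, n_channels, n_samples).
    """
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[..., mask].mean(axis=-1))  # (n_trials, n_channels)

# Synthetic stand-in data, only so the sketch runs end to end.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 8, 2 * FS))  # 100 trials, 8 channels, 2 s
labels = rng.integers(0, 2, size=100)           # two MI classes

X = bandpower_features(epochs)
scores = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.2f}")
```

On real MI data, the band-power features would capture event-related desynchronization over sensorimotor channels; the deep and spiking models discussed above replace this hand-crafted feature stage with learned representations.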
