During a clinical trial run by the BrainGate research consortium, which includes researchers from several U.S. institutions such as Brown University, Massachusetts General Hospital, and Stanford University, three patients with quadriplegia controlled a tablet using only their thoughts. This was made possible by a brain-computer interface (BCI), a system that translates a user's mental activity into messages or commands for an interactive application.
Specifically, this system is classified as an intracortical BCI: the electrodes that detect brain signals are implanted directly in the motor cortex.
Such systems are developing rapidly and finding more and more applications, from controlling a robotic prosthesis to letting an able-bodied person operate a supernumerary "third arm."
BrainGate's BCI uses small multielectrode arrays implanted in the motor cortex that detect brain signals associated with specific movements. Once acquired, processed, and decoded, these signals are sent to a Bluetooth interface that functions as a wireless mouse. This virtual mouse was then paired with an unmodified commercial tablet (a Google Nexus 9). Two of the three participants had amyotrophic lateral sclerosis (ALS); the third was paralyzed by a spinal cord injury. All three then controlled the tablet through a point-and-click system, using only their thoughts.
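To give a rough idea of the "decode, then move the cursor" step, here is a minimal sketch of a linear velocity decoder, a common approach in this field. This is purely illustrative: the channel count, weights, and firing rates are invented, and this is not BrainGate's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 96 recording channels (a common array size),
# decoded into a 2-D cursor velocity (vx, vy).
N_CHANNELS = 96

# Illustrative decoder weights; in a real system these come from calibration.
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))

def decode_velocity(firing_rates, W):
    """Linear decode: cursor velocity as a weighted sum of channel firing rates."""
    return W @ firing_rates

def update_cursor(position, firing_rates, W, dt=0.02):
    """Integrate the decoded velocity over one 20 ms bin to move the cursor."""
    return position + dt * decode_velocity(firing_rates, W)

# One simulated update with fake spike counts.
pos = np.zeros(2)
rates = rng.poisson(lam=20, size=N_CHANNELS).astype(float)
pos = update_cursor(pos, rates, W)
```

In a real system this loop runs continuously, so the decoded velocities accumulate into smooth cursor motion that the operating system sees as ordinary mouse input.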
What is ALS?
Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative motor neuron disease. It is characterized by muscle stiffness, muscle twitching, and gradually worsening weakness caused by muscle atrophy.
In an initial calibration phase, the cursor moved automatically to various targets while the participants imagined moving their hand as if they were controlling it. To implement the mouse "click," each participant instead imagined performing a distinct movement, such as squeezing a hand or flexing an arm.
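Calibration of this kind can be illustrated as a least-squares fit: while the cursor moves automatically, the system records neural activity alongside the cursor's known velocity, then fits decoder weights that map one to the other. The sketch below uses simulated data and invented dimensions, and stands in for whatever method the study actually used.

```python
import numpy as np

rng = np.random.default_rng(1)
N_CHANNELS, N_SAMPLES = 96, 500  # hypothetical channel count and time bins

# Simulated calibration data: firing rates X recorded while the cursor
# moved automatically with known velocity V at each time bin.
true_W = rng.normal(size=(2, N_CHANNELS))
X = rng.poisson(lam=10, size=(N_SAMPLES, N_CHANNELS)).astype(float)
V = X @ true_W.T + rng.normal(scale=0.1, size=(N_SAMPLES, 2))

# Fit the decoder by ordinary least squares: find W such that V ≈ X @ W.T
W_fit, *_ = np.linalg.lstsq(X, V, rcond=None)
W_fit = W_fit.T  # shape (2, N_CHANNELS), ready to decode new activity
```

Once fitted, the weights can decode velocities from fresh neural activity; the imagined "click" movement would be handled separately, typically as a classification problem rather than a regression.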
The study, published in PLOS ONE, reports that the participants performed up to 22 point-and-click selections per minute and typed up to 30 characters per minute, using standard email and chat interfaces.
During the experiment, the three participants were able to chat with one another, search for music, browse online shopping sites, and much more. One of them, a musician, played Beethoven's "Ode to Joy" on a digital piano.
“For years, the BrainGate collaboration has been working to develop the neuroscience and neuroengineering know-how to enable people who have lost motor abilities to control external devices just by thinking about the movement of their own arm or hand. In this study, we’ve harnessed that know-how to restore people’s ability to control the exact same everyday technologies they were using before the onset of their illnesses. It was wonderful to see the participants express themselves or just find a song they want to hear.”
So said Dr. Jaimie Henderson, a senior author of the paper and a neurosurgeon at Stanford University.
Beyond allowing patients with severe disabilities to return to everyday technologies, such systems have the potential to open new channels of communication between patients with severe neurological deficits and the healthcare professionals who assist them.