In a slow-mo game of mental Whack-a-Mole, two paralyzed people, using only their thoughts and an implanted brain sensor, manipulated a robotic arm to grasp foam balls. The achievement, reported in the May 17 Nature, is the first instance of a human brain-computer interface controlling such complex, three-dimensional motion. In a more practical test, one woman was able to grab a thermos of coffee with the robotic arm and bring it to her lips for a drink. While many challenges remain to bring this technology to people who need it, the time to that point will be measured in “years, not decades,” suggested study authors Leigh Hochberg and John Donoghue of Brown University and the Veterans Affairs Medical Center, both in Providence, Rhode Island.

The work is part of the ongoing BrainGate pilot trial by researchers at the Rhode Island institutions and Massachusetts General Hospital in Boston. Previously, the researchers employed the same implant to control a computer cursor that allowed paralyzed people to open e-mails or play a game (Hochberg et al., 2006). Another team, using a similar interface, has trained macaques to feed themselves with a mechanical arm (Velliste et al., 2008), but nothing that elaborate had been replicated in humans before now. Thus far, seven people have tested out the BrainGate implant, including two paralyzed by stroke, two by spinal injury, and three by amyotrophic lateral sclerosis (ALS). The current study focuses on two people: 66-year-old Robert and longtime volunteer Cathy Hutchinson, 58. Both suffered brainstem strokes.

The signal to move one’s arm originates in the motor cortex. In BrainGate participants, a four-by-four-millimeter, 96-electrode sensor implanted in the motor cortex eavesdrops on neuronal firing and transmits those signals via wires that pass through the skull and out of the top of the head. A computer receiving the information then decodes the neural chatter. To train the computer, the researchers ask the user to watch the robot arm perform different motions and to imagine moving the arm along with it.
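
As a rough illustration of what such a calibration step might involve (a minimal sketch only, not the BrainGate software; the stand-in data, the ridge parameter, and the fit_linear_decoder helper below are invented for illustration), a linear decoder can be fit from binned spike counts to the arm velocities the participant watched and imagined:

```python
# Hypothetical calibration sketch (not the BrainGate decoder): fit a linear map
# from binned spike counts on 96 electrode channels to the 3-D velocity of the
# robot arm observed during the "watch and imagine" training trials.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in calibration data: T time bins of spike counts plus the arm's known velocity.
T, n_channels = 2000, 96
spike_counts = rng.poisson(lam=3.0, size=(T, n_channels)).astype(float)
arm_velocity = rng.normal(size=(T, 3))  # placeholder for recorded x, y, z velocities

def fit_linear_decoder(X, Y, ridge=1.0):
    """Ridge-regularized least squares: find W, b so that Y is approximately X @ W + b."""
    X_mean, Y_mean = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - X_mean, Y - Y_mean
    W = np.linalg.solve(Xc.T @ Xc + ridge * np.eye(X.shape[1]), Xc.T @ Yc)
    b = Y_mean - X_mean @ W
    return W, b

W, b = fit_linear_decoder(spike_counts, arm_velocity)
```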

Once the system learns which neurons go with which movements, it can translate the incoming electrical impulses into commands for the robot, giving the paralyzed person control over the mechanical arm. Robert said of the experience, “I just imagined moving my own arm, and the [robotic] arm moved where I wanted it to go,” as the researchers related during a Nature press conference.
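
In rough outline, the closed-loop translation step described here could look like the following sketch, again hypothetical rather than the actual BrainGate interface; the fixed weights, the decode_step helper, and the commented-out send_velocity_to_arm call are all stand-ins:

```python
# Hypothetical closed-loop decoding sketch: turn each new bin of spike counts
# into a smoothed 3-D velocity command for the robot arm. W and b stand in for
# weights fitted during calibration; the arm interface is a placeholder.
import numpy as np

rng = np.random.default_rng(1)
n_channels = 96
W = rng.normal(scale=0.01, size=(n_channels, 3))  # stand-in for calibrated weights
b = np.zeros(3)

def decode_step(counts, prev_cmd, smoothing=0.8):
    """Blend the raw decoded velocity with the previous command to reduce jitter."""
    raw = counts @ W + b
    return smoothing * prev_cmd + (1.0 - smoothing) * raw

cmd = np.zeros(3)
for _ in range(50):                       # one iteration per time bin of neural data
    counts = rng.poisson(lam=3.0, size=n_channels).astype(float)
    cmd = decode_step(counts, cmd)
    # send_velocity_to_arm(cmd)           # placeholder for the robot-arm controller
```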

The study tried out two different robot arms, one designed as a prosthetic from DEKA Research and Development Corporation in Manchester, New Hampshire, and one larger, independent arm designed by co-senior author Patrick van der Smagt and colleagues at the Institute of Robotics and Mechatronics in Oberpfaffenhofen near Munich, Germany. To measure how well the two participants could control the arms, the researchers devised a test in which foam ball “targets” were raised above a tabletop. The goal was to reach out and grab the ball in fewer than 30 seconds. Robert succeeded in 62 percent of tries, while Cathy captured the ball in 46 percent of her attempts with the DEKA arm. Finally, Cathy tried six times to grasp and sip her coffee through a straw, and managed four drinks (see video below).

The Future of Brain-Computer Interfaces
In this video, Cathy Hutchinson controls a robotic arm to grasp a thermos of coffee, bring it to herself, and tip it toward her mouth to drink. It is the first time in 15 years she has served herself a beverage, and the first time a person has used a brain-computer interface (BCI) for such a complicated, yet practical, spatial task. While Cathy controlled the two-dimensional movement of the arm, the computer contributed the lifting power in this exercise. Scientists hope that the same technology may one day allow people to control their own limbs. Video courtesy of Nature

Another current trial participant is a man with ALS, which scientists are finding affects not only the spinal motor neurons, but brain neurons as well (see ARF related news story). Many neurologists might have predicted a person with ALS would be unable to generate the signal needed for BrainGate to work, said trial leader and first author Hochberg, who also holds an appointment at MGH. In fact, “We have been able to record lots of neurons from the motor cortex of people with advanced ALS, and we have also shown some cursor control.” Similarly, the researchers were impressed to see that Cathy’s motor cortex was still capable of sending arm-directing messages 15 years after she last moved her limbs.

Another practical concern for brain-computer interface (BCI) use is that the implanted equipment must last. The researchers were pleased to see that Cathy’s electrodes still worked five years after implantation. “This is a new longevity benchmark for brain-computer interfaces,” Donoghue said during the press conference. While the system was listening to fewer neurons than when it was brand new, Hochberg told ARF, the remaining signals were good enough to perform the reach-and-grasp task.

The intracortical recording used by Hochberg and Donoghue is one among a handful of approaches for BCIs (see ARF related news story; reviewed in Shih et al., 2012). The other main techniques are electroencephalography (EEG) and electrocorticography (ECoG). Each has its pros and cons. While EEG is convenient and noninvasive, with electrodes placed on the scalp outside the skull, it is a bit like listening at the closed door of a cocktail party. ECoG, in which electrodes are placed under the skull on the brain’s surface, is like standing in the room, but on the edge of the action. In contrast, an intracortical electrode hears the details of one group talking, but may miss the big picture.

While many scientists suspect that intracortical recordings are ideal, as of yet there is no proof that implanted electrodes make BCIs work any better than EEG, said Jerry Shih of the Mayo Clinic in Jacksonville, Florida, who was not involved with the current study. He and Hochberg agreed there would likely be a place for all modalities in different BCI devices, perhaps based on the task at hand or on personal preference. For example, researchers at NeuroSky in Hong Kong are testing an EEG communication device in people with ALS (Mak et al., 2012).

One criticism of intracortical recording has been that the area it reaches is so small, Shih said, but “I think Leigh and his group just demonstrated that maybe a small area gives us enough information to do a fair number of things.” While moving one’s arm typically engages thousands of neurons, the BrainGate was able to infer the plan from just dozens. The researchers might also try two electrode arrays in different areas, Hochberg told ARF.

As Cathy took her first sips of coffee, “the smile on her face was something that I believe our whole research team will never forget,” Hochberg said during the press conference. Nevertheless, “there is undoubtedly still work to do.” Two crucial areas for improvement are accuracy and speed, Shih noted. It took nearly 10 seconds, on average, for the participants to grab the foam balls. That is a long time to be groping for, say, a ringing telephone. Hochberg and Donoghue are also working to make the system wireless, so users do not have to connect to the BCI via a cranial socket. Eventually, the BrainGate researchers would like to marry the BCI with functional electrical stimulation of muscle, so that people’s own brain signals could drive their paralyzed limbs to move.

“Ultimately, the greatest obstacle to clinical applications of neural interfaces may come not from science or engineering, but from economics,” wrote Andrew Jackson of Newcastle University in Newcastle upon Tyne, U.K., in a commentary accompanying the Nature paper. BrainGate was originally run by the company Cyberkinetics, which folded in 2009. However, the trial has since received funding from the National Institutes of Health, the Department of Veterans Affairs, and the Department of Defense. The military is keen on BCI research, not only to help wounded soldiers, but also to link up instruments with the brains of healthy soldiers, Shih said. “From a funding standpoint, the field of BCI is in a better place than many other medical disciplines,” he said. Hochberg compares the rise of BCIs to that of deep-brain stimulation, which he noted once seemed an invasive, expensive device, but now helps more than 80,000 people (see ARF related news series).—Amber Dance

References

News Citations

  1. London, Ontario: More Than Motor Malaise at ALS Meeting
  2. Mind-machine Meld: Brain-computer Interfaces for ALS, Paralysis
  3. Deep-Brain Stimulation: Decade of Surgical Relief, Not Just for PD

Paper Citations

  1. Hochberg et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature. 2006 Jul 13;442(7099):164-71. PubMed.
  2. Velliste et al. Cortical control of a prosthetic arm for self-feeding. Nature. 2008 Jun 19;453(7198):1098-101. PubMed.
  3. Shih et al. Brain-computer interfaces in medicine. Mayo Clin Proc. 2012 Mar;87(3):268-79. PubMed.
  4. Mak et al. EEG correlates of P300-based brain-computer interface (BCI) performance in people with amyotrophic lateral sclerosis. J Neural Eng. 2012 Apr;9(2):026014. PubMed.

External Citations

  1. BrainGate

Further Reading

Papers

  1. Toward a brain-computer interface for Alzheimer's disease patients by combining classical conditioning and brain state classification. J Alzheimers Dis. 2012;31 Suppl:S211-20. PubMed.
  2. Neurofeedback and brain-computer interface clinical applications. Int Rev Neurobiol. 2009;86:107-17. PubMed.
  3. Bridging the brain to the world: a perspective on neural interface systems. Neuron. 2008 Nov 6;60(3):511-21. PubMed.
  4. The science of neural interface systems. Annu Rev Neurosci. 2009;32:249-66. PubMed.
  5. Direct control of paralysed muscles by cortical neurons. Nature. 2008 Dec 4;456(7222):639-42. PubMed.
  6. Brain-computer interfaces: communication and restoration of movement in paralysis. J Physiol. 2007 Mar 15;579(Pt 3):621-36. PubMed.

Primary Papers

  1. Hochberg et al. Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature. 2012 May 17;485(7398):372-5. PubMed.
  2. Jackson A. Neuroscience: Brain-controlled robot grabs attention. Nature. 2012;485(7398):317-8. PubMed.