Melding Mind and Machine

  • Published 2 Aug 2016
  • Reviewed 2 Aug 2016
  • Author Stephani Sutherland
  • Source BrainFacts/SfN
Erik Sorto takes a drink with a robotic arm that he controls with his thoughts.
Electrodes from an electroencephalography (EEG) machine, once the gold standard for recording brain activity, cover a man’s head. New technologies can provide researchers with information as detailed as the signals produced by individual neurons.
Society for Neuroscience

In 2015, Erik Sorto did something he thought he’d never do again: he raised a bottle of beer to his lips and took a drink. Paralyzed below the sternum for more than a decade after a gunshot damaged his spinal cord, Sorto didn’t use his own hand to hold his beverage; instead, he controlled a robotic arm with his thoughts.

Sorto’s feat represents a vast improvement in brain-machine interfaces. His neuroprosthetic device pairs microelectrodes implanted in his brain with a computer algorithm that translates those “thoughts” into movement. This remarkable achievement was made possible by a host of neuroscience advances that draw on computing and mathematics along with biology and engineering to create new understanding, new tools, and, ultimately, new treatments for brain disorders and disabilities. All of these areas are priorities for the BRAIN Initiative, one of whose key goals is to leverage engineering, physical sciences, and math to speed advances in brain research.

Previous neuroprostheses recorded activity from neurons in the motor cortex, but the resulting instructions produced jerky movements. For the new robotic arm, Richard Andersen of the California Institute of Technology recorded activity from the posterior parietal cortex, a region of the brain responsible for planning movements. The result was a smoothly moving arm capable of executing more complex tasks.

Sorto’s robotic arm represents the cutting edge in neuroprosthetics. To get to this point, the field of neuroscience had to meet two critical challenges: recording neural activity and making sense of it. Advances in recording brain activity and in computational neuroscience have not only driven human applications like neuroprosthetics; they have helped illuminate the basic workings of the brain in ways researchers have only dreamt of, says Sydney Cash, a researcher at Massachusetts General Hospital (MGH).

Gathering data…

For decades, neuroscientists have recorded electrical activity from individual neurons. These studies have been crucial to understanding how brain cells communicate with one another, even though the technique is largely limited to experiments in a petri dish or in animals. Getting a complete picture of how brain activity gives rise to complex thoughts requires data from populations of neurons. “A single neuron cannot possibly carry all the info contained even in one small area of the brain; instead, information is represented by many, many neurons of different types,” Cash says.

For nearly a century, researchers have employed a technology at the opposite end of the spectrum: electroencephalography (EEG). Capable of recording the summed activity of millions of neurons at once, EEG involves placing electrodes on the scalp of a human or animal and provides researchers with an affordable, non-invasive way to record brain activity. “An EEG gives you an idea of what large numbers of neurons are doing, but it can’t divide out signals from individual neurons,” Cash says, noting that researchers can either record from a single neuron or from huge numbers of them. “Both are useful, but neither contains all the information you might want.”
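
A few lines of simulation show why the summed signal cannot be unmixed: once thousands of spike trains are added together, no individual neuron’s contribution is recoverable from the total alone. This is a toy illustration with invented numbers, not a model of real EEG:

    import numpy as np

    rng = np.random.default_rng(2)

    # 10,000 toy "neurons," each a sparse train of 0/1 spikes over 1,000 samples.
    spikes = (rng.random((10_000, 1_000)) < 0.01).astype(float)

    # A scalp electrode sees only the summed activity, not individual trains.
    eeg_like = spikes.sum(axis=0)
    print("summed signal, first 10 samples:", eeg_like[:10])
    # Many different sets of spike trains produce the same sum, so the
    # individual trains cannot be recovered from eeg_like alone.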

In the past decade, researchers have been trying to hit a sweet spot where they can get detailed information about how populations of neurons operate. Critical to that effort has been the development of multi-electrode arrays: devices that pick up signals from hundreds of individual nearby neurons at once. These arrays can be used in a dish to record from transplanted neurons, but they can also be implanted in animals and even humans, as was the case with Erik Sorto.
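
To make that concrete, here is a minimal, hypothetical sketch of the first step such a recording system might perform: scanning each electrode’s voltage trace for the sharp deflections that mark a nearby neuron firing. The threshold rule and the simulated data are illustrative assumptions, not the pipeline used in any study described here:

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for one second of noise recorded on a 96-electrode array at 30 kHz.
    n_channels, n_samples = 96, 30_000
    data = rng.normal(0.0, 1.0, size=(n_channels, n_samples))

    # Inject artificial "spikes" so the detector has something to find;
    # extracellular spikes typically appear as sharp negative deflections.
    for ch in range(n_channels):
        data[ch, rng.integers(0, n_samples, size=5)] -= 8.0

    def detect_spikes(trace, k=4.5):
        """Return sample indices where the trace dips below k robust
        standard deviations, a common simple detection rule."""
        sigma = np.median(np.abs(trace)) / 0.6745  # robust noise estimate
        return np.flatnonzero(trace < -k * sigma)

    spike_times = {ch: detect_spikes(data[ch]) for ch in range(n_channels)}
    print("channel 0 spike samples:", spike_times[0])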

It’s a level of information Cash says is critical to improving brain function in both health and disease. “With new technology supported by the BRAIN Initiative, we are finding better ways to record from large numbers of neurons. There’s a huge amount of information to be mined if we can get to it.”

…and making sense of it

Decoding all that information recorded with new technologies and putting it to use is a computational tour de force.

“There are an estimated 86 billion neurons in the brain, and each one is a very noisy purveyor of information,” says Emery Brown, a researcher at MGH and the Massachusetts Institute of Technology. “The information required to execute an action, a movement, a thought, is conveyed by groups of neurons.”

After recording from the large numbers of neurons that contribute to a particular action, researchers create computer programs, or algorithms, to interpret that information, which ultimately conveys a command or perhaps generates a thought.
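
As a toy illustration of that decoding step, and not the algorithm Andersen’s team actually used, one can fit a simple linear map from the firing rates of a simulated population to an intended movement, then apply it to new activity. Every number and tuning property below is invented:

    import numpy as np

    rng = np.random.default_rng(1)

    n_neurons, n_trials = 50, 500
    tuning = rng.normal(size=(n_neurons, 2))     # each neuron's hidden preference
    intended = rng.normal(size=(n_trials, 2))    # intended (x, y) hand velocity
    rates = intended @ tuning.T + rng.normal(0.0, 0.3, size=(n_trials, n_neurons))

    # Least-squares fit: a decoder that maps population firing rates back
    # to the intended velocity that produced them.
    decoder, *_ = np.linalg.lstsq(rates, intended, rcond=None)

    # Decode a new pattern of activity into a movement command for the arm.
    new_rates = np.array([1.0, -0.5]) @ tuning.T
    print("decoded velocity:", new_rates @ decoder)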

Brown and colleagues developed an algorithm to make anesthesia safer and more practical. A medically induced coma represents the deepest state of anesthesia, in which the brain is mostly silent with intermittent bursts of activity. “Following a head injury, we might place someone in a medical coma so the brain can rest, swelling can go down; we put the brain into a low-energy state to allow it to heal,” Brown says.

Currently, an anesthesiologist or nurse must continually monitor a patient’s brain state and deliver just the right amount of drug. But in a 2013 study in rodents, Brown used an algorithm to translate the data from an EEG recording into precise, real-time automated drug delivery to maintain an anesthetized state. The technology could reduce the risks of drug toxicity and the cost of care.
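
The closed-loop logic can be sketched in a few lines. The toy controller below, which is not Brown’s published design, compares an EEG-derived index of anesthetic depth against a target and nudges the infusion rate up or down; the brain-and-drug dynamics here are entirely made up for illustration:

    def run_closed_loop(target=0.8, steps=50, gain=0.5, decay=0.9):
        """Simulate holding a target level of anesthetic depth with a
        simple feedback controller. All dynamics are invented."""
        depth, rate = 0.0, 0.0
        for step in range(steps):
            error = target - depth                # distance from desired state
            rate = max(0.0, rate + gain * error)  # adjust infusion, never negative
            depth = decay * depth + 0.1 * rate    # toy model of drug effect on EEG
            if step % 10 == 0:
                print(f"step {step:2d}: depth={depth:.3f}, rate={rate:.3f}")

    run_closed_loop()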

Precisely controlling a deep state of anesthesia requires an algorithm capable of handling significant complexity. An algorithm for a natural behavior like foraging is more complex still. Xaq Pitkow, a joint faculty member of Baylor College of Medicine and Rice University, and Dora Angelaki, professor of neuroscience at Baylor, intend to tackle that problem with National Science Foundation funds awarded through the BRAIN Initiative.

Pitkow and Angelaki will train mice to forage in a virtual reality environment while recording from four areas of the brain associated with vision and navigation. The team will then use the data collected to understand how different parts of the brain cooperate, adapt, and drive behavior. Pitkow told Rice University that neuroscientists usually examine neural computations during simple tasks; today’s tools and understanding, however, allow them to consider mathematical models that account for the complexity of fluid, adaptive behaviors.

Brown stressed the importance of better algorithms and tools. “We need the tools that are coming out of the BRAIN Initiative to understand neurons’ individual characteristics: are they inhibitory or excitatory? Who are they talking to and how do they talk the way they do? If we can couple the data-handling advances with an understanding of who these neurons are, that will add a level of understanding that will let us get to these incredibly important basic science questions.”

Those answers feed back into efforts to develop ever more powerful neuroprosthetics. While Erik Sorto controls a robotic arm with his thoughts, he can’t feel if the glass is slick with condensation and slipping from his grip. For all his brain’s effort to move that arm, it gets no feedback.

"Closing the circuit to take signals from the arm and send them back to the brain is really important to carrying out daily activities,” said," Justin Sanchez, the director of DARPA's Biological Technologies Office. DARPA, a BRAIN Initiative funder, has been working to develop revolutionary prosthetics used by service members who’ve suffered arm amputations. “Being able [to sense touch] is a significant functional benefit for people.”

Sanchez highlighted just such DARPA-funded research at its “Wait, What?” event in St. Louis in 2015: a robotic arm that takes signals from pressure sensors on the tips of its fingers and relays them to a neuroprosthetic implanted in the primary sensory cortex of a person’s brain. The device allowed a quadriplegic man to sense which finger of the robotic arm was being touched.
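
The feedback direction of that loop can be sketched the same way. In the hypothetical snippet below, a normalized pressure reading from a robotic fingertip is mapped to a stimulation amplitude on an electrode assigned to that finger; the electrode map, amplitude range, and encoding are invented, since the article does not describe DARPA’s actual scheme:

    # Hypothetical finger-to-electrode assignment in the sensory cortex implant.
    FINGER_ELECTRODES = {"thumb": 0, "index": 1, "middle": 2}

    def pressure_to_stim(finger, pressure, max_amp_ua=60.0):
        """Scale a normalized fingertip pressure (0-1) to a stimulation
        amplitude in microamps on that finger's electrode."""
        amp = max(0.0, min(1.0, pressure)) * max_amp_ua
        return FINGER_ELECTRODES[finger], amp

    electrode, amp = pressure_to_stim("index", 0.4)
    print(f"stimulate electrode {electrode} at {amp:.0f} microamps")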

The demonstration required wires running from a computer to a connector on the person’s head. Still, Sanchez envisions a time when neuroprosthetics will not only close the loop between the brain and a prosthetic without external wires, but may also help people overcome memory problems caused by damage from traumatic brain injury.

Cash agrees. One day, brain-machine interfaces may restore sensation after spinal cord injury or stroke, vision to the blind, and hearing to the deaf, among other things. “I think we will see a lot of those things in our lifetime,” he says. “It’s a brave new world we are living in.”


This article was produced in conjunction with the BRAIN Initiative Alliance, a supporting partner of BrainFacts.org.
