Man controls a robotic arm with his thoughts

The day Erik Sorto reached out to grab himself an ice cold beer was a major step forward for brain science.

Sorto is quadriplegic and has been unable to move his own limbs since a bullet wound severed his spinal cord 12 years ago.

In the years since his injury, Sorto managed to attend college and write a gang prevention book, but one thing that frustrated the 34-year-old most of all was that he couldn’t pick up a cold one and drink it at his own pace without having to ask a caregiver for help.

“Drinking a beer by myself gives me hope that somehow in the future I can regain a lot more independence,” he said.

Sorto still can’t move his own arms and legs, but after two years of hard and often frustrating work, he is able to use his thoughts to control a robotic arm. And he is able to do this with enough dexterity that he can now tell the arm to pick up a bottle of beer, bring it to his mouth, hold it there while he sips from a straw and set it down when he is done.

The neuroengineering that made his achievement possible was described this week in the journal Science.

The first time Sorto lifted the beer to his mouth, he was so exhilarated that he lost his concentration and dropped the bottle, spilling the beverage all over himself.

Sorto wasn’t the only one overjoyed by his beer-drinking feat. His excitement was shared by a cadre of scientists and engineers who wanted to know whether he could get a robotic arm to take instructions from a part of the brain called the posterior parietal cortex.

So far, most of the work in the field known as neuroprosthetics has been focused on the region of the brain called the motor cortex, an area that fires off directions to our muscles when we walk, type on a keyboard or take a sip of water.

The posterior parietal cortex, on the other hand, is involved in the planning of those actions. For example, the PPC does not execute the individual finger movements that allow us to type “HELLO,” but it tells the motor cortex that typing “HELLO” is our intent.

“It’s one step further removed from the muscles,” said Tyson Aflalo, a biological engineer at Caltech who led the effort to get the robotic arm to communicate with the PPC. “It performs a higher, more abstract function.”

Using the motor cortex to control a robotic arm often results in jerky movements because the subject has to think about each individual action: move arm down, open hand, close hand, bring hand up.

But controlling the arm with the higher-level signals that are generated in the PPC – like reach down, pick up and bring to my mouth – would be better because a sufficiently smart robot could read the subject’s intent and program those movements more fluidly, the researchers figured.

“By coding the goal of the movement, you get it to be smoother because you don’t have to worry about controlling all the details,” said Richard Andersen, a neuroscientist at Caltech and the senior author of the study.
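
To make Andersen’s point concrete, here is a minimal sketch of what goal-level control could look like, written in Python. A hypothetical linear readout turns a vector of firing rates into an intended 3-D goal position, and the robot’s own planner (a simple minimum-jerk profile here) fills in the smooth path. None of this code comes from the study; the decoder weights, the planner and the simulated firing rates are illustrative assumptions, with only the 96-channel figure borrowed from the implants described later in this article.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 96  # channels on one electrode grid (figure taken from this article)
# Hypothetical decoder weights; in practice these would come from calibration.
W = rng.normal(size=(3, N_NEURONS)) * 0.05

def decode_goal(firing_rates):
    """Map a vector of firing rates to an intended (x, y, z) goal position."""
    return W @ firing_rates

def plan_trajectory(start, goal, steps=50):
    """Let the robot fill in a smooth path to the decoded goal.

    A minimum-jerk time profile stands in here for whatever motion
    planner the real arm uses.
    """
    t = np.linspace(0.0, 1.0, steps)
    s = 10 * t**3 - 15 * t**4 + 6 * t**5  # smooth scaling from 0 to 1
    return start + s[:, None] * (goal - start)

# Decode one goal from a simulated burst of PPC activity, then hand the
# details of the reach off to the planner.
rates = rng.poisson(lam=5.0, size=N_NEURONS).astype(float)
goal = decode_goal(rates)
path = plan_trajectory(start=np.zeros(3), goal=goal)
print("decoded goal:", goal.round(2), "| waypoints:", len(path))
```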

Andersen’s previous work showed that it is possible to read the intent of a monkey’s action by eavesdropping on neurons in the PPC, but the approach had never been tested in humans.

“I was worried if we could activate them as easily as we do in the monkeys,” he said. “It’s a bit like exploring new territory.”

In April 2013, a surgical team at Keck Medicine of USC implanted two small electrode grids in the PPC region of Sorto’s brain. These allowed a computer to record the electrical signals emanating from the neurons there.

Each of the grids is about the size of a penny and has 96 electrodes coming off it. There’s also a bundle of wires that reach up through Sorto’s skull.

Two and a half weeks after the surgery, Sorto was ready for his first session with the research team. The scientists began by making a map of his neural signals. To do this, they had him watch movements of the robotic limb and imagine he was controlling it.

“You are basically teaching your decoding algorithm what the neural activity means,” Aflalo said. “The neurons are speaking, but you need to learn what they are saying.”

The process took just a few minutes. By the end of the first session, Sorto was already able to control the robotic arm with his thoughts alone.
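
The article does not spell out the decoding algorithm itself, but the calibration Aflalo describes, pairing the movements Sorto imagined with the firing rates recorded at the same time, can be sketched with something as simple as ridge regression. The code below is a hypothetical stand-in using simulated data, not the team’s actual decoder.

```python
import numpy as np

rng = np.random.default_rng(1)

N_NEURONS = 96   # channels per implanted grid (figure from this article)
N_TRIALS = 200   # hypothetical number of calibration windows

# Simulated calibration data: the (x, y, z) movement imagined during each
# observed reach, and the firing rates recorded at the same time.
true_map = rng.normal(size=(N_NEURONS, 3))
intended = rng.normal(size=(N_TRIALS, 3))
rates = intended @ true_map.T + rng.normal(scale=2.0, size=(N_TRIALS, N_NEURONS))

# Fit a ridge-regression decoder mapping firing rates -> intended movement.
lam = 1.0
A = rates.T @ rates + lam * np.eye(N_NEURONS)
decoder = np.linalg.solve(A, rates.T @ intended)   # shape: (N_NEURONS, 3)

# After calibration, a fresh window of activity can be read out directly.
new_rates = rates[0]
print("decoded intent:", (new_rates @ decoder).round(2))
print("imagined intent:", intended[0].round(2))
```

Once a decoder like this is fit, each new window of neural activity can be read out immediately, which helps explain how a calibration session can take only minutes.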

Over the next two years, Sorto met with the research team at Rancho Los Amigos National Rehabilitation Center in Downey four times a week for about four hours a day.

“In the beginning it was stressful,” Sorto said. “But after two-plus years, it is a lot easier. I feel me and the arm are connecting a lot better.”

One of the most difficult parts for Sorto and the research team was that the electrodes did not always encounter the same neurons.

“Certain neurons would disappear but other neurons would appear,” Aflalo said. “Some days we ended up with neurons that were better at enabling him to control things, and other days we would end up with neurons that were less good.”

The research team is not yet sure why these changes occur, but one possibility is that the electrodes can shift their position because they are not secured to any part of the brain.

Despite these occasional setbacks, Sorto has been able to control the robotic arm well enough to point left and right, pick things up, make pieces of art, and even make smoothies for the other members of the team using nothing but the intentions in his mind.

Krishna Shenoy of the Neural Prosthetic Systems Lab at Stanford University said the Caltech-USC team was the first to show that the PPC region of the brain provides useful signals for neural prosthetic control.

He added that one of the potential benefits of using the PPC is that it might ultimately enable a much wider range of control.

“We live in an era of smart homes where we want everything in our house to be tied back to our smartphones,” said Shenoy, who wasn’t involved in the Science study. “That means that a direct brain-to-Internet-of-Things interface becomes possible.”

For example, he said, a person might one day be able to think “television, turn on” or “email, open” and those actions would happen.

“The planning activity in the PPC is much more likely to give you that future possibility,” he said.

