
The most advanced mind-controlled devices being tested in humans rely on tiny wires inserted into the brain. Now researchers have paved the way for a less invasive option. They’ve used ultrasound imaging to predict a monkey’s intended eye or hand movements—information that could generate commands for a robotic arm or computer cursor. If the approach can be improved, it may offer people who are paralyzed a new means of controlling prostheses without equipment that penetrates the brain.

“This study will put [ultrasound] on the map as a brain-machine interface technique,” says Stanford University neuroscientist Krishna Shenoy, who was not involved in the new work. “Adding this to the toolkit is spectacular.”

Doctors have long used sound waves with frequencies beyond the range of human hearing to create images of our innards. A device called a transducer sends ultrasonic pings into the body; the echoes that bounce back reveal the boundaries between different tissues and fluids.
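The imaging principle comes down to timing: sound travels through soft tissue at roughly 1,540 meters per second, so the delay before an echo returns tells the scanner how deep the reflecting boundary lies. A minimal sketch of that time-of-flight calculation (the 65-microsecond delay is a made-up example):

```python
# Estimate the depth of a reflecting tissue boundary from the
# round-trip time of an ultrasonic ping (time-of-flight imaging).

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, standard average for soft tissue

def echo_depth_m(round_trip_s: float) -> float:
    """Depth of the reflector: the pulse travels down and back,
    so halve the total path length (speed * time)."""
    return SPEED_OF_SOUND_TISSUE * round_trip_s / 2.0

# Example: an echo returning after 65 microseconds comes from a
# boundary about 5 cm deep (hypothetical delay, for illustration).
print(f"{echo_depth_m(65e-6) * 100:.1f} cm")  # -> 5.0 cm
```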

About a decade ago, researchers found a way to adapt ultrasound for brain imaging. The approach, known as functional ultrasound, uses a broad, flat plane of sound instead of a narrow beam to capture a large area more quickly than with traditional ultrasound. Like functional magnetic resonance imaging (fMRI), functional ultrasound measures changes in blood flow that indicate when neurons are active and expending energy. But it creates images with much finer resolution than fMRI and doesn’t require participants to lie in a massive scanner.
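The speed gain comes from how many transmit events each image needs: a conventional scanner sweeps a focused beam line by line, and each line must wait for its echoes to return, while a single plane-wave transmit insonifies the whole field at once. A back-of-the-envelope comparison (the 5-centimeter depth and 128-line count are typical illustrative values, not figures from the study):

```python
# Compare achievable frame rates for line-by-line (focused) scanning
# versus plane-wave imaging. Each transmit must wait for echoes to
# return from the deepest point before the next transmit can fire.

SPEED_OF_SOUND = 1540.0   # m/s in soft tissue
DEPTH = 0.05              # imaging depth in meters (assumed: 5 cm)
LINES_PER_FRAME = 128     # focused-beam lines per image (assumed, typical)

round_trip = 2 * DEPTH / SPEED_OF_SOUND        # time per transmit event

focused_frame_rate = 1 / (round_trip * LINES_PER_FRAME)
plane_wave_frame_rate = 1 / round_trip         # one transmit per frame

print(f"focused:    {focused_frame_rate:,.0f} frames/s")    # ~120
print(f"plane wave: {plane_wave_frame_rate:,.0f} frames/s") # ~15,400
```

In practice, plane-wave systems compound several tilted transmits into each frame to recover image quality, but the frame rate still far exceeds line-by-line scanning, fast enough to track blood-flow changes across a large slice of brain.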

The technique still requires removing a small piece of skull, but unlike implanted electrodes that read neurons’ electrical activity directly, it doesn’t involve opening the brain’s protective membrane, notes neuroscientist Richard Andersen of the California Institute of Technology (Caltech), a co-author of the new study. Functional ultrasound can read from regions deep in the brain without penetrating the tissue.

Still, gauging neural activity from a distance means sacrificing some speed and precision, says Andersen’s co-author, Caltech biochemical engineer Mikhail Shapiro. Compared with electrodes’ readings, functional ultrasound provides “a less direct signal,” he says, so “there was a question of how much information [ultrasound images] really contain.” The images could reveal neural activity as the brain prepared for a movement. But was there enough detail in that signal for a computer to decode the intended move?

To find out, the researchers slotted small ultrasound transducers, each roughly the size and shape of a domino, into the skulls of two rhesus macaque monkeys. Each device—attached by a wire to a computer—aimed sound waves down into a region of the brain called the posterior parietal cortex, which is involved in planning movements.

The monkeys were trained to focus their eyes on a small dot in the center of a screen while a second dot briefly flashed on the left or right. When the central dot disappeared, the animals moved their eyes to the point where the second dot had recently flashed. In another set of experiments, they reached out and moved a joystick, instead of their eyes, toward that point.

A computer algorithm then translated the ultrasound data into guesses about the monkeys’ intentions. That algorithm could determine when the animals were preparing to move and whether they were planning an eye movement or an arm reach. The scientists could predict whether a movement would be left or right with about 78% accuracy for eye movements and 89% accuracy for reaching, they report today in Neuron.
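The story doesn't detail the decoding pipeline, but the underlying task—labeling each trial's activity images as a leftward or rightward plan—is a standard supervised-classification problem. A hedged sketch of one conventional approach (dimensionality reduction followed by a linear classifier, evaluated with cross-validation; the data here are random placeholders, so it scores near chance):

```python
# Generic single-trial decoder sketch: flatten each trial's activity
# map, compress with PCA, and classify left vs. right with a linear
# model. This illustrates the kind of decoder used in such studies,
# not the paper's actual pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(seed=0)

# Placeholder data: 200 trials of 64x64 "activity maps" (invented shapes).
X = rng.normal(size=(200, 64 * 64))
y = rng.integers(0, 2, size=200)  # 0 = leftward plan, 1 = rightward plan

decoder = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())

# 10-fold cross-validation: fit on 90% of trials, test the held-out 10%.
scores = cross_val_score(decoder, X, y, cv=10)
print(f"decoded left vs. right with {scores.mean():.0%} accuracy")
```

Trained on real planning-period images rather than noise, a classifier of this general kind is what would produce accuracies like the 78% and 89% figures reported above.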

Two previous studies have used functional ultrasound data from the monkey brain to reconstruct what the animals were seeing or to track their eye movements. But doing that required averaging signals across long time periods or multiple movements. In the new study, the researchers collected enough data to make a prediction in each run of the experiment—each time the monkey planned a move.

That’s an important feature, says Maureen Hagan, a neuroscientist at Monash University who has studied how the brain orchestrates movement. The user of a robotic arm would want to think about their intended movement just once to get the arm moving, for example. “You don’t want subjects to have to do many [attempted movements] to decode their intentions.”

A key next step will be to use the computer predictions in real time to guide a robot hand or a cursor, Shenoy says. He adds that functional ultrasound “has a ways to go before it can begin to approach the level of what implanted technologies [can do],” in terms of both speed and the complexity of the movements it can decode.

For example, electrode implants can already decode intended arm movements in many directions—not just left and right. But some patients might prefer a prosthesis that connects them to a computer without penetrating their brain. “It’s just so personal,” Shenoy says. “Patients want options.”

Because blood flow signals are more sluggish than electrical ones, speed is an inherent limitation of functional ultrasound, adds neuroscientist Emilie Macé of the Max Planck Institute of Neurobiology. The researchers needed data from a roughly 2-second period to decode the monkeys’ movement planning, notes Macé, who helped develop the ultrasound technology in the lab of physicist Mickael Tanter of INSERM, the French biomedical research agency—a co-author on the new study. But ultrasound could still guide a robotic arm, she says, as long as a computer could quickly direct the arm’s fine motor movements from the user’s cue.

Macé foresees many future improvements to the technique, including making it gather more information by imaging 3D chunks of tissue instead of a flat plane. “The technology is absolutely not at its full potential yet.”

Originally published at Science (sciencemag.org).