Brain implant wirelessly translates thought to text with an incredible 94% accuracy
[Oct 12, 2022: Francis R. Willett, Stanford University]
Restoring communication: Two participants in the BrainGate clinical trial directly control a tablet computer through a brain-computer interface to chat with each other online. (CREDIT: BrainGate Collaboration)
In an important step toward a fully implantable intracortical brain-computer interface system, BrainGate researchers demonstrated the first human use of a wireless transmitter capable of delivering high-bandwidth neural signals.
Brain-computer interfaces (BCIs) are an emerging assistive technology, enabling people with paralysis to type on computer screens or manipulate robotic prostheses just by thinking about moving their own bodies. For years, investigational BCIs used in clinical trials have required cables to connect the sensing array in the brain to computers that decode the signals and use them to drive external devices.
Now, for the first time, researchers who are part of a longstanding collaboration called BrainGate have used artificial intelligence (AI) to interpret signals of neural activity generated during imagined handwriting.
The traditional cables are replaced by a small transmitter about 2 inches in its largest dimension and weighing a little over 1.5 ounces. The unit sits on top of a user’s head and connects to an electrode array within the brain’s motor cortex using the same port used by wired systems.
In this experiment, the man – referred to as T5 in the study and 65 years old at the time of the research – wasn't doing any actual writing: his hand, along with the rest of his limbs, had been paralyzed for several years.
But during the experiment, the man concentrated as if he were writing – effectively, thinking about making the letters with an imaginary pen and paper.
As he did this, electrodes implanted in his motor cortex recorded his brain activity. Algorithms running on an external computer then interpreted those signals, decoding the imaginary pen trajectories with which T5 mentally traced the 26 letters of the alphabet and some basic punctuation marks.
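For readers who want a concrete picture of that decoding step, here is a minimal Python sketch of the general idea: binned firing rates from the electrode array are mapped to a pen-tip velocity and integrated into a 2-D path. The array size, bin count, and the simple linear decoder are illustrative assumptions; the study itself used a recurrent neural network trained on the participant's data.

```python
# Minimal sketch of trajectory decoding from binned neural activity.
# NOT the study's actual decoder (the paper used a recurrent neural
# network); it only illustrates mapping firing rates to imagined
# pen-tip movement. All shapes, rates, and weights are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 192      # electrodes in the motor-cortex array (assumed)
n_bins = 100          # time bins covering one imagined character (assumed)

# Binned spike counts recorded while the participant imagines writing.
firing_rates = rng.poisson(lam=5.0, size=(n_bins, n_channels)).astype(float)

# A linear decoder; in practice the weights would be fit to training data.
W = rng.normal(scale=0.01, size=(n_channels, 2))   # maps rates -> (vx, vy)
b = np.zeros(2)

pen_velocity = firing_rates @ W + b                # decoded velocity per bin
pen_trajectory = np.cumsum(pen_velocity, axis=0)   # integrate to a 2-D path

print(pen_trajectory.shape)   # (100, 2): an x/y path for one character
```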
"This new system uses both the rich neural activity recorded by intracortical electrodes and the power of language models that, when applied to the neurally decoded letters, can create rapid and accurate text," says first author of the study Frank Willett, a neural prosthetics researcher from Stanford University.
Similar systems developed as part of the BrainGate collaboration have been transcribing neural activity into text for several years, but many earlier interfaces relied on different mental strategies for selecting characters – such as point-and-click typing with a computer cursor controlled by the mind.
A diagram of how the system works. (CREDIT: F. Willett et al., Nature, 2021, Erika Woodrum)
It wasn't known, however, how well the neural representations of handwriting – a more rapid and dexterous motor skill – might be retained in the brain, nor how well they might be leveraged to communicate with a brain-computer interface, or BCI.
Here, T5 showed just how much promise a virtual handwriting system could offer for people who have lost virtually all independent physical movement.
In tests, the man was able to achieve writing speeds of 90 characters per minute (about 18 words per minute), with approximately 94 percent accuracy (and up to 99 percent accuracy with autocorrect enabled).
The man's imagined handwriting, as interpreted by the system. (CREDIT: Frank Willett)
Not only is that rate significantly faster than previous BCI experiments (using things like virtual keyboards), but it's almost on par with the typing speed of smartphone users in the man's age group – which is about 115 characters or 23 words per minute, the researchers say.
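Those character-per-minute figures translate into words per minute using the common convention of roughly five characters per word; the quick arithmetic below reproduces the numbers quoted above.

```python
# Quick arithmetic behind the reported rates, assuming the common
# convention of about 5 characters (including spaces) per word.
bci_chars_per_min = 90
phone_chars_per_min = 115   # typical smartphone typing in this age group

print(bci_chars_per_min / 5)     # 18.0 words per minute via the BCI
print(phone_chars_per_min / 5)   # 23.0 words per minute on a smartphone
```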
"We've learned that the brain retains its ability to prescribe fine movements a full decade after the body has lost its ability to execute those movements," Willett says.
"And we've learned that complicated intended motions involving changing speeds and curved trajectories, like handwriting, can be interpreted more easily and more rapidly by the artificial-intelligence algorithms we're using than can simpler intended motions like moving a cursor in a straight path at a steady speed."
In essence, the researchers say, the letters of the alphabet differ markedly from one another in shape, so the AI can decode the user's intention more quickly as each character is drawn than it can in BCI systems whose inputs, such as straight cursor movements, are far less distinct from one another.
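The toy comparison below illustrates that intuition under simplified assumptions: two curved, speed-varying "letter-like" trajectories sit much farther apart from each other than two straight, constant-speed cursor movements, which is what makes them easier to tell apart when the decoded signal is noisy. The curves are invented for the example and are not data from the study.

```python
# Toy illustration: distinctive, curved trajectories (like letters) are
# more separated from one another than straight, constant-speed cursor
# movements, so a classifier distinguishes them more easily under noise.
import numpy as np

t = np.linspace(0, 2 * np.pi, 50)

# Two invented "letter-like" templates with curvature and speed changes.
letter_a = np.stack([np.sin(t), np.cos(2 * t)], axis=1)
letter_b = np.stack([np.sin(2 * t), np.cos(t)], axis=1)

# Two straight, constant-speed "cursor" movements in nearby directions.
cursor_a = np.stack([t, 0.9 * t], axis=1) / (2 * np.pi)
cursor_b = np.stack([t, 1.0 * t], axis=1) / (2 * np.pi)

def separation(x, y):
    # Mean Euclidean distance between two time-aligned trajectories.
    return float(np.mean(np.linalg.norm(x - y, axis=1)))

print("letter separation:", separation(letter_a, letter_b))
print("cursor separation:", separation(cursor_a, cursor_b))
# The larger the separation, the less likely noisy neural estimates are
# to be confused with one another.
```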
Despite the potential of this first-of-its-kind technology, the researchers emphasize that the current system is only a proof of concept: it has been shown to work with just one participant, so it is not yet a complete, clinically viable product.
The next steps in the research could include training other people to use the interface, expanding the character set to include more symbols (such as capital letters), refining the sensitivity of the system, and adding more sophisticated editing tools for the user.
There's still plenty of work to be done, but this could be an exciting new development, one that gives the ability to communicate back to people who have lost it.
"Our results open a new approach for BCIs and demonstrate the feasibility of accurately decoding rapid, dexterous movements years after paralysis," the researchers write.
"We believe that the future of intracortical BCIs is bright."
The findings are reported in Nature.
For more science and technology stories check out our New Innovations section at The Brighter Side of News.
Note: Materials provided above by Stanford University. Content may be edited for style and length.
Like these kinds of feel-good stories? Get The Brighter Side of News' newsletter.