A pair of new experiments helped people with paralysis communicate audibly in close to real time.
In 2005, Ann Johnson suffered a stroke that left her severely paralyzed and unable to speak. She was 30.
At best, she could make sounds like “ooh” and “ah,” but her brain was still firing off signals.
Now, in a scientific milestone 18 years after Johnson’s stroke, an experimental technology has translated her brain signals into audible words, enabling her to communicate through a digital avatar.
The technology, developed by researchers at the University of California, San Francisco, and the University of California, Berkeley, relies on an implant placed on the surface of Johnson’s brain in regions associated with speech and language.
The implant, which Johnson received in an operation last year, contains 253 electrodes that intercept brain signals from thousands of neurons. During the surgery, doctors also installed a port in Johnson’s head that connects to a cable, which carries her brain signals to a computer bank.
The computers use artificial intelligence algorithms to translate the brain signals into sentences that get spoken through a digitally animated figure. So when Johnson tried to say a sentence like “Great to see you again,” the avatar on a nearby screen uttered those words out loud.
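Conceptually, the pipeline the article describes runs in stages: electrode signals come off the implant, an AI model decodes them into text, and the text is voiced and animated on the avatar. The toy sketch below is purely illustrative and is not the researchers' actual system; the vocabulary, function names, and the idea of decoding to simple word IDs are all assumptions made for the example.

```python
from typing import List

# Hypothetical mini-vocabulary, for illustration only (the real system
# reportedly drew on a 1,024-word vocabulary).
VOCAB = {0: "great", 1: "to", 2: "see", 3: "you", 4: "again"}

def decode_signals(word_ids: List[int]) -> str:
    """Stand-in for the AI decoder: map (simulated) per-word classifier
    outputs to a sentence. The real decoder works on raw neural signals."""
    return " ".join(VOCAB[i] for i in word_ids)

def speak_through_avatar(sentence: str) -> str:
    """Stand-in for the final stage: text-to-speech plus facial animation
    rendered on the digital avatar."""
    return f"[avatar says] {sentence.capitalize()}"

# Pretend output of the neural classifier for one attempted sentence.
signals = [0, 1, 2, 3, 4]
sentence = decode_signals(signals)
print(speak_through_avatar(sentence))
```

The sketch collapses what is, in the actual research, a far harder problem: the decoding step maps continuous multi-electrode recordings to words, not clean integer labels.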
The system appears to be significantly faster and more accurate than previous technologies that attempted similar feats, and it allowed Johnson to communicate using a relatively expansive vocabulary.
The researchers used a recording of Johnson speaking at her wedding to personalize the avatar’s voice. The system also converted Johnson’s brain signals into facial movements on the avatar, such as pursed lips, and emotional expressions, like sadness or surprise.
The results of the experiment were published Wednesday in the journal Nature.
Dr. Edward Chang, an author of the study who performed Johnson’s surgery, said he was “absolutely thrilled” to watch her communicate through the avatar.
“There’s nothing that can convey how satisfying it is to see something like this actually work in real time,” Chang, the chair of neurological surgery at UCSF, said at a news briefing.
The technology converted Johnson’s speech attempts into words at nearly 80 words per minute. Chang said the natural rate of speech is around 150 to 200 words per minute. The system had a median accuracy of around 75% when Johnson used a 1,024-word vocabulary.
In a feedback survey, Johnson wrote that she was emotional upon hearing the avatar speak in a voice similar to hers.
“The first 7 years after my stroke, all I used was a letterboard. My husband was so sick of having to get up and translate the letterboard for me,” she wrote.
Going into the study, she said, her moonshot goal was to become a counselor and use the technology to talk to clients.
“I think the avatar would make them more at ease,” she wrote.
The technology isn’t wireless, however, and it hasn’t yet advanced enough to be integrated into Johnson’s daily life.