Scientists have taken a step forward in deciphering what a person is saying just by looking at their brain activity as they speak. Their algorithms translated patterns of brain activity into text in real time, with an error rate below three percent.
Previous brain-machine interfaces had only limited success in decoding neuronal activity. The new research, published in the journal Nature Neuroscience, paints a different picture, the BBC reports.
Until now, scientists had been able to decode only fragments of spoken words, or a small percentage of the words contained in certain phrases. Machine-learning expert Joseph Makin of the University of California and his colleagues therefore set out to improve the accuracy of the decoding.
Four volunteers read sentences aloud while electrodes recorded their brain activity. That activity was then fed into a computer system, which built a representation of the features that regularly appear in the data. These patterns correspond to recurring features of speech, such as vowels, consonants, or commands to parts of the mouth.
The second part of the system deciphers this representation, word by word, to form sentences.
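The two-stage idea, encode recorded neural features into a compact representation, then decode it word by word, can be illustrated with a toy sketch. This is not the authors' model (which used recurrent neural networks); the vocabulary, feature dimension, and synthetic "recordings" below are all invented for illustration.

```python
# Toy sketch of a two-stage decode pipeline (illustrative only, not the
# published model). Stage 1 pools a recording into per-word segments;
# stage 2 matches each segment to the nearest word "prototype".
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "bird", "sang"]   # hypothetical 3-word vocabulary
DIM = 8                           # hypothetical neural-feature dimension

# Stand-in for learned structure: each word has a prototype feature pattern.
prototypes = {w: rng.normal(size=DIM) for w in VOCAB}

def encode(recording):
    """Collapse a (timesteps x DIM) recording into per-word segments
    by mean pooling over equal-length chunks."""
    chunks = np.array_split(recording, len(VOCAB))
    return [c.mean(axis=0) for c in chunks]

def decode(segments):
    """Decode each segment to the nearest word prototype."""
    words = [min(VOCAB, key=lambda w: np.linalg.norm(seg - prototypes[w]))
             for seg in segments]
    return " ".join(words)

# Simulate a noisy recording of the sentence "the bird sang":
recording = np.vstack([
    prototypes[w] + 0.1 * rng.normal(size=(5, DIM)) for w in VOCAB
])
print(decode(encode(recording)))  # → the bird sang
```

With small noise, mean pooling recovers each prototype closely, so the nearest-neighbour decoder reads the sentence back correctly; the real system faces far noisier signals and a much larger vocabulary.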
The authors freely acknowledge the limitations of the research: the speech they deciphered was restricted to a set of 30-50 sentences. They add, however, that the brain-machine interface learns to identify individual words, not just whole sentences.