Scientists have developed a decoder that uses brain imaging and artificial intelligence to translate a person’s thoughts into language without them speaking, according to a study published Monday.
The main goal of this “language decoder” is to help patients who have lost the ability to speak communicate their thoughts via a computer. Although the device is intended for medical use, it raises questions about the violation of “mental privacy”, according to the authors of the study, whose results were published in the journal “Nature Neuroscience”.
To deflect such criticism, the researchers pointed out that their tool only works after being trained on a person’s brain activity over long hours in an MRI machine.
Previous brain-machine interfaces, devices intended to allow people with significant disabilities to regain some autonomy, have proven useful: one such interface was able to translate the sentences of a paralyzed person who could neither speak nor type on a keyboard.
But those devices are invasive, requiring electrodes implanted in the brain, and they focus only on the brain areas that control the mouth as it forms words.
“Our system works at the level of ideas, semantics and meaning,” said neuroscientist Alexander Huth of the University of Texas at Austin, a co-author of the study, during a press conference.
During the experiment, three people each spent 16 hours in a functional magnetic resonance imaging (fMRI) machine. This technology records differences in cerebral blood flow, revealing in real time the activity of brain regions during certain tasks (speech, movement, etc.).
The researchers played them podcasts of spoken stories, which allowed them to determine how words, sentences and their meanings stimulate different regions of the brain.
The study authors then fed that data into a neural network for language processing built on GPT-1, the artificial-intelligence predecessor of the popular ChatGPT bot.
The network was trained to predict how each brain would respond to the speech being heard. Each person then listened to a new story inside the fMRI machine, to test whether the network guessed correctly.
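To make the training step concrete, here is a minimal sketch of what such an “encoding model” can look like: a regularized linear regression that maps language-model features of the heard speech to the recorded brain signal. Everything below, the data, dimensions and variable names, is an illustrative assumption, not the study’s actual code.

```python
# Minimal sketch of an fMRI "encoding model": language-model features of
# the story a person hears are used to predict that person's brain
# responses. All data here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_timepoints = 500   # fMRI volumes recorded while stories play
n_features = 768     # dimensionality of GPT-style text embeddings (assumed)
n_voxels = 1000      # brain locations measured by the scanner

# Stand-ins for real data: embeddings of the heard speech, and the
# blood-flow signal they are assumed to drive (plus noise).
story_features = rng.standard_normal((n_timepoints, n_features))
true_weights = rng.standard_normal((n_features, n_voxels)) * 0.1
brain_response = story_features @ true_weights + rng.standard_normal((n_timepoints, n_voxels))

# One regularized linear model per voxel; in the real experiment this is
# fit on 16 hours' worth of story listening.
encoding_model = Ridge(alpha=10.0)
encoding_model.fit(story_features, brain_response)

# Given features of *new* speech, the model predicts how this specific
# brain should respond -- the test described in the article.
new_features = rng.standard_normal((50, n_features))
predicted_response = encoding_model.predict(new_features)
print(predicted_response.shape)  # (50, 1000)
```

Note that such a model is fit per person, which is consistent with the authors’ point that the decoder fails on a brain it was not trained on.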
“Deeper than language”
Despite often paraphrasing or changing the word order, the decoder was able to “reconstruct the meaning of what the person heard,” said Jerry Tang of the University of Texas at Austin, lead author of the study.
For example, when a participant heard “I don’t have a driver’s license yet,” the model replied, “She hasn’t even started learning to drive yet.” The experiment went further: even when participants imagined their own stories or watched silent films, the decoder was able to capture the essence of their thoughts.
These findings indicate that “we are decoding something deeper than language and then turning it into language,” said Alexander Huth.
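The article’s description suggests the decoder works by comparison rather than direct readout: it proposes candidate wordings, predicts the brain activity each one should evoke, and keeps whichever prediction best matches the recording, which is why paraphrases like the one above can emerge. The toy sketch below illustrates that scoring loop; the embedding function, candidates and weights are hypothetical stand-ins, not the authors’ method verbatim.

```python
# Toy sketch of decoding by candidate scoring: predict the brain
# response each candidate sentence should evoke, then keep the candidate
# whose prediction correlates best with the observed recording.
import numpy as np

rng = np.random.default_rng(1)
n_features, n_voxels = 768, 1000
weights = rng.standard_normal((n_features, n_voxels)) * 0.1  # "trained" encoding model

def embed(sentence: str) -> np.ndarray:
    # Stand-in for GPT-style features of a candidate sentence.
    state = abs(hash(sentence)) % (2**32)
    return np.random.default_rng(state).standard_normal(n_features)

def score(candidate: str, observed: np.ndarray) -> float:
    # How well does the predicted brain response match the recording?
    predicted = embed(candidate) @ weights
    return float(np.corrcoef(predicted, observed)[0, 1])

candidates = [
    "I don't have a driver's license yet",
    "She hasn't even started learning to drive yet",
    "The weather was cold that morning",
]

# Pretend the scanner recorded the (noisy) response to the first sentence.
observed = embed(candidates[0]) @ weights + rng.standard_normal(n_voxels) * 0.5

best = max(candidates, key=lambda c: score(c, observed))
print(best)
```

Because the match is made in this semantic feature space rather than word by word, a sentence with the same meaning but different wording can score nearly as well, hence the decoder’s paraphrased output.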
David Rodríguez-Arias Vailhen, a professor of bioethics at the University of Granada in Spain who was not involved in the study, considered these findings a real advance compared with previous brain-machine interfaces.
“These results bring us closer to a future in which machines will be able to read minds and write down thoughts,” Vailhen said. But he warned that this could happen against people’s will, for example while they are asleep, endangering our freedom in the future.
The study authors anticipated these dangers by showing that the decoder does not work on a brain it has not been trained on.
The three participants were also able to fool the machine easily: while listening to a podcast, they counted by sevens, named and imagined animals, or told a different story in their heads. All of these tactics “sabotaged” the decoder.
Even so, the study’s authors called for rules aimed at protecting privacy.