
Mind-reading gadget predicts what you want to say before you say it for the first time – and the breakthrough could help people with strokes and motor neuron disease communicate

  • Experts scanned participants' brains while they listened to and answered questions
  • They used the data to build a decoder that can interpret speech from brain activity
  • Both heard and actively produced speech could be decoded from the cortical recordings
  • One day the device could help people who cannot speak due to illness or injury

A new thought-reading gadget can interpret brain activity and predict what you want to say before you even say a word – and produce a transcript in real time.


Researchers used brain scans of people who listened to and then answered questions to develop a system to decode speech from the corresponding brain activity.

Once refined, the system – dubbed a neural decoder – could be used by people who cannot talk on their own because of illness or injury.


A mind-reading gadget that can interpret brain activity, predict what you want to say before you even say a word – and produce a transcript in real time


Neurosurgeon Edward Chang of the University of California, San Francisco, and colleagues recorded cortical activity in the brains of three patients who were undergoing treatment for epilepsy.

The test subjects each listened to a series of questions and responded aloud using a set of predetermined answers.

In this way, researchers were able to collect data on brain activity that corresponds to both perceived and produced speech.

This data was then used to train a system that can detect and decode speech from brain scans.
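At its core, a decoder of this kind is a classifier that maps patterns of cortical activity to the phrase being heard or spoken. The sketch below is a toy illustration of that idea using simulated features and an off-the-shelf classifier – it is not the authors' actual model, and every size, pattern and number in it is invented.

# Toy sketch (not the study's model): train a classifier that maps simulated
# cortical feature vectors to the identity of the phrase heard or spoken.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_features, n_phrases = 600, 128, 10          # hypothetical sizes

# Each phrase gets its own underlying activity pattern, plus trial-to-trial noise
phrase_patterns = rng.normal(size=(n_phrases, n_features))
labels = rng.integers(0, n_phrases, size=n_trials)
features = phrase_patterns[labels] + rng.normal(scale=2.0, size=(n_trials, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

# The 'decoder': predicts which phrase the recorded activity corresponds to
decoder = LogisticRegression(max_iter=1000)
decoder.fit(X_train, y_train)
print(f"Toy decoding accuracy: {decoder.score(X_test, y_test):.0%}")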

In a subsequent test, participants again listened to a series of questions and answered aloud with a response of their own choosing.


The researchers took further cortical scans during this process.

Based on this alone, the researchers were able to use the previously developed brain decoding model to not only detect when the participants were listening or speaking, but also to predict what was being heard or said.

In the end, Dr. Chang and his colleagues were able to decode both produced and heard speech, with 61 and 76 percent accuracy respectively.

The team also found they could improve the accuracy of the decoded answers by using their decoding of the preceding question as context – because certain answers were only valid in response to specific questions.
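The sketch below illustrates that context trick in the simplest terms: the probabilities an answer decoder assigns are re-weighted by how plausible each answer is given the question that was just decoded. The answers, probabilities and example question are invented for illustration and do not come from the study.

# Hedged illustration of using the decoded question as context.
# All answers, probabilities and the example question are invented.
import numpy as np

answers = ["cold", "hot", "fine", "bright", "dark"]

# Hypothetical decoder output for one trial: P(answer | brain activity)
p_answer_given_brain = np.array([0.30, 0.25, 0.20, 0.15, 0.10])

# Hypothetical context prior: P(answer | decoded question about room temperature).
# Answers that make no sense for that question get (near-)zero weight.
p_answer_given_question = np.array([0.45, 0.45, 0.10, 0.00, 0.00])

# Combine the two sources of evidence and renormalise
posterior = p_answer_given_brain * p_answer_given_question
posterior /= posterior.sum()

for word, p in zip(answers, posterior):
    print(f"{word:>6}: {p:.2f}")
print("Best guess:", answers[int(np.argmax(posterior))])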

This is not the first study to show that speech-related brain activity can be decoded from specific areas of the cortex – but it is the first to tackle the decoding of both listening and speaking tasks at the same time.


The researchers hope that, in the future, mind-reading devices based on their technology could be used to support communication by people who cannot speak due to illness or injury.

To do this, however, it must first be shown that the same principle can be applied to decode answers from purely imagined – rather than actually produced – speech.

The full findings of the study were published in the journal Nature Communications.

HUMAN BRAINS WILL CONNECT WITH COMPUTERS WITHIN DECADES

In a new article published in the journal Frontiers in Neuroscience, an international collaboration of researchers predicted that pioneering developments will bring 'human brain/cloud interfaces' within the coming decades.

Using a combination of nanotechnology, artificial intelligence and other more conventional computing, the researchers say people could seamlessly connect their brains to a cloud of computers to glean information from the internet in real time.


According to senior author Robert Freitas Jr., a fleet of nanobots embedded in our brains would act as a link between the human mind and supercomputers, making 'Matrix-style' downloading of information possible.

'These devices would navigate the human vasculature, cross the blood-brain barrier, and position themselves precisely among, or even within, brain cells,' Freitas explains.

'They would then wirelessly transmit encoded information to and from a cloud-based supercomputer network for real-time brain-state monitoring and data extraction.'

The interfaces would not stop at connecting people and computers, the researchers say. A network of brains could also help form what they call a 'global superbrain' that would make collective thinking possible.
