How close are we to reading minds? A new study decodes language and meaning from brain scans

  • Written by Christina Maher, Computational Neuroscientist and Biomedical Engineer, University of Sydney

The technology to decode our thoughts is drawing ever closer. Neuroscientists at the University of Texas have for the first time decoded data from non-invasive brain scans and used them to reconstruct language and meaning from stories that people hear, see or even imagine.

In a new study published in Nature Neuroscience[1], Alexander Huth and colleagues successfully recovered the gist of language and sometimes exact phrases from functional magnetic resonance imaging[2] (fMRI) brain recordings of three participants.

Technology that can create language from brain signals could be enormously useful for people who cannot speak due to conditions such as motor neurone disease[3]. At the same time, it raises concerns for the future privacy of our thoughts.

Language decoded

Language decoding models[4], also called “speech decoders”, aim to use recordings of a person’s brain activity to discover the words they hear, imagine or say.

Until now, speech decoders have only been used with data from devices surgically implanted in the brain, which limits their usefulness. Other decoders using non-invasive recordings of brain activity have been able to decode single words or short phrases, but not continuous language.

Read more: We've been connecting brains to computers longer than you’d expect. These 3 companies are leading the way[5]

The new research used the blood oxygen level dependent signal[6] from fMRI scans, which shows changes in blood flow and oxygenation levels in different parts of the brain. By focusing on patterns of activity in brain regions and networks that process language, the researchers found their decoder could be trained to reconstruct continuous language (including some specific words and the general meaning of sentences).

Specifically, the decoder took the brain responses of three participants as they listened to stories, and generated sequences of words that were likely to have produced those brain responses. These word sequences did well at capturing the general gist of the stories, and in some cases included exact words and phrases.

The researchers also had the participants watch silent movies and imagine stories while being scanned. In both cases, the decoder often managed to predict the gist of the stories.

For example, one user thought “I don’t have my driver’s licence yet”, and the decoder predicted “she has not even started to learn to drive yet”.

Further, when participants actively listened to one story while ignoring another story played simultaneously, the decoder could identify the meaning of the story being actively listened to.

How does it work?

The researchers started out by having each participant lie inside an fMRI scanner and listen to 16 hours of narrated stories while their brain responses were recorded.

These brain responses were then used to train an encoder[7] – a computational model that tries to predict how the brain will respond to words a user hears. After training, the encoder could quite accurately predict how each participant’s brain signals would respond to hearing a given string of words.
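As a rough illustration of what such an encoder involves, the sketch below fits a regularised linear model that maps semantic features of heard words onto the fMRI response they evoke. The array sizes, random data and the choice of ridge regression are illustrative assumptions for this sketch, not the study's actual pipeline.

```python
# A minimal sketch of the encoder idea: a linear (ridge) regression mapping
# semantic features of heard words to the fMRI response they evoke.
# Shapes and data are illustrative placeholders, not the study's pipeline.
import numpy as np
from sklearn.linear_model import Ridge

n_timepoints, n_features, n_voxels = 1000, 768, 5000  # assumed sizes

# X: one semantic-feature vector per fMRI timepoint (e.g. from a language model)
# Y: the recorded BOLD response at each voxel for the same timepoints
rng = np.random.default_rng(0)
X = rng.standard_normal((n_timepoints, n_features))
Y = rng.standard_normal((n_timepoints, n_voxels))

encoder = Ridge(alpha=1.0)   # regularised linear map: word features -> voxels
encoder.fit(X, Y)            # "training" stands in for the 16 hours of story data

# After training, the encoder predicts brain responses to new word features
predicted_bold = encoder.predict(X[:10])   # shape: (10, n_voxels)
```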

However, going in the opposite direction – from recorded brain responses to words – is trickier.

The encoder model is designed to link brain responses with “semantic features” or the broad meanings of words and sentences. To do this, the system uses the original GPT language model[8], which is the precursor of today’s GPT-4 model. The decoder then generates sequences of words that might have produced the observed brain responses.
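To give a sense of what "semantic features" can look like in practice, the sketch below pulls hidden-state vectors from a GPT-style language model. GPT-2 is used here purely as a freely available stand-in for the original GPT, and the averaging step is an illustrative choice rather than the paper's exact recipe.

```python
# A sketch of extracting "semantic features" for a phrase from a GPT-style
# language model (GPT-2 as a stand-in for the original GPT used in the study).
import torch
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

phrase = "I don't have my driver's licence yet"
inputs = tokenizer(phrase, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # (1, n_tokens, 768)

semantic_features = hidden.mean(dim=1)           # one pooled vector per phrase
print(semantic_features.shape)                   # torch.Size([1, 768])
```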

Stills from an animated film alongside descriptions of the action decoded from fMRI scans: the decoder could also describe the action when participants watched silent movies. Tang et al. / Nature Neuroscience[9]

The accuracy of each “guess” is then checked by using it to predict brain activity, and comparing that prediction with the activity actually recorded.

During this resource-intensive process, multiple guesses are generated at a time, and ranked in order of accuracy. Poor guesses are discarded and good ones kept. The process continues by guessing the next word in the sequence, and so on until the most accurate sequence is determined.
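This guess-and-check loop resembles a beam search, which the sketch below illustrates: candidate word sequences are extended one word at a time, scored by how well an encoder's predicted brain activity matches the recording, and only the best few are kept. Every function and data array here is a hypothetical stand-in for the real language model, encoder and fMRI data.

```python
# A beam-search-style sketch of the guess-and-check loop described above.
# All functions and data below are hypothetical stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)
recorded_bold = rng.standard_normal(5000)       # one fMRI volume, flattened

def propose_next_words(sequence):
    # Stand-in for a language model proposing likely continuations.
    return ["drive", "car", "licence", "walk"]

def encode(sequence):
    # Stand-in for the trained encoder: word sequence -> predicted brain activity.
    seed = abs(hash(" ".join(sequence))) % (2**32)
    return np.random.default_rng(seed).standard_normal(5000)

def score(sequence):
    # Correlation between predicted and recorded activity: higher is better.
    return float(np.corrcoef(encode(sequence), recorded_bold)[0, 1])

beam = [["she"]]                                # seed guess
beam_width, n_steps = 3, 5

for _ in range(n_steps):
    # Extend every kept guess by one word, rank by accuracy, keep the best few.
    candidates = [seq + [w] for seq in beam for w in propose_next_words(seq)]
    candidates.sort(key=score, reverse=True)
    beam = candidates[:beam_width]

print(" ".join(beam[0]))                        # most plausible word sequence
```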

Words and meanings

The study found that data from multiple specific brain regions – including the speech network, the parietal-temporal-occipital association region, and the prefrontal cortex – were needed for the most accurate predictions.

One key difference between this work and earlier efforts is the data being decoded. Most decoding systems link brain data to motor features or activity recorded from brain regions involved in the last step of speech output, the movement of the mouth and tongue. This decoder works instead at the level of ideas and meanings.

One limitation of using fMRI data is its low “temporal resolution”. The blood oxygen level dependent signal rises and falls over approximately a 10-second period, during which time a person might have heard 20 or more words. As a result, this technique cannot detect individual words, but only the potential meanings of sequences of words.

No need for privacy panic (yet)

The idea of technology that can “read minds” raises concerns over mental privacy. The researchers conducted additional experiments to address some of these concerns.

These experiments showed we don’t need to worry just yet about having our thoughts decoded while we walk down the street, or indeed without our extensive cooperation.

A decoder trained on one person’s thoughts performed poorly when predicting the semantic detail from another participant’s data. What’s more, participants could disrupt the decoding by diverting their attention to a different task such as naming animals or telling a different story.

Read more: Our neurodata can reveal our most private selves. As brain implants become common, how will it be protected?[10]

Movement in the scanner can also disrupt the decoder as fMRI is highly sensitive to motion, so participant cooperation is essential. Considering these requirements, and the need for high-powered computational resources, it is highly unlikely that someone’s thoughts could be decoded against their will at this stage.

Finally, the decoder does not currently work on data other than fMRI, which is an expensive and often impractical procedure. The group plans to test their approach on other non-invasive brain data in the future.

Read more https://theconversation.com/how-close-are-we-to-reading-minds-a-new-study-decodes-language-and-meaning-from-brain-scans-204691
