The Times Australia
Our neurodata can reveal our most private selves. As brain implants become common, how will it be protected?

  • Written by Christina Maher, Researcher, University of Sydney

“Hello world!”

In December 2021, these were the first words tweeted[1] by a paralysed man using only his thoughts and a brain-computer interface (BCI) implanted by the company Synchron.

For millions living with paralysis, epilepsy and neuromuscular conditions, BCIs offer restored movement and, more recently, thought-to-text capabilities.

So far, few invasive (implanted) versions of the technology have been commercialised[2]. But a number of companies are determined to change this.

Synchron is joined by Elon Musk’s Neuralink, which has documented a monkey playing the computer game Pong[3] using its BCI – as well as the newer Precision Neuroscience[4], which recently raised[5] US$41 million towards building a reversible implant thinner than a human hair.

Eventually, BCIs will allow people to carry out a range of tasks using their thoughts. But is this terrific, or terrifying?

How do BCIs work?

BCIs can be non-invasive (wearable) or invasive (implanted). Electrical activity is the most commonly captured “neurodata”, with invasive BCIs providing better signal quality than non-invasive ones.

The functionality of most BCIs can be summarised as passive, active and reactive. All BCIs use signal processing[6] to filter brain signals. After processing, active and reactive BCIs can return outputs in response to a user’s voluntary brain activity.

The signal recorded at any one site is a mixture of many tiny signals arising from multiple brain regions. So BCIs use pattern recognition algorithms[7] to decipher a signal’s likely origins and link it to an intentional event, such as a task or thought.
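The processing pipeline described above can be sketched in miniature. The snippet below is a toy illustration only, not any company’s actual method: it fabricates two kinds of synthetic “neurodata” trials, extracts frequency-band power features, and uses a simple nearest-centroid classifier to map a signal to an assumed intent label (“rest” vs “move” are hypothetical labels chosen for the example).

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250  # sampling rate in Hz, typical for EEG-style recordings

def make_trial(freq, n=fs):
    """Synthetic one-second trial: an oscillation at `freq` Hz plus noise."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)

def band_power(x, lo, hi):
    """Average spectral power of x within the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def features(x):
    # Two hand-picked features: alpha-band (8-12 Hz) and beta-band (20-30 Hz) power
    return np.array([band_power(x, 8, 12), band_power(x, 20, 30)])

# "Training data": trials labelled with the intentional event they accompany
rest = [features(make_trial(10)) for _ in range(20)]   # alpha-dominated trials
move = [features(make_trial(25)) for _ in range(20)]   # beta-dominated trials
centroids = {"rest": np.mean(rest, axis=0), "move": np.mean(move, axis=0)}

def decode(x):
    """Nearest-centroid 'pattern recognition': link a signal to an intent label."""
    f = features(x)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

print(decode(make_trial(10)))  # rest
print(decode(make_trial(25)))  # move
```

Real BCI decoders work on far messier signals and far larger feature spaces, but the basic shape is the same: filter, extract features, then match patterns against labelled examples.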

One of the first implanted BCIs[8] was used to treat drug-resistant seizures, which affect many of the roughly 50 million people living with epilepsy. And ongoing clinical trials signal[9] a new era for people with neurological and physical impairments.

Outside the clinical realm, however, neurodata exist in a largely unregulated space.

Read more: Elon Musk claims his Neuralink brain chip could 'cure' tinnitus in 5 years. But don't hold your breath[10]

An unknown middleman

In human interaction, thoughts are interpreted by the person experiencing and communicating them, and separately by the person receiving the communication. In this sense, allowing algorithms to interpret our thoughts could be likened to another entity “speaking” for us.

This could raise issues in a future where thought-to-text is widespread. For example, a BCI may generate the output “I’m good”, when the user intended it to be “I’m great”. These are similar, but they aren’t the same. It’s easy enough for an able-bodied person to physically correct the mistake – but for people who can only communicate through BCIs, there’s a risk of being misinterpreted.

Moreover, implanted BCIs can provide rich access to all brain signals; there is no option to pick and choose which signals are shared.

Brain data are arguably our most private data because of what can be inferred regarding our identity and mental state. Yet private BCI companies may not need to inform users[11] about what data are used to train algorithms, or how the data are linked to interpretations that lead to outputs.

In Australia, strict data storage rules[12] require that all BCI-related patient data are stored on secure servers in a de-identified form, which helps protect patient privacy. But requirements outside of a research context are unclear.
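At its simplest, de-identification means stripping direct identifiers and replacing them with a pseudonym before storage. The sketch below is a minimal illustration with hypothetical field names, not the NHMRC’s actual requirements, which go well beyond this; the salted one-way hash prevents someone from re-identifying a subject just by hashing guessed names.

```python
import hashlib
import secrets

# Hypothetical record as a BCI trial might produce it (field names are
# illustrative, not any real system's schema).
record = {"patient_name": "Jane Citizen", "dob": "1984-07-02",
          "signal_summary": [0.12, 0.07, 0.31]}

# A random salt, stored separately from the data itself, so hashed
# identifiers cannot be reversed by brute-forcing common names.
SALT = secrets.token_hex(16)

def de_identify(rec):
    """Replace direct identifiers with a salted one-way pseudonym."""
    raw = (SALT + rec["patient_name"] + rec["dob"]).encode()
    pseudonym = hashlib.sha256(raw).hexdigest()[:12]
    return {"subject_id": pseudonym, "signal_summary": rec["signal_summary"]}

safe = de_identify(record)
print(sorted(safe))  # only a pseudonymous ID and the signal data remain
```

The stored record keeps its scientific value (the signal summary) while the name and date of birth never reach the server.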

What’s at risk if neurodata aren’t protected?

BCIs are unlikely to launch us into a dystopian world – in part due to current computational constraints. After all, there’s a leap between a BCI sending a short text and interpreting one’s entire stream of consciousness.

That said, making this leap largely comes down to how well we can train algorithms, which requires more data and computing power. The rise of quantum computing[13] – whenever that may be – could provide these additional computational resources.

Current BCIs aren’t advanced enough to quickly and reliably interpret a stream of thoughts — but a growth in computational power may allow this in the future. Shutterstock

Cathy O'Neil’s 2016 book, Weapons of Math Destruction[14], highlights how algorithms that measure complex concepts such as human qualities could let predatory entities make important decisions for the most vulnerable people.

Here are some hypothetical worst-case scenarios.

  1. Third-party companies might buy neurodata from BCI companies and use it to make decisions, such as whether someone is granted a loan or access to health care.

  2. Courts might be allowed to order neuromonitoring[15] of individuals with the potential to commit crimes, based on their previous history or socio-demographic environment.

  3. BCIs specialised for “neuroenhancement” could be made a condition of employment, such as in the military[16]. This would blur the boundaries between human reasoning and algorithmic influence.

  4. As with all industries where data privacy is critical, there is a genuine risk of neurodata hacking, where cybercriminals access and exploit brain data.

Then there are subtler examples, including the potential for bias. In the future, bias may be introduced into BCI technologies in a number of ways, including through:

  • the selection of homogeneous training data

  • a lack of diversity among clinical trial participants (especially in control groups)

  • a lack of diversity in the teams that design the algorithms and software.

If BCIs are to cater to diverse users, then diversity will need to be factored into every stage of development.

How can we protect neurodata?

The vision for “neurorights[17]” is an evolving space. The ethical challenges lie in the balance between choosing what is best for individuals and what is best for society at large.

For instance, should individuals in the military be equipped with neuroenhancing devices so they can better serve their country and protect themselves on the front lines, or would that compromise their individual identity and privacy? And which legislation should capture neurorights: data protection law, health law, consumer law, or criminal law?

In a world first, Chile[18] passed a neurorights law in 2021 to protect mental privacy by explicitly classifying mental data and brain activity as a human right to be legally protected. Though a step in the right direction, it remains unclear how such a law will be enforced.

One US-based patient group is taking matters into its own hands. The BCI Pioneers[19] is an advocacy group ensuring the conversation around neuroethics is patient-led.

Other efforts include the Neurorights Foundation[20], and the proposal of a “technocratic oath[21]” modelled on the Hippocratic oath taken by medical doctors. An International Organisation for Standardisation committee[22] is also developing BCI standards.

Read more: Neuralink put a chip in Gertrude the pig's brain. It might be useful one day[23]

References

  1. ^ first words tweeted (www.businesswire.com)
  2. ^ commercialised (www.neuropace.com)
  3. ^ monkey playing the computer game Pong (theconversation.com)
  4. ^ Precision Neuroscience (precisionneuro.io)
  5. ^ recently raised (www.globenewswire.com)
  6. ^ signal processing (theconversation.com)
  7. ^ pattern recognition algorithms (recfaces.com)
  8. ^ first implanted BCIs (www.neuropace.com)
  9. ^ signal (jamanetwork.com)
  10. ^ Elon Musk claims his Neuralink brain chip could 'cure' tinnitus in 5 years. But don't hold your breath (theconversation.com)
  11. ^ may not need to inform users (fpf.org)
  12. ^ data storage rules (www.nhmrc.gov.au)
  13. ^ quantum computing (www.ncbi.nlm.nih.gov)
  14. ^ Weapons of Math Destruction (blogs.scientificamerican.com)
  15. ^ order neuromonitoring (link.springer.com)
  16. ^ military (theconversation.com)
  17. ^ neurorights (www.frontiersin.org)
  18. ^ Chile (neurorightsfoundation.org)
  19. ^ BCI Pioneers (www.bcipioneers.org)
  20. ^ Neurorights Foundation (neurorightsfoundation.org)
  21. ^ technocratic oath (link.springer.com)
  22. ^ committee (www.iso.org)
  23. ^ Neuralink put a chip in Gertrude the pig's brain. It might be useful one day (theconversation.com)

Read more https://theconversation.com/our-neurodata-can-reveal-our-most-private-selves-as-brain-implants-become-common-how-will-it-be-protected-197047
