Humanising AI could lead us to dehumanise ourselves

  • Written by Raffaele F Ciriello, Senior Lecturer in Business Information Systems, University of Sydney

Irish writer John Connolly once said[1]:

The nature of humanity, its essence, is to feel another’s pain as one’s own, and to act to take that pain away.

For most of our history, we believed empathy was a uniquely human trait – a special ability that set us apart from machines and other animals. But this belief is now being challenged.

As AI becomes a bigger part of our lives, entering even our most intimate spheres, we’re faced with a philosophical conundrum: could attributing human qualities to AI diminish our own human essence? Our research[2] suggests it can.

Digitising companionship

In recent years, AI “companion” apps such as Replika have attracted millions of users. Replika allows users to create custom digital partners to engage in intimate conversations. Members who pay for Replika Pro[3] can even turn their AI into a “romantic partner”.

Physical AI companions aren’t far behind. Companies such as JoyLoveDolls are selling interactive sex robots[4] with customisable features including breast size, ethnicity, movement and AI responses such as moaning and flirting.

While this is currently a niche market, history suggests today’s digital trends will become tomorrow’s global norms. With about one in four[5] adults experiencing loneliness, the demand for AI companions will grow.

The dangers of humanising AI

Humans have long attributed human traits to non-human entities – a tendency known as anthropomorphism. It’s no surprise we’re doing this with AI tools such as ChatGPT, which appear to “think” and “feel”. But why is humanising AI a problem?

For one thing, it allows AI companies to exploit our tendency to form attachments with human-like entities. Replika is marketed[6] as “the AI companion who cares”. However, to avoid legal issues, the company elsewhere points out Replika isn’t sentient and merely learns through millions of user interactions.

Screenshot of contradictory information on Replika’s help page versus its advertising.

Some AI companies overtly claim[7] their AI assistants have empathy and can even anticipate human needs. Such claims are misleading and can take advantage of people seeking companionship. Users may become deeply emotionally invested[8] if they believe their AI companion truly understands them.

This raises serious ethical concerns. A user will hesitate[9] to delete (that is, to “abandon” or “kill”) their AI companion once they’ve ascribed some kind of sentience to it.

But what happens when said companion unexpectedly disappears, such as if the user can no longer afford it, or if the company that runs it shuts down? While the companion may not be real, the feelings attached to it are.

Empathy – more than a programmable output

By reducing empathy to a programmable output, do we risk diminishing its true essence? To answer this, let’s first think about what empathy really is.

Empathy involves responding to other people with understanding and concern. It’s when you share your friend’s sorrow as they tell you about their heartache, or when you feel joy radiating from someone you care about. It’s a profound experience – rich and beyond simple forms of measurement.

A fundamental difference between humans and AI is that humans genuinely feel emotions, while AI can only simulate them. This touches on the hard problem of consciousness[10], which questions how subjective human experiences arise from physical processes in the brain.

Science has yet to solve the hard problem of consciousness. Shutterstock

While AI can simulate understanding, any “empathy” it purports to have is a result of programming that mimics empathetic language patterns. Unfortunately, AI providers have a financial incentive to trick users into growing attached to their seemingly empathetic products.

The dehumanAIsation hypothesis

Our “dehumanAIsation hypothesis” highlights the ethical concerns that come with trying to reduce humans to some basic functions that can be replicated by a machine. The more we humanise AI, the more we risk dehumanising ourselves.

For instance, depending on AI for emotional labour could make us less tolerant of the imperfections of real relationships. This could weaken our social bonds and even lead to emotional deskilling. Future generations may become less empathetic – losing their grasp on essential human qualities as emotional skills continue to be commodified and automated.

Also, as AI companions become more common, people may use them to replace real human relationships. This would likely increase loneliness and alienation – the very problems these systems claim to address.

AI companies’ collection and analysis of emotional data also poses significant risks, as these data could be used to manipulate users and maximise profit. This would further erode our privacy and autonomy, taking surveillance capitalism[11] to the next level.

Holding providers accountable

Regulators need to do more to hold AI providers accountable. AI companies should be honest about what their AI can and can’t do, especially when they risk exploiting users’ emotional vulnerabilities.

Exaggerated claims of “genuine empathy” should be made illegal. Companies making such claims should be fined – and repeat offenders shut down.

Data privacy policies should also be clear, fair and without hidden terms that allow companies to exploit user-generated content.

We must preserve the unique qualities that define the human experience. While AI can enhance certain aspects of life, it can’t – and shouldn’t – replace genuine human connection.

References

  1. ^ once said (www.goodreads.com)
  2. ^ research (www.researchgate.net)
  3. ^ Replika Pro (help.replika.com)
  4. ^ interactive sex robots (www.joylovedolls.com)
  5. ^ one in four (www.statista.com)
  6. ^ marketed (replika.com)
  7. ^ claim (www.space.gov.au)
  8. ^ deeply emotionally invested (theconversation.com)
  9. ^ will hesitate (www.researchgate.net)
  10. ^ hard problem of consciousness (www.researchgate.net)
  11. ^ surveillance capitalism (theconversation.com)

Read more https://theconversation.com/humanising-ai-could-lead-us-to-dehumanise-ourselves-240803
