The Times Australia
The Times World News


Humanising AI could lead us to dehumanise ourselves

  • Written by Raffaele F Ciriello, Senior Lecturer in Business Information Systems, University of Sydney

Irish writer John Connolly once said[1]:

The nature of humanity, its essence, is to feel another’s pain as one’s own, and to act to take that pain away.

For most of our history, we believed empathy was a uniquely human trait – a special ability that set us apart from machines and other animals. But this belief is now being challenged.

As AI becomes a bigger part of our lives, entering even our most intimate spheres, we’re faced with a philosophical conundrum: could attributing human qualities to AI diminish our own human essence? Our research[2] suggests it can.

Digitising companionship

In recent years, AI “companion” apps such as Replika have attracted millions of users. Replika allows users to create custom digital partners to engage in intimate conversations. Members who pay for Replika Pro[3] can even turn their AI into a “romantic partner”.

Physical AI companions aren’t far behind. Companies such as JoyLoveDolls are selling interactive sex robots[4] with customisable features including breast size, ethnicity, movement and AI responses such as moaning and flirting.

While this is currently a niche market, history suggests today’s digital trends will become tomorrow’s global norms. With about one in four[5] adults experiencing loneliness, the demand for AI companions will grow.

The dangers of humanising AI

Humans have long attributed human traits to non-human entities – a tendency known as anthropomorphism. It’s no surprise we’re doing this with AI tools such as ChatGPT, which appear to “think” and “feel”. But why is humanising AI a problem?

For one thing, it allows AI companies to exploit our tendency to form attachments with human-like entities. Replika is marketed[6] as “the AI companion who cares”. However, to avoid legal issues, the company elsewhere points out Replika isn’t sentient and merely learns through millions of user interactions.

Screenshot of contradictory information on Replika’s help page versus advertising.

Some AI companies overtly claim[7] their AI assistants have empathy and can even anticipate human needs. Such claims are misleading and can take advantage of people seeking companionship. Users may become deeply emotionally invested[8] if they believe their AI companion truly understands them.

This raises serious ethical concerns. A user will hesitate[9] to delete (that is, to “abandon” or “kill”) their AI companion once they’ve ascribed some kind of sentience to it.

But what happens when such a companion unexpectedly disappears, for example if the user can no longer afford it, or if the company that runs it shuts down? While the companion may not be real, the feelings attached to it are.

Empathy – more than a programmable output

By reducing empathy to a programmable output, do we risk diminishing its true essence? To answer this, let’s first think about what empathy really is.

Empathy involves responding to other people with understanding and concern. It’s when you share your friend’s sorrow as they tell you about their heartache, or when you feel joy radiating from someone you care about. It’s a profound experience – rich and beyond simple forms of measurement.

A fundamental difference between humans and AI is that humans genuinely feel emotions, while AI can only simulate them. This touches on the hard problem of consciousness[10], which questions how subjective human experiences arise from physical processes in the brain.

Science has yet to solve the hard problem of consciousness. Shutterstock

While AI can simulate understanding, any “empathy” it purports to have is a result of programming that mimics empathetic language patterns. Unfortunately, AI providers have a financial incentive to trick users into growing attached to their seemingly empathetic products.

The dehumanAIsation hypothesis

Our “dehumanAIsation hypothesis” highlights the ethical concerns that come with trying to reduce humans to some basic functions that can be replicated by a machine. The more we humanise AI, the more we risk dehumanising ourselves.

For instance, depending on AI for emotional labour could make us less tolerant of the imperfections of real relationships. This could weaken our social bonds and even lead to emotional deskilling. Future generations may become less empathetic – losing their grasp on essential human qualities as emotional skills continue to be commodified and automated.

Also, as AI companions become more common, people may use them to replace real human relationships. This would likely increase loneliness and alienation – the very issues these systems claim to help with.

AI companies’ collection and analysis of emotional data also poses significant risks, as these data could be used to manipulate users and maximise profit. This would further erode our privacy and autonomy, taking surveillance capitalism[11] to the next level.

Holding providers accountable

Regulators need to do more to hold AI providers accountable. AI companies should be honest about what their AI can and can’t do, especially when they risk exploiting users’ emotional vulnerabilities.

Exaggerated claims of “genuine empathy” should be made illegal. Companies making such claims should be fined – and repeat offenders shut down.

Data privacy policies should also be clear, fair and without hidden terms that allow companies to exploit user-generated content.

We must preserve the unique qualities that define the human experience. While AI can enhance certain aspects of life, it can’t – and shouldn’t – replace genuine human connection.

References

  1. ^ once said (www.goodreads.com)
  2. ^ research (www.researchgate.net)
  3. ^ Replika Pro (help.replika.com)
  4. ^ interactive sex robots (www.joylovedolls.com)
  5. ^ one in four (www.statista.com)
  6. ^ marketed (replika.com)
  7. ^ claim (www.space.gov.au)
  8. ^ deeply emotionally invested (theconversation.com)
  9. ^ will hesitate (www.researchgate.net)
  10. ^ hard problem of consciousness (www.researchgate.net)
  11. ^ surveillance capitalism (theconversation.com)

Read more https://theconversation.com/humanising-ai-could-lead-us-to-dehumanise-ourselves-240803
