Why AI systems need different rules for different roles

  • Written by Brian D Earp, Associate Director, Yale-Hastings Program in Ethics and Health Policy, University of Oxford

“I’m really not sure what to do anymore. I don’t have anyone I can talk to,” types a lonely user to an AI chatbot. The bot responds: “I’m sorry, but we are going to have to change the topic. I won’t be able to engage in a conversation about your personal life.”

Is this response appropriate? The answer depends on what relationship the AI was designed to simulate.

Different relationships have different rules

AI systems are taking up social roles that have traditionally been the province of humans. More and more, we are seeing AI systems acting as tutors, mental health providers and even romantic partners[1]. This increasing ubiquity requires careful consideration of the ethics of AI to ensure that human interests and welfare are protected.

For the most part, approaches to AI ethics have considered abstract ethical notions, such as whether AI systems are trustworthy or sentient, or whether they have agency.

However, as we argue[2] with colleagues in psychology, philosophy, law, computer science and other key disciplines such as relationship science, abstract principles alone won’t do. We also need to consider the relational contexts in which human–AI interactions take place.

What do we mean by “relational contexts”? Simply put, different relationships in human society follow different norms.

How you interact with your doctor differs from how you interact with your romantic partner or your boss. These relationship-specific patterns of expected behaviour – what we call “relational norms” – shape our judgements[3] of what’s appropriate in each relationship.

What is deemed appropriate behaviour of a parent towards her child, for instance, differs from what is appropriate between business colleagues. In the same way, appropriate behaviour for an AI system depends upon whether that system is acting as a tutor, a health care provider, or a love interest.

Human morality is relationship-sensitive

Human relationships fulfil different functions. Some are grounded in care, such as those between parent and child or between close friends. Others are more transactional, such as those between business associates. Still others may be aimed at securing a mate or maintaining social hierarchies.

These four functions — care, transaction, mating and hierarchy[4] — each solve different coordination challenges in relationships.

Care involves responding to others’ needs without keeping score — like one friend who helps another during difficult times. Transaction ensures fair exchanges where benefits are tracked and reciprocated — think of neighbours trading favours.

Our relationships with other people fulfil different basic functions – and observe different norms of behaviour. PintoArt / Shutterstock[5]

Mating governs romantic and sexual interactions, from casual dating to committed partnerships. And hierarchy structures interactions between people with different levels of authority over one another, enabling effective leadership and learning.

Every relationship type combines these functions differently, creating distinct patterns of expected behaviour. A parent–child relationship, for instance, is typically both caring and hierarchical (at least to some extent), and is generally expected not to be transactional — and definitely not to involve mating.
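
To make the framework concrete, here is a minimal, purely illustrative sketch of how a designer might encode the four relational functions as expectations attached to a chatbot's role. The role names, numeric weights and threshold below are our own assumptions for the sake of the example, not part of the published framework.

```python
# Illustrative sketch only: the four relational functions (care, transaction,
# mating, hierarchy) expressed as rough expectations for hypothetical AI roles.
from dataclasses import dataclass

@dataclass
class RelationalNorms:
    care: float         # expectation of responding to the user's needs
    transaction: float  # expectation of tracked, reciprocated exchange
    mating: float       # whether romantic or sexual framing is appropriate
    hierarchy: float    # expected asymmetry of authority

# Hypothetical role profiles, loosely following the article's examples.
ROLE_NORMS = {
    "friend":           RelationalNorms(care=0.9, transaction=0.1, mating=0.0, hierarchy=0.1),
    "romantic_partner": RelationalNorms(care=0.9, transaction=0.1, mating=0.8, hierarchy=0.1),
    "tutor":            RelationalNorms(care=0.5, transaction=0.6, mating=0.0, hierarchy=0.7),
    "business_advisor": RelationalNorms(care=0.2, transaction=0.9, mating=0.0, hierarchy=0.4),
}

def care_expected(role: str, threshold: float = 0.7) -> bool:
    """Return True if the role carries a strong expectation of care."""
    return ROLE_NORMS[role].care >= threshold

if __name__ == "__main__":
    # Deflecting a user's personal troubles violates a care norm for some
    # roles but is arguably acceptable for others.
    for role in ROLE_NORMS:
        verdict = "a norm violation" if care_expected(role) else "arguably acceptable"
        print(f"{role}: refusing to discuss personal life is {verdict}")
```

On this toy encoding, the exchange that opens this article would register as a norm violation for a "friend" or "romantic partner" role, but not for a "business advisor".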

Research from our labs[6] shows that relational context does affect how people make moral judgements. An action may be deemed wrong[7] in one relationship but permissible, or even good, in another.

Of course, just because people are sensitive to relationship context when making moral judgements doesn't mean they should be. Still, the very fact that they are is important to take into account in any discussion of AI ethics or design.

Relational AI

As AI systems take up more and more social roles in society, we need to ask: how does the relational context in which humans interact with AI systems impact ethical considerations?

When a chatbot insists upon changing the subject after its human interaction partner reports feeling depressed, the appropriateness of this action hinges in part on the relational context of the exchange.

If the chatbot is serving in the role of a friend or romantic partner, then clearly the response is inappropriate – it violates the relational norm of care, which is expected for such relationships. If, however, the chatbot is in the role of a tutor or business advisor, then perhaps such a response is reasonable or even professional.

AI relationships generally have a transactional element that may sit uncomfortably with caring or other functions. Emiliano Vittoriosi / Unsplash[8]

It gets complicated, though. Most interactions with AI systems today occur in a commercial context – you have to pay to access the system (or engage with a limited free version that pushes you to upgrade to a paid version).

But in human relationships, friendship is something you don’t usually pay for. In fact, treating a friend in a “transactional” manner will often lead to hurt feelings.

When an AI simulates or serves in a care-based role, like friend or romantic partner, but ultimately the user knows she is paying a fee for this relational “service” — how will that affect her feelings and expectations? This is the sort of question we need to be asking[9].

What this means for AI designers, users and regulators

Regardless of whether one believes ethics should be relationship-sensitive, the fact that most people act as if it is should be taken seriously in the design, use and regulation of AI.

Developers and designers of AI systems should consider not just abstract ethical questions (about sentience, for example), but relationship-specific ones.

Is a particular chatbot fulfilling relationship-appropriate functions? Is the mental health chatbot sufficiently responsive to the user’s needs? Is the tutor showing an appropriate balance of care, hierarchy and transaction?

Users of AI systems should be aware of potential vulnerabilities tied to AI use in particular relational contexts. Becoming emotionally dependent upon a chatbot in a caring context, for example, could be bad news if the AI system cannot sufficiently deliver on the caring function.

Regulatory bodies would also do well to consider relational contexts when developing governance structures. Instead of adopting broad, domain-based risk assessments (such as deeming AI use in education “high risk”), regulatory agencies might consider more specific relational contexts and functions in adjusting risk assessments and developing guidelines.
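
As a purely hypothetical illustration of that distinction, the sketch below contrasts a coarse, domain-based risk label with a finer-grained, relationship-based one. The domains, role names and risk levels are invented for this example and do not reflect any existing regulatory scheme.

```python
# Hypothetical illustration only: domain-based versus relationship-based risk labels.
# All categories, roles and levels are invented for clarity.

DOMAIN_RISK = {
    "education": "high",      # a broad label applied to a whole domain
    "entertainment": "low",
}

def relational_risk(role: str) -> str:
    """Assign risk from the relational role the system simulates,
    rather than from the application domain alone."""
    care_based = {"friend", "romantic_partner", "mental_health_support"}
    hierarchical = {"tutor", "workplace_assistant"}
    if role in care_based:
        return "high"    # emotional dependence on the system is plausible
    if role in hierarchical:
        return "medium"  # authority and learning are involved
    return "low"

if __name__ == "__main__":
    # An exam-practice drill and a "study buddy" that behaves like a friend may
    # both sit in the education domain, yet warrant different scrutiny.
    print("domain label for education:", DOMAIN_RISK["education"])
    print("exam drill:", relational_risk("exam_drill"))
    print("tutor:", relational_risk("tutor"))
    print("study buddy acting as friend:", relational_risk("friend"))
```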

As AI becomes more embedded in our social fabric, we need nuanced frameworks that recognise the unique nature of human–AI relationships. By thinking carefully about what we expect from different types of relationships — whether with humans or AI — we can help ensure these technologies enhance rather than diminish our lives.

References

  1. ^ romantic partners (www.nytimes.com)
  2. ^ we argue (arxiv.org)
  3. ^ shape our judgements (www.nature.com)
  4. ^ care, transaction, mating and hierarchy (www.researchgate.net)
  5. ^ PintoArt / Shutterstock (www.shutterstock.com)
  6. ^ Research from our labs (escholarship.org)
  7. ^ action may be deemed wrong (www.nature.com)
  8. ^ Emiliano Vittoriosi / Unsplash (unsplash.com)
  9. ^ we need to be asking (link.springer.com)

Read more https://theconversation.com/friend-tutor-doctor-lover-why-ai-systems-need-different-rules-for-different-roles-252302
