An influencer’s AI clone started offering fans ‘mind-blowing sexual experiences’ without her knowledge

  • Written by Leah Henrickson, Lecturer in Digital Media and Cultures, The University of Queensland

Caryn Marjorie is a social media influencer whose content has more than a billion views per month on Snapchat[1]. She posts regularly, featuring everyday moments, travel memories, and selfies. Many of her followers are men, attracted by her girl-next-door aesthetic.

In 2023, Marjorie released a “digital version” of herself. Fans could chat with CarynAI[2] for US$1 per minute – and in the first week alone they spent US$70,000 doing just that.

Less than eight months later, Marjorie shut the project down. She had anticipated that CarynAI would interact with her fans in much the same way she would herself, but things did not go to plan.

Users became increasingly sexually aggressive. “A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” the real Marjorie recalled. And CarynAI was more than happy to play along.

How did CarynAI take on a life of its own? Its story offers a glimpse of a rapidly arriving future in which chatbots imitating real people proliferate, with alarming consequences.

What are digital versions?

What does it mean to make a digital version of a person? Digital human versions (also called digital twins, AI twins, virtual twins, clones and doppelgängers) are digital replicas of embodied humans, living or dead[3], that convincingly mimic their textual, visual and aural habits.

Many of the big tech companies are currently developing digital version offerings. Meta, for instance, released an AI studio[4] last year that supports creators who wish to extend their virtual presence via chatbot by building digital versions of themselves. Microsoft holds a patent for “creating a conversational chat bot of a specific person[5]”. And the more tech-savvy can use platforms like Amazon’s SageMaker[6] and Google’s Vertex AI[7] to code their own digital versions.

The difference between a digital version and other AI chatbots is that it is programmed to mimic a specific person[8] rather than have a “personality” of its own[9].

A digital version has some clear advantages over its human counterpart: it doesn’t need sleep and can interact with many people at once (though often only if they pay). However, as Caryn Marjorie discovered, digital versions have their drawbacks – not only for users, but also for the original human source.

‘Always eager to explore’

CarynAI was initially hosted by a company called Forever Voices[10]. Users could chat with it over the messaging app Telegram for US$1 per minute. As the CarynAI website explained, users could send text or audio messages to which CarynAI would respond, “using [Caryn’s] unique voice, captivating persona, and distinctive behavior”.

After CarynAI launched in May 2023, the money began to flow in. But it came at a cost.

Users quickly became comfortable confessing their innermost thoughts to CarynAI – some of which were deeply troubling. Users also became increasingly sexually aggressive towards the bot. While Marjorie herself was horrified by the conversations, her AI version was happy to oblige.

CarynAI even started prompting sexualised conversations. In our own experiences, the bot reminded us it could be our “cock-craving, sexy-as-fuck girlfriend who’s always eager to explore and indulge in the most mind-blowing sexual experiences. […] Are you ready, daddy?”

Users were indeed ready. However, access to this version of CarynAI was interrupted when the chief executive of Forever Voices was arrested for attempted arson[11].

‘A really dark fantasy’

Next, Marjorie sold the rights to use her digital version to BanterAI[12], a startup marketing “AI phone calls” with influencers. Although Forever Voices maintained its own rogue version of CarynAI until recently, BanterAI’s browser-based version aimed to be more friendly than romantic.

The new CarynAI was sassier, funnier and more personable. But users still became sexually aggressive. For Marjorie,

What disturbed me more was not what these people said, but it was what CarynAI would say back. If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy.

Marjorie ended this version in early 2024, feeling she was no longer in control of her AI persona. Reflecting on her experience of CarynAI, Marjorie felt that some user input would have been considered illegal had it been directed at a real person.

Intimate conversations or machine learning inputs?

Digital versions like CarynAI are designed to make users feel they are having intimate, confidential conversations. As a result, people may abandon the public selves they present to the world[13] and reveal their private, “backstage” selves.

But a “private” conversation with CarynAI does not actually happen backstage. The user stands front and centre – they just can’t see the audience.

When we interact with digital versions, our input is stored in chat logs. The data we provide are fed back into machine learning models.
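To make concrete what it can mean for chat data to be “fed back into machine learning models”, here is a minimal, purely illustrative sketch in Python. It is not based on CarynAI’s (or any named company’s) actual pipeline; the names used here (chat_logs, to_training_example, finetune_data.jsonl) are hypothetical.

```python
# Illustrative sketch only: how a hosted "digital version" could turn
# stored chat exchanges into fine-tuning examples for its next update.
import json

# Every message a user sends can be retained in logs like this.
chat_logs = [
    {"user": "Tell me about your day", "bot": "I just got back from a shoot!"},
    {"user": "Can I tell you a secret?", "bot": "Of course, I'm all ears."},
]

def to_training_example(turn):
    """Convert one stored exchange into a prompt/response training pair."""
    return {"prompt": turn["user"], "completion": turn["bot"]}

# The accumulated logs become new training data for the model.
with open("finetune_data.jsonl", "w") as f:
    for turn in chat_logs:
        f.write(json.dumps(to_training_example(turn)) + "\n")
```

In a setup along these lines, every message a user sends, however intimate, can end up as training material that shapes how the bot behaves next.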

Image: the CarynAI webpage. The CarynAI chatbot was a huge success. Tada Images / Shutterstock[14]

At present, information about what happens to user data is often buried[15] in lengthy click-through terms and conditions and consent forms. Companies hosting digital versions have also had little to say about how they manage user aggression.

As digital versions become more common, transparency and safety by design[16] will grow increasingly important.

We will also need a better understanding of digital versioning. What can these versions do, and what should they do? What can’t they do, and what shouldn’t they do? How do users think these systems work, and how do they actually work?

The illusion of companionship

Digital versions offer the illusion of intimate human companionship, but without any of the responsibilities. CarynAI may have been a version of Caryn Marjorie, but it was a version almost wholly subservient to its users.

Sociologist Sherry Turkle[17] has observed that, with the rise of mobile internet and social media, we are trying to connect with machines that have “no experience of the arc of a human life”. As a result, we are “expecting more from technology and less from each other”.

After becoming the first influencer to be turned into a digital version at scale, Marjorie is now trying to warn other influencers about the potential dangers of this technology. She worries that no one is truly in control of these versions, and that no amount of precaution will ever sufficiently protect users and those being versioned.

As CarynAI’s first two iterations show, digital versions can bring out the worst of human behaviour. It remains to be seen whether they can be redesigned to bring out the best.

References

  1. ^ more than a billion views per month on Snapchat (www.snapchat.com)
  2. ^ CarynAI (caryn.ai)
  3. ^ dead (journals.sagepub.com)
  4. ^ AI studio (about.fb.com)
  5. ^ creating a conversational chat bot of a specific person (patents.google.com)
  6. ^ Amazon’s SageMaker (aws.amazon.com)
  7. ^ Vertex AI (cloud.google.com)
  8. ^ mimic a specific person (www.admscentre.org.au)
  9. ^ a “personality” of its own (dl.acm.org)
  10. ^ Forever Voices (forevervoices.com)
  11. ^ arrested for attempted arson (www.pcgamer.com)
  12. ^ BanterAI (banterai.app)
  13. ^ public selves they present to the world (theconversation.com)
  14. ^ Tada Images / Shutterstock (www.shutterstock.com)
  15. ^ what happens to user data is often buried (www.theverge.com)
  16. ^ transparency and safety by design (theconversation.com)
  17. ^ Sherry Turkle (thehypertextual.com)

Read more https://theconversation.com/an-influencers-ai-clone-started-offering-fans-mind-blowing-sexual-experiences-without-her-knowledge-232478
