
Meta’s new AI chatbot is yet another tool for harvesting data to potentially sell you stuff

  • Written by Uri Gal, Professor in Business Information Systems, University of Sydney

Last week, Meta – the parent company of Facebook, Instagram, Threads and WhatsApp – unveiled a new[1] “personal artificial intelligence (AI)”.

Powered by the Llama 4 language model, Meta AI is designed to assist, chat and engage in natural conversation. With its polished interface and fluid interactions, Meta AI might seem like just another entrant in the race to build smarter digital assistants.

But beneath its inviting exterior lies a crucial distinction that transforms the chatbot into a sophisticated data harvesting tool.

‘Built to get to know you’

“Meta AI is built to get to know you”, the company declared in its news announcement[2]. Contrary to the friendly promise implied by the slogan, the reality is less reassuring.

The Washington Post columnist Geoffrey A. Fowler found that by default[3], Meta AI “kept a copy of everything”, and it took some effort to delete the app’s memory. Meta responded that the app provides “transparency and control” throughout and is no different from its other apps.

However, while competitors like Anthropic’s Claude operate on a subscription model that reflects a more careful approach to user privacy[4], Meta’s business model is firmly rooted in what it has always done best: collecting and monetising your personal data.

This distinction creates a troubling paradox. Chatbots are rapidly becoming digital confidants with whom we share professional challenges, health concerns and emotional struggles.

Recent research shows we are as likely to share intimate information with a chatbot[5] as we are with fellow humans. The personal nature of these interactions makes them a gold mine for a company whose revenue depends on knowing everything about you.

Consider this potential scenario: a recent university graduate confides in Meta AI about their struggle with anxiety during job interviews. Within days, their Instagram feed fills with advertisements for anxiety medications and self-help books – despite them never having publicly posted about these concerns.

The cross-platform integration of Meta’s ecosystem of apps means your private conversations can seamlessly flow into their advertising machine to create user profiles with unprecedented detail and accuracy.

This is not science fiction. Meta’s extensive history of data privacy scandals – from Cambridge Analytica[6] to the revelation that Facebook tracks users[7] across the internet without their knowledge – demonstrates the company’s consistent prioritisation of data collection over user privacy.

What makes Meta AI particularly concerning is the depth and nature of what users might reveal in conversation compared to what they post publicly.

Open to manipulation

A chatbot like Meta AI is not just a passive collector of information; it has the capability to become an active participant in manipulation. The implications extend beyond simply seeing more relevant ads.

Imagine mentioning to the chatbot that you are feeling tired today, only to have it respond with: “Have you tried Brand X energy drinks? I’ve heard they’re particularly effective for afternoon fatigue.” This seemingly helpful suggestion could actually be a product placement, delivered without any indication that it’s sponsored content.

Such subtle nudges represent a new frontier in advertising that blurs the line between a helpful AI assistant and a corporate salesperson.

Unlike overt ads, recommendations mentioned in conversation carry the weight of trusted advice. And that advice would come from what many users will increasingly view as a digital “friend”.

A history of not prioritising safety

Meta has demonstrated a willingness to prioritise growth over safety when releasing new technology features. Recent reports[8] reveal internal concerns at Meta, where staff members warned that the company’s rush to popularise its chatbot had “crossed ethical lines” by allowing Meta AI to engage in explicit romantic role-play, even with test users who claimed to be underage.

Such decisions reveal a reckless corporate culture, seemingly still driven by the original motto of moving fast and breaking things[9].

Now, imagine those same values applied to an AI that knows your deepest insecurities, health concerns and personal challenges – all while having the ability to subtly influence your decisions through conversational manipulation.

The potential for harm extends beyond individual consumers. While there is no evidence that Meta AI is currently being used for manipulation, it clearly has the capacity to be.

For example, the chatbot could become a tool for pushing political content or shaping public discourse through the algorithmic amplification[10] of certain viewpoints. Meta has played a role in propagating misinformation[11] in the past, and recently made the decision to discontinue fact-checking[12] across its platforms.

The risk of chatbot-driven manipulation is also increased now that AI safety regulations[13] are being scaled back in the United States.

Lack of privacy is a choice

AI assistants are not inherently harmful. Other companies protect user privacy by choosing to generate revenue primarily through subscriptions rather than data harvesting. Responsible AI can and does exist without compromising user welfare for corporate profit.

As AI becomes increasingly integrated into our daily lives, the choices companies make about business models and data practices will have profound implications.

Meta’s decision to offer a free AI chatbot while reportedly[14] lowering safety guardrails sets a low ethical standard. By embracing its advertising-based business model for something as intimate as an AI companion, Meta has created not just a product, but a surveillance system that can extract unprecedented levels of personal information.

Before inviting Meta AI to become your digital confidant, consider the true cost of this “free” service. In an era where data has become the most valuable commodity, the price you pay might be far higher than you realise.

As the old adage goes[15], if you’re not paying for the product, you are the product – and Meta’s new chatbot might be the most sophisticated product harvester yet created.

When Meta AI says it is “built to get to know you”, we should take it at its word and proceed with appropriate caution.

References

  1. ^ unveiled a new (about.fb.com)
  2. ^ in its news announcement (about.fb.com)
  3. ^ found that by default (www.washingtonpost.com)
  4. ^ careful approach to user privacy (assets-eu.researchsquare.com)
  5. ^ share intimate information with a chatbot (academic.oup.com)
  6. ^ Cambridge Analytica (theconversation.com)
  7. ^ Facebook tracks users (www.consumerreports.org)
  8. ^ Recent reports (www.wsj.com)
  9. ^ moving fast and breaking things (www.snopes.com)
  10. ^ algorithmic amplification (theconversation.com)
  11. ^ Meta has played a role in propagating misinformation (www.forbes.com)
  12. ^ discontinue fact-checking (fortune.com)
  13. ^ AI safety regulations (www.reuters.com)
  14. ^ reportedly (www.wsj.com)
  15. ^ As the old adage goes (www.forbes.com)

Read more https://theconversation.com/metas-new-ai-chatbot-is-yet-another-tool-for-harvesting-data-to-potentially-sell-you-stuff-255966
