An AI-driven influence operation is spreading pro-China propaganda across YouTube

  • Written by David Tuffley, Senior Lecturer in Applied Ethics & CyberSecurity, Griffith University

A recent investigation from the Australian Strategic Policy Institute (ASPI) has revealed[1] an extensive network of YouTube channels promoting pro-Chinese and anti-US public opinion in the English-speaking world.

The operation is well-coordinated, using generative AI[2] to rapidly produce and publish content, while deftly exploiting YouTube’s algorithmic recommendation system.

How big is the network?

The operation, dubbed “Shadow Play[3]”, involves a network of at least 30 YouTube channels with about 730,000 subscribers. At the time of writing, the channels had some 4,500 videos between them, with about 120 million views.

According to ASPI[4], the channels gained audiences by using AI algorithms to cross-promote each other’s content, thereby boosting visibility. This is concerning as it allows state messaging to cross borders with plausible deniability[5].

The network of videos also featured an AI avatar created by British artificial intelligence company Synthesia, according to the report[6], as well as other AI-generated entities and voiceovers.

While it’s not clear who is behind the operation, investigators say the controller is likely Mandarin-speaking. After profiling the behaviour, they concluded it doesn’t match that of any known state actor in the business of online influence operations. Instead, they suggest it might be a commercial entity operating under some degree of state direction.

These findings are the latest evidence that advanced influence operations are evolving faster than defensive measures.

Influencer conflicts of interest

One clear parallel between the Shadow Play operation and other influence campaigns is the use of coordinated networks of inauthentic social media accounts and pages to amplify the messaging.

For example, in 2020 Facebook took down[7] a network of more than 300 Facebook accounts, pages and Instagram accounts that were being run from China and were posting content about the US election and the COVID-19 pandemic. As was the case with Shadow Play, these assets worked together to spread content and make it appear more popular than it was.

Read more: Scams, deepfake porn and romance bots: advanced AI is exciting, but incredibly dangerous in criminals' hands[8]

Is current legislation strong enough?

The current disclosure requirements around sponsored content have some glaring gaps when it comes to addressing cross-border influence campaigns. Most Australian consumer protection[9] and advertising regulation[10] focuses on commercial sponsorships rather than geopolitical conflicts of interest.

Platforms such as YouTube prohibit[11] deceptive practices in their stated rules. However, identifying and enforcing violations is difficult with foreign state-affiliated accounts that conceal who is pulling their strings.

Determining what is propaganda, as opposed to free speech, raises difficult ethical questions around censorship[12] and political opinions. Ideally, transparency measures shouldn’t unduly restrict protected[13] speech. But viewers still deserve to understand an influencer’s incentives and potential biases.

Possible measures could include clear disclosures when content is affiliated directly or indirectly with a foreign government, as well as making affiliation and location data more visible on channels.

How can you spot deceptive content?

As technologies become more sophisticated, it’s becoming harder to discern what agenda or conflict of interest may be shaping the content of a video.

Discerning viewers can gain some insight by looking into the creator(s) behind the content. Do they provide information on who they are, where they’re based and their background? A lack of clarity may signal an attempt to obscure their identity.
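
If you're comfortable with a little code, much of this background information is published by YouTube itself. The sketch below, in Python, shows one way to pull a channel's stated country, creation date and description through the YouTube Data API. It assumes the google-api-python-client library and uses a placeholder API key and channel ID; it's an illustration of the general idea, not anything drawn from the ASPI report.

```python
# A minimal sketch, assuming the google-api-python-client package and a valid
# YouTube Data API v3 key. The API key and channel ID below are placeholders.
# It fetches the public metadata a viewer can use to check who runs a channel.
from googleapiclient.discovery import build


def channel_overview(channel_id: str, api_key: str) -> dict:
    youtube = build("youtube", "v3", developerKey=api_key)
    response = youtube.channels().list(
        part="snippet,statistics",
        id=channel_id,
    ).execute()

    items = response.get("items", [])
    if not items:
        return {}

    snippet = items[0]["snippet"]
    stats = items[0].get("statistics", {})
    return {
        "title": snippet.get("title"),
        "created": snippet.get("publishedAt"),
        # Channels aren't required to declare a country; a blank field proves
        # nothing on its own, but it's a reason to look closer.
        "country": snippet.get("country", "not disclosed"),
        "description": snippet.get("description", "")[:200],
        "subscribers": stats.get("subscriberCount"),
        "videos": stats.get("videoCount"),
    }


if __name__ == "__main__":
    print(channel_overview("UC_x5XG1OV2P6uZZ5FSM9Ttw", "YOUR_API_KEY"))
```

An undisclosed country or a very recent creation date doesn't prove anything by itself, but, as noted above, a lack of clarity about who is behind a channel is a prompt to dig further.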

You can also assess the tone and goal of the content. Does it seem to be driven by a specific ideological argument? What is the poster’s ultimate aim: are they just trying to get clicks, or are they trying to persuade you to adopt their viewpoint?

Check for credibility signals, such as what other established sources say about this creator or their claims. When something seems dubious, rely on authoritative journalists and fact-checkers.

And make sure not to consume too much content from any single creator. Get your information from reliable sources across the political spectrum so you can take an informed stance.

The bigger picture

The advancement of AI could exponentially amplify[14] the reach and precision of coordinated influence operations if ethical safeguards aren’t implemented. At its most extreme, the unrestricted spread of AI propaganda[15] could undermine truth and manipulate real-world events.

Propaganda campaigns may not stop at trying to shape narratives and opinions. They could also be used to generate[16] hyper-realistic[17] text, audio and image content aimed at radicalising individuals. This could greatly destabilise our societies.

We’re already seeing the precursors[18] of what could become AI psy-ops[19] with the ability to spoof identities, surveil citizens en masse, and automate disinformation production.

Without applying an ethics or oversight framework[20] to content moderation[21] and recommendation algorithms, social platforms could effectively act as misinformation mega-amplifiers optimised for watch-time, regardless of the consequences.

Over time, this may erode social cohesion, upend elections, incite violence and even undermine[22] our democratic institutions. And unless we move quickly, the pace of malicious innovation may outstrip[23] any regulatory measures.

It’s more important than ever to establish external oversight[24] to make sure social media platforms work for the greater good, and not just short-term profit.

Read more: Facebook's algorithms fueled massive foreign propaganda campaigns during the 2020 election – here's how algorithms can manipulate you[25]

References

  1. ^ has revealed (www.aspi.org.au)
  2. ^ generative AI (www.techopedia.com)
  3. ^ Shadow Play (www.aspi.org.au)
  4. ^ ASPI (ad-aspi.s3.ap-southeast-2.amazonaws.com)
  5. ^ plausible deniability (www.cybersecurityintelligence.com)
  6. ^ the report (ad-aspi.s3.ap-southeast-2.amazonaws.com)
  7. ^ took down (about.fb.com)
  8. ^ Scams, deepfake porn and romance bots: advanced AI is exciting, but incredibly dangerous in criminals' hands (theconversation.com)
  9. ^ consumer protection (legalvision.com.au)
  10. ^ advertising regulation (www.accc.gov.au)
  11. ^ prohibit (support.google.com)
  12. ^ censorship (news.columbia.edu)
  13. ^ protected (www.abc.net.au)
  14. ^ exponentially amplify (www.technologyreview.com)
  15. ^ AI propaganda (www.govtech.com)
  16. ^ generate (www.technologyreview.com)
  17. ^ hyper-realistic (www.cambridge.org)
  18. ^ precursors (www.cambridge.org)
  19. ^ AI psy-ops (www.apa.org)
  20. ^ oversight framework (www.cambridge.org)
  21. ^ content moderation (cssh.northeastern.edu)
  22. ^ undermine (il.boell.org)
  23. ^ outstrip (www.mckinsey.com)
  24. ^ establish external oversight (ctb.ku.edu)
  25. ^ Facebook's algorithms fueled massive foreign propaganda campaigns during the 2020 election – here's how algorithms can manipulate you (theconversation.com)

Read more: https://theconversation.com/an-ai-driven-influence-operation-is-spreading-pro-china-propaganda-across-youtube-219962
