Disinformation is spreading beyond the realm of spycraft to become a shady industry – lessons from South Korea

  • Written by K. Hazel Kwon, Associate Professor of Journalism and Digital Audiences, Arizona State University
Disinformation, the practice of blending real and fake information with the goal of duping a government or influencing public opinion, has its origins in the Soviet Union. But disinformation is no longer the exclusive domain of government intelligence agencies.

Today’s disinformation scene has evolved into a marketplace in which services are contracted, laborers are paid and shameless opinions and fake readers are bought and sold. This industry is emerging around the world. Some of the private-sector players are driven by political motives, some by profit and others by a mix of the two.

Public relations firms have recruited social media influencers in France and Germany[1] to spread falsehoods. Politicians have hired staff to create fake Facebook accounts in Honduras[2]. And Kenyan Twitter influencers[3] are paid 15 times more than many people make in a day for promoting political hashtags. Researchers at the University of Oxford have tracked government-sponsored disinformation activities in 81 countries and private-sector disinformation operations in 48 countries[4].

South Korea has been at the forefront of online disinformation. Western societies began to raise concerns about disinformation in 2016, triggered by disinformation related to the 2016 U.S. presidential election and Brexit. But in South Korea, media reported the first formal disinformation operation in 2008. As a researcher who studies digital audiences[5], I’ve found that South Korea’s 13-year-long disinformation history demonstrates how technology, economics and culture interact to enable the disinformation industry.

Most importantly, South Korea’s experience offers a lesson for the U.S. and other countries: the ultimate power of disinformation lies less in the people who perpetrate it or the techniques they use than in the ideas and memories a society is vulnerable to, and in how readily that society fuels the rumor mill.

From dirty politics to dirty business

The origin of South Korean disinformation can be traced back to the nation’s National Intelligence Service, which is equivalent to the U.S. Central Intelligence Agency. The NIS formed teams in 2010 to interfere in domestic elections[6] by attacking a political candidate it opposed.

The NIS hired more than 70 full-time workers who managed fake, or so-called sock puppet[7], accounts. The agency recruited a group called Team Alpha, which was composed of civilian part-timers who had ideological and financial interests in working for the NIS. By 2012, the scale of the operation had grown to 3,500 part-time workers[8].

South Korean President Moon Jae-in (left) campaigning in 2014 for Kim Kyoung-soo (right), who became governor of South Gyeongsang Province in 2018 but was subsequently convicted of opinion rigging. Udenjan/WikiCommons, CC BY[9][10]

Since then the private sector has moved into the disinformation business. For example, a shadowy publishing company led by an influential blogger was involved in a high-profile opinion-rigging scandal[11] between 2016 and 2018. The company’s client was a close political aide of the current president, Moon Jae-in.

In contrast to NIS-driven disinformation campaigns, which use disinformation as a propaganda tool for the government, some of the private-sector players are chameleonlike, changing ideological and topical positions in pursuit of their business interests. These private-sector operations have achieved greater cost effectiveness than government operations by skillfully using bots to amplify fake engagements[12], involving social media entrepreneurs like YouTubers[13] and outsourcing trolling to cheap laborers[14].

Narratives that strike a nerve

In South Korea, Cold War rhetoric has been particularly visible across all types of disinformation operations. The campaigns typically portray the conflict with North Korea and the battle against Communism as being at the center of public discourse in South Korea. In reality, nationwide polls have painted a very different picture. For example, even when North Korea’s nuclear threat was at a peak in 2017, fewer than 10 percent of respondents[15] picked North Korea’s saber-rattling as their priority concern, compared with more than 45 percent who selected economic policy.

Across all types of purveyors and techniques, political disinformation in South Korea has amplified anti-Communist nationalism and denigrated the nation’s dovish diplomacy toward North Korea. My research on South Korean social media rumors[16] in 2013 showed that the disinformation rhetoric persisted on social media even after the formal campaign ended, which indicates how powerful these themes are. Today my research team and I continue to see references to the same themes.

Much of the disinformation trafficked in South Korea involves nationalistic anti-Communist narratives similar to this protester’s anti-North Korea message. Photo by Jung Yeon-je/AFP via Getty Images[17]

The dangers of a disinformation industry

The disinformation industry is enabled by three prongs of today’s digital media environment: an attention economy, algorithmic and computational technologies, and a participatory culture. In online media, the most important currency is audience attention. Metrics such as page views, likes, shares and comments quantify attention, which is then converted into economic and social capital.

Ideally, these metrics would be the product of networked users’ spontaneous, voluntary participation. More often than not, disinformation operations manufacture them instead, using bots, hiring influencers, paying for crowdsourced engagement and developing computational tricks to game a platform’s algorithms.

The expansion of the disinformation industry is troubling because it distorts how public opinion is perceived by researchers, the media and the public itself. Historically, democracies have relied on polls to understand public opinion. Despite their limitations, nationwide polls conducted by credible organizations such as Gallup[18] and Pew Research[19] follow rigorous methodological standards to capture the distribution of opinions in society as representatively as possible.

Public discourse on social media has emerged as an alternative means of assessing public opinion. Digital audience and web traffic analytics tools are widely available to measure the trends of online discourse. However, people can be misled when purveyors of disinformation manufacture opinions expressed online and artificially inflate the metrics around those opinions.

Meanwhile, the persistence of anti-Communist nationalist narratives in South Korea shows that disinformation purveyors’ rhetorical choices are not random. To counter the disinformation industry wherever it emerges, governments, media and the public need to understand not just the who and the how, but also the what – a society’s controversial ideologies and collective memories. These are the most valuable currency in the disinformation marketplace.

References

  1. ^ France and Germany (www.nytimes.com)
  2. ^ Honduras (www.theguardian.com)
  3. ^ Kenyan Twitter influencers (www.wired.com)
  4. ^ private-sector disinformation operations in 48 countries (demtech.oii.ox.ac.uk)
  5. ^ studies digital audiences (scholar.google.com)
  6. ^ to interfere in domestic elections (www.theguardian.com)
  7. ^ sock puppet (doi.org)
  8. ^ 3,500 part-time workers (www.brookings.edu)
  9. ^ Udenjan/WikiCommons (commons.wikimedia.org)
  10. ^ CC BY (creativecommons.org)
  11. ^ opinion-rigging scandal (www.koreaherald.com)
  12. ^ using bots to amplify fake engagements (ojs.aaai.org)
  13. ^ YouTubers (restofworld.org)
  14. ^ outsourcing trolling to cheap laborers (globalvoices.org)
  15. ^ fewer than 10 percent of respondents (www.nytimes.com)
  16. ^ South Korean social media rumors (doi.org)
  17. ^ Photo by Jung Yeon-je/AFP via Getty Images (www.gettyimages.com)
  18. ^ Gallup (www.gallup.com)
  19. ^ Pew Research (www.pewresearch.org)

Read more https://theconversation.com/disinformation-is-spreading-beyond-the-realm-of-spycraft-to-become-a-shady-industry-lessons-from-south-korea-168054
