
Can ideology-detecting algorithms catch online extremism before it takes hold?

  • Written by Rohit Ram, PhD Student, Social Data Science, University of Technology Sydney
Rohit Ram receives funding from the Defence Science and Technology Group (DSTG) and was supported by an Australian Government Research Training Program (RTP) Scholarship.

Ideology has always been a critical element in understanding how we view the world, form opinions and make political decisions.

However, the internet has revolutionised the way opinions and ideologies spread, leading to new forms of online radicalisation. Far-right ideologies, which advocate for ultra-nationalism, racism and opposition to immigration and multiculturalism, have proliferated on social platforms.

These ideologies have strong links with violence and terrorism. In recent years, as much as 40%[1] of the caseload of the Australian Security Intelligence Organisation (ASIO) was related to far-right extremism. This has declined[2], though, with the easing of COVID restrictions.

Detecting online radicalisation early could help prevent far-right ideology-motivated (and potentially violent) activity. To this end, we have developed a completely automatic system[3] that can determine the ideology of social media users based on what they do online.

How it works

Our proposed pipeline is based on detecting the signals of ideology from people’s online behaviour.

There is no way to directly observe a person’s ideology. However, researchers can observe “ideological proxies” such as the use of political hashtags, retweeting politicians and following political parties.

Read more: How far-right online spaces use mainstream media to spread their ideology[4]

But using ideological proxies requires a lot of work: you need experts to understand and label the relationships between proxies and ideology. This can be expensive and time-consuming.

What’s more, online behaviour and contexts change between countries and social platforms. They also shift rapidly over time. This means even more work to keep your ideological proxies up to date and relevant.

You are what you post

Our pipeline simplifies this process and makes it automatic. It has two main components: a “media proxy”, which determines ideology via links to media, and an “inference architecture”, which helps us determine the ideology of people who don’t post links to media.

The media proxy measures the ideological leaning of an account by tracking which media sites it posts links to. Posting links to Fox News would indicate someone is more likely to lean right, for example, while linking to the Guardian indicates a leftward tendency.

To categorise the media sites users link to, we took the left-right ratings for a wide range of news sites from two datasets (though many others are available). One was based on a Reuters survey[5] and the other was curated by experts at Allsides.com[6].
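To illustrate the idea, a media proxy of this kind can be sketched in a few lines of code. This is not the authors' implementation, and the domain ratings below are hypothetical placeholders rather than the Reuters or Allsides values used in the study:

```python
# Sketch of a media proxy: average the ideological rating of the news
# domains a user links to. Scores here are ILLUSTRATIVE, on a scale
# from -1 (left-leaning) to +1 (right-leaning).
from urllib.parse import urlparse

DOMAIN_SCORES = {          # hypothetical placeholder ratings
    "theguardian.com": -0.5,
    "foxnews.com": 0.6,
    "reuters.com": 0.0,
}

def media_proxy_score(urls):
    """Return the mean rating of known domains a user links to,
    or None if none of the linked domains has a rating."""
    scores = []
    for url in urls:
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        if domain in DOMAIN_SCORES:
            scores.append(DOMAIN_SCORES[domain])
    return sum(scores) / len(scores) if scores else None
```

A user who links to both Fox News and Reuters, for example, would land between the two ratings; users with no rated links fall through to the inference step described next.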

This works well for people who post links to media sites. However, most people don’t do that very often. So what do we do about them?

Read more: COVID wasn't a 'bumper campaign' for right-wing extremists. But the threat from terror remains[7]

That’s where the inference architecture comes in. In our pipeline, we determine how ideologically similar people are to one another with three measures: the kind of language they use, the hashtags they use, and the other users whose content they reshare.

Measuring similarity in hashtags and resharing is relatively straightforward, but such signals are not always available. Language use is the key: it is always present, and a known indicator of people’s latent psychological states.
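The three similarity signals can be sketched with simple set and vector measures. This is an illustrative stand-in, not the study's implementation: hashtag and reshare overlap as Jaccard similarity, and a crude bag-of-words cosine similarity in place of the study's language measure:

```python
# Illustrative similarity signals between two users.
from collections import Counter
import math

def jaccard(a, b):
    """Overlap between two sets, e.g. hashtags used or accounts reshared."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def cosine_bow(text_a, text_b):
    """Cosine similarity of bag-of-words counts: a crude stand-in
    for a proper language-use feature."""
    ca, cb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

In practice, users whose media-linked neighbours are predominantly left- or right-leaning can then inherit a label from those high-similarity neighbours.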

Using machine-learning techniques we found that people with different ideologies use different kinds of language.

Right-leaning individuals tend to use moral language relating to vice (for example, harm, cheating, betrayal, subversion and degradation), as opposed to virtue (care, fairness, loyalty, authority and sanctity), more than left-leaning individuals. Far-right individuals use grievance language (involving violence, hate and paranoia) significantly more than moderates.
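Dictionary-based text analysis, in the spirit of lexicons such as the Moral Foundations Dictionary and the Grievance Dictionary, is one common way to extract signals like these. The tiny word lists below are hypothetical examples, not the lexicons used in the study:

```python
# Illustrative dictionary-based scoring of moral and grievance language.
import re

LEXICONS = {               # toy word lists, NOT the study's lexicons
    "vice": {"harm", "cheat", "betray", "subvert", "degrade"},
    "virtue": {"care", "fair", "loyal", "authority", "sanctity"},
    "grievance": {"violence", "hate", "paranoia"},
}

def lexicon_rates(text):
    """Fraction of a post's words that fall in each lexicon."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    return {name: sum(w in vocab for w in words) / total
            for name, vocab in LEXICONS.items()}
```

Rates like these, computed over a user's posting history, become features a classifier can combine with the hashtag and resharing signals.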

By detecting these signals of ideology, our pipeline can identify and understand the psychological and social characteristics of extreme individuals and communities.

What’s next?

The ideology detection pipeline could be a crucial tool for understanding the spread of far-right ideologies and preventing violence and terrorism. By detecting signals of ideology from user behaviour online, the pipeline serves as an early warning system for extreme ideology-motivated activity. It can provide law enforcement with methods to flag users for investigation and intervene before radicalisation takes hold.

Read more https://theconversation.com/can-ideology-detecting-algorithms-catch-online-extremism-before-it-takes-hold-200629
