The Times Australia

When faces are partially covered, neither people nor algorithms are good at reading emotions

  • Written by Harisu Abdullahi Shehu, PhD Researcher, Te Herenga Waka — Victoria University of Wellington

Artificial systems such as homecare robots or driver-assistance technology are becoming more common, and it’s timely to investigate whether people or algorithms are better at reading emotions, particularly given the added challenge brought on by face coverings.

In our recent study[1], we examined how face masks and sunglasses affect people’s ability to recognise different emotions, and compared this with the accuracy of artificial systems.

Our study used full and partial masks and sunglasses to obscure parts of the face. Author provided

We presented images of emotional facial expressions and added two different types of masks — the full mask used by frontline workers and a recently introduced mask with a transparent window to allow lip reading.

Our findings show algorithms and people both struggle when faces are partially obscured. But artificial systems are more likely to misinterpret emotions in unusual ways.

Artificial systems performed significantly better than people in recognising emotions when the face was not covered — 98.48% compared to 82.72% for seven different types of emotion.

But depending on the type of covering, the accuracy of both people and artificial systems varied. For instance, sunglasses made fear harder for people to recognise, while partial masks helped both people and artificial systems identify happiness correctly.

Read more: AI is increasingly being used to identify emotions – here's what's at stake[2]

Importantly, people classified unknown expressions mainly as neutral, but artificial systems were less systematic. They often incorrectly selected anger for images obscured with a full mask, and either anger, happiness, neutral, or surprise for partially masked expressions.

Decoding facial expressions

Our ability to recognise emotion uses the visual system of the brain to interpret what we see. We even have an area of the brain specialised for face recognition, known as the fusiform face area, which helps interpret information revealed by people’s faces.

Together with the context of a particular situation (social interaction, speech and body movement), our knowledge of past behaviour and empathy drawn from our own feelings, we can decode how people feel.

A system of facial action units[3] has been proposed for decoding emotions based on facial cues. It includes units such as “the cheek raiser” and “the lip corner puller”, which are both considered part of an expression of happiness.
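As a rough illustration, the action-unit idea can be written as a lookup: an emotion is signalled when a particular combination of units is active. The unit names and the happiness rule (AU6 plus AU12) come from the Facial Action Coding System; the code itself is a hypothetical minimal sketch, not the decoder used in the study.

```python
# Sketch of the facial-action-unit idea (FACS): emotions map to
# combinations of named muscle movements. Hypothetical minimal subset.
ACTION_UNITS = {
    "AU6": "cheek raiser",
    "AU12": "lip corner puller",
    "AU4": "brow lowerer",
}

# Happiness is conventionally coded as AU6 + AU12 occurring together.
EMOTION_RULES = {"happiness": {"AU6", "AU12"}}

def signals(active_units, emotion):
    """True if every action unit required for the emotion is active."""
    return EMOTION_RULES[emotion] <= set(active_units)
```

Under this scheme, `signals(["AU6", "AU12"], "happiness")` holds, while a mask that hides the lip corner puller removes part of the evidence.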

Can you read the researchers’ emotion from their covered faces? Both artificial systems and people are compromised in categorising emotions when faces are obscured. Author provided

In contrast, artificial systems analyse pixels from images of a face when categorising emotions. They pass pixel intensity values through a network of filters mimicking the human visual system.
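A minimal sketch of that pixel pipeline, assuming a toy grayscale image and a single hand-written edge filter (real systems learn many thousands of such filters from data):

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide a small filter across the image, producing a feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 5x5 image: dark pixels on the left, bright on the right.
image = np.array([[0, 0, 1, 1, 1]] * 5, dtype=float)

# A vertical-edge filter: responds where intensity changes left to right.
edge_kernel = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]], dtype=float)

feature_map = convolve2d(image, edge_kernel)  # strongest response at the edge
```

The filter responds only where pixel intensity changes, which is why an opaque mask, by flattening a whole region of the image, starves later layers of information.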

The finding that artificial systems misclassify emotions from partially obscured faces is important. It could lead to unexpected behaviours of robots interacting with people wearing face masks.

Imagine if they misclassified a negative emotion, such as anger or sadness, as a positive one. The system would then interact with the person as if they were happy, acting on that misguided interpretation. This could endanger both the artificial system and the humans it interacts with.

Risks of using algorithms to read emotion

Our research reiterates that algorithms are susceptible to biases in their judgement. For instance, artificial systems perform noticeably worse when categorising emotion from natural, uncontrolled images. Even the sun’s angle or a patch of shade can influence outcomes.

Algorithms can also be racially biased. As previous studies have found, even a small change to the colour[4] of the image, which has nothing to do with emotional expressions, can lead to a drop in performance of algorithms used in artificial systems.

Read more: Face masks and facial recognition will both be common in the future. How will they co-exist?[5]

As if that wasn’t enough of a problem, even small visual perturbations[6], imperceptible to the human eye, can cause these systems to misidentify an input as something else.

Some of these misclassification issues can be addressed. For instance, algorithms can be designed[7] to consider emotion-related features such as the shape of the mouth, rather than gleaning information from the colour and intensity of pixels.
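For instance, a shape-based pipeline might derive features from detected landmark coordinates rather than raw pixel values. The landmark names and the openness ratio below are hypothetical illustrations, not the cited method:

```python
import math

def mouth_features(landmarks):
    """Geometric mouth features from (x, y) landmark coordinates.
    Landmark keys here are hypothetical; real detectors use numbered points."""
    left, right = landmarks["mouth_left"], landmarks["mouth_right"]
    top, bottom = landmarks["mouth_top"], landmarks["mouth_bottom"]
    width = math.dist(left, right)
    height = math.dist(top, bottom)
    return {"mouth_width": width, "mouth_openness": height / width}

features = mouth_features({
    "mouth_left": (0, 5), "mouth_right": (10, 5),
    "mouth_top": (5, 3), "mouth_bottom": (5, 7),
})
```

Because such features describe shape rather than colour, they are less sensitive to lighting and skin tone than raw pixel intensities.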

Another way to address this is by changing the training data characteristics[8] — oversampling the training data so that algorithms mimic human behaviour better and make less extreme mistakes when they do misclassify an expression.
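A minimal sketch of that idea, assuming a toy dataset where one emotion class is under-represented. This shows simple random duplication of minority-class examples; the cited work may use a different sampling scheme:

```python
import random
from collections import Counter

def oversample(samples, labels, seed=0):
    """Duplicate minority-class samples until every class is as large
    as the biggest one (simple random oversampling)."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_samples, out_labels = list(samples), list(labels)
    for cls, n in counts.items():
        pool = [s for s, l in zip(samples, labels) if l == cls]
        for _ in range(target - n):
            out_samples.append(rng.choice(pool))
            out_labels.append(cls)
    return out_samples, out_labels

# Toy imbalanced set: two "happy" images, one "fear" image.
samples, labels = oversample(["img1", "img2", "img3"],
                             ["happy", "happy", "fear"])
```

After balancing, the classifier sees each emotion equally often during training, which discourages it from defaulting to the majority class when a face is ambiguous.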

But overall, the performance of these systems drops when interpreting images in real-world situations when faces are partially covered.

Although artificial systems can achieve higher-than-human accuracy in recognising emotion from static images of fully visible faces, in the real-world situations we experience every day their performance is still not human-like.

Read more https://theconversation.com/when-faces-are-partially-covered-neither-people-nor-algorithms-are-good-at-reading-emotions-165005
