The Times Australia
Supermarket facial recognition failure: why automated systems must put the human factor first

  • Written by Mark Rickerby, Lecturer, School of Product Design, University of Canterbury

The misidentification of a woman by facial recognition technology[1] at a Rotorua supermarket should have come as no surprise.

When Foodstuffs North Island announced its intention to trial this technology in February, as part of a strategy to combat retail crime, technology and privacy experts immediately raised concerns.

In particular, the risk of Māori women and women of colour being discriminated against[2] was raised, and has now been borne out by what happened in early April to Te Ani Solomon.

Speaking to media this week, Solomon said[3] she thought ethnicity was a “huge factor” in her wrongful identification. “Unfortunately, it will be the experience of many Kiwis if we don’t have some rules and regulations around this.”

The supermarket company’s response[4] that this was a “genuine case of human error” fails to address the deeper questions about such use of AI and automated systems.

Automated decisions and human actions

Automated facial recognition is often discussed in the abstract – as pure algorithmic pattern matching, with emphasis on assessing correctness and accuracy.

These are rightfully important priorities for systems that deal with biometric data and security. But with so much focus on the results of automated decisions, it’s easy to overlook how those decisions are applied.

Designers use the term “context of use” to describe the everyday working conditions, tasks and goals of a product. With facial recognition technology in supermarkets, the context of use goes far beyond traditional design concerns such as ergonomics or usability.

Read more: The use of technology in policing should be regulated to protect people from wrongful convictions[5]

It requires consideration of how automated trespass notifications trigger in-store responses, protocols for managing those responses, and what happens when things go wrong. These are more than just pure technology or data problems.

This perspective helps us understand and balance the impact of engineering and design interventions at different levels of a system.

Investing in improving prediction accuracy seems an obvious priority for facial recognition systems. But this has to be seen in a broader context of use where the harm done by a small number of wrong predictions outweighs marginal performance improvements elsewhere.
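This trade-off follows from the base-rate effect: when genuine matches are rare, even an accurate system produces mostly false alerts. A minimal sketch, using purely illustrative figures (none are drawn from the Foodstuffs trial or any real deployment), shows the arithmetic:

```python
# Back-of-envelope base-rate calculation. All numbers below are assumptions
# chosen for illustration, not measurements from any real system.

daily_shoppers = 10_000        # assumed store foot traffic per day
watchlist_rate = 0.001         # assume 1 in 1,000 shoppers is on a watchlist
true_positive_rate = 0.99      # assumed chance a listed person is flagged
false_positive_rate = 0.01     # assumed chance an innocent shopper is flagged

on_list = daily_shoppers * watchlist_rate           # 10 people
not_on_list = daily_shoppers - on_list              # 9,990 people

true_alerts = on_list * true_positive_rate          # ~9.9 correct alerts
false_alerts = not_on_list * false_positive_rate    # ~99.9 wrong alerts

# Precision: what share of alerts point at the right person?
precision = true_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are correct: {precision:.1%}")
```

Under these assumed figures, fewer than one alert in ten identifies the right person, so most of the people confronted by staff would be innocent. Improving accuracy shrinks the false alerts but can never eliminate them, which is why the in-store response protocol matters as much as the algorithm.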

Responding to retail crime

New Zealand is not alone in reported increases in shoplifting and violent behaviour in stores. In the UK, it has been described as a “crisis”, with assaulting a retail worker now a standalone criminal offence[6].

Canadian police are funnelling extra resources[7] into “shoplifting crackdowns”. And in California, retail giants Walmart and Target are pushing for increased penalties[8] for retail crime.

While these problems have been linked to the rising cost of living, industry group Retail NZ has pointed to[9] profit-seeking organised crime as the major factor.

Read more: Facial recognition technology could soon be everywhere – here's how to make it safer[10]

Sensationalised coverage using security footage of brazen thefts and assaults in stores is undoubtedly influencing public perception. But any real trend is difficult to measure, given the lack of consistent, impartial data on shoplifting and offenders.

It is estimated that 15-20% of people in New Zealand are affected by food insecurity, a problem found to be[11] strongly associated with ethnicity and socioeconomic position. The links between cost of living, food insecurity and black market distribution of stolen groceries are likely to be complex and nuanced.

Caution is therefore needed when assessing cause and effect, given the risks of harm and implications for civil society of a shift towards constant surveillance in retail spaces.

AI technologies need ‘humans in the loop’ to avoid bias and error. Getty Images

AI and human bias

Commendably, Foodstuffs has engaged with the Privacy Commissioner, and has been transparent about safeguards[12] in biometric data collection and deletion protocols. What’s missing is more clarity around protocols for the security response in stores.

This is about more than customers consenting to facial recognition cameras. Customers also need to know what happens when a trespass notification is issued, and what the dispute resolution process is should a misidentification occur.

Read more: Avoiding a surveillance society: how better rules can rein in facial recognition tech[13]

Research suggests human decision makers can inherit biases from AI decisions[14]. In situations of heightened stress and risk of violence, combining automated facial recognition with ad-hoc human judgement is potentially dangerous.

Rather than isolating and blaming individual workers or technology components as single points of failure, there needs to be more emphasis on resilience and tolerance for error across the whole system.

AI errors and human errors cannot be avoided entirely. AI security protocols with “humans in the loop” need more careful safeguards that respect customer rights and protect against stereotyping.

Read more: The secret sauce of Coles’ and Woolworths’ profits: high-tech surveillance and control[15]

Shopping and surveillance

Australian supermarkets have responded to retail crime with overt technological surveillance[16]: body cameras issued to staff (also now adopted[17] by Woolworths in New Zealand), digitally tracking customer movement through stores, automated trolley locks and exit gates to prevent people leaving without paying.

Excerpt from a 1979 IBM training manual. MIT-CSAIL[18]

Supermarkets may now be at the forefront of a technological shift in the shopping experience. Moving towards a surveillance culture where every customer is monitored as a potential thief is reminiscent of the ways global airport security changed after 9/11.

New Zealand product designers, software engineers and data scientists will be paying close attention to the outcome of the Privacy Commissioner’s review of the Foodstuffs facial recognition trial[19].

Theft and violence are urgent problems for supermarkets to address. But they now need to show that digital surveillance systems are a more responsible, ethical and effective solution than possible alternative approaches.

This means acknowledging technology requires human-centered design to avoid misuse, bias and harm. In turn, this can help guide regulatory frameworks and standards, inform public debate on the acceptable use of AI, and support development of safer automated systems.

References

  1. ^ misidentified by facial recognition technology (www.nzherald.co.nz)
  2. ^ being discriminated against (www.teaonews.co.nz)
  3. ^ Solomon said (www.1news.co.nz)
  4. ^ response (taiuru.co.nz)
  5. ^ The use of technology in policing should be regulated to protect people from wrongful convictions (theconversation.com)
  6. ^ standalone criminal offence (www.gov.uk)
  7. ^ funnelling extra resources (www.mapleridgenews.com)
  8. ^ pushing for increased penalties (www.bnnbloomberg.ca)
  9. ^ Retail NZ has pointed to (www.stuff.co.nz)
  10. ^ Facial recognition technology could soon be everywhere – here's how to make it safer (theconversation.com)
  11. ^ found to be (www.growingup.co.nz)
  12. ^ transparent about safeguards (www.nzherald.co.nz)
  13. ^ Avoiding a surveillance society: how better rules can rein in facial recognition tech (theconversation.com)
  14. ^ inherit biases from AI decisions (www.nature.com)
  15. ^ The secret sauce of Coles’ and Woolworths’ profits: high-tech surveillance and control (theconversation.com)
  16. ^ overt technological surveillance (theconversation.com)
  17. ^ now adopted (www.rnz.co.nz)
  18. ^ MIT-CSAIL (www.csail.mit.edu)
  19. ^ review of the Foodstuffs facial recognition trial (www.privacy.org.nz)

Read more https://theconversation.com/supermarket-facial-recognition-failure-why-automated-systems-must-put-the-human-factor-first-228284
