
How better rules can rein in facial recognition tech

  • Written by Nicholas Davis, Industry Professor of Emerging Technology and Co-Director, Human Technology Institute, University of Technology Sydney

The human face is special. It is simultaneously public and personal. Our faces reveal sensitive information about us: who we are, of course, but also our gender, emotions, health status and more.

Lawmakers in Australia, like those around the world, never anticipated our face data would be harvested on an industrial scale, then used in everything from our smartphones to police CCTV cameras. So we shouldn’t be surprised that our laws have not kept pace with the extraordinary rise of facial recognition technology.

But what kind of laws do we need? The technology can be used for both good and ill, so neither banning it outright nor the current free-for-all seems ideal.

However, regulatory failure has left our community vulnerable to harmful uses of facial recognition. To fill the legal gap, we propose a “model law[1]”: an outline of legislation that governments around Australia could adopt or adapt to regulate risky uses of facial recognition while allowing safe ones.

The challenge of facial recognition technologies

The use cases for facial recognition technologies seem limited only by our imagination. Many of us think nothing of using facial recognition to unlock our electronic devices. Yet the technology has also been trialled or implemented throughout Australia in a wide range of situations, including schools[2], airports[3], retail stores[4], clubs and gambling venues[5], and law enforcement[6].

As the use of facial recognition grows at an estimated 20%[7] annually, so too does the risk to humans – especially in high-risk contexts like policing.

In the US, reliance on error-prone facial recognition tech has resulted in numerous instances of injustice, especially involving Black people. These include the wrongful arrest and detention of Robert Williams[8], and the wrongful exclusion of a young Black girl[9] from a roller rink in Detroit.

Read more: Facial recognition is on the rise – but the law is lagging a long way behind[10]

Many of the world’s biggest tech companies – including Meta[11], Amazon[12] and Microsoft[13] – have reduced or discontinued their facial recognition-related services. They have cited concerns about consumer safety and a lack of effective regulation.

This is laudable, but it has also prompted a kind of “regulatory-market failure”. While those companies have pulled back, other companies with fewer scruples have taken a bigger share of the facial recognition market.

Take the American company Clearview AI. It scraped billions of face images from social media and other websites without the consent of the affected individuals, then created a face-matching service that it sold to the Australian Federal Police and other law enforcement bodies around the world.

Read more: Australian police are using the Clearview AI facial recognition system with no accountability[14]

In 2021, the Australian Information Commissioner found that both Clearview AI[15] and the AFP[16] had breached Australia’s privacy law, but enforcement actions like this are rare.

However, Australians want better regulation of facial recognition. This has been shown in the Australian Human Rights Commission’s 2021 report[17], the 2022 CHOICE investigation[18] into the use of facial recognition technology by major retailers, and in research we at the Human Technology Institute have commissioned as part of our model law[19].

Options for facial recognition reform

What options does Australia have? The first is to do nothing. But this would mean accepting we will remain unprotected from harmful uses of facial recognition technologies, and would keep us on our current trajectory towards mass surveillance.

Read more: Large-scale facial recognition is incompatible with a free society[20]

Another option would be to ban facial recognition tech altogether. Some jurisdictions have indeed instituted moratoriums on the technology, but they contain many exceptions (for positive uses), and are at best a temporary solution.

In our view, the better reform option is a law to regulate facial recognition technologies according to how risky they are. Such a law would encourage facial recognition with clear public benefit, while protecting against harmful uses of the technology.

A risk-based law for facial recognition technology regulation

Our model law would require anyone developing or deploying facial recognition systems in Australia to conduct a rigorous impact assessment to evaluate the human rights risk.

As the risk level increases, so too would the legal requirements or restrictions. Developers would also be required to comply with a technical standard for facial recognition, aligned with international standards for AI performance and good data management.

The model law contains a general prohibition on high-risk uses of facial recognition applications. For example, a “facial analysis” application that purported to assess individuals’ sexual orientation and then make decisions about them would be prohibited. (Sadly, this is not a far-fetched hypothetical[21].)

The ‘model law’ for facial recognition would assess the risk of various applications and apply controls accordingly. Bernard Hermant / Unsplash[22]

The model law also provides three exceptions to the prohibition on high-risk facial recognition technology:

  1. the regulator could permit a high-risk application if it considers the application to be justified under international human rights law

  2. there would be a specific legal regime for law enforcement agencies, including a “face warrant” scheme that would provide independent oversight as with other such warrants

  3. high-risk applications may be used in academic research, with appropriate oversight.

Review by the regulator and affected individuals

Any law would need to be enforced by a regulator with appropriate powers and resources. Who should this be?

The majority of the stakeholders we consulted – including business users, technology firms and civil society representatives – proposed that the Office of the Australian Information Commissioner (OAIC) would be well suited to be the regulator of facial recognition. For certain sensitive users – such as the military and some security agencies – there may also need to be a specialised oversight regime.

The moment for reform is now

Never have we seen so many groups and individuals from across civil society, industry and government so engaged and aligned on the need for facial recognition technology reform. This is reflected in support for the model law from both the Technology Council of Australia and CHOICE.

Given the extraordinary rise in uses of facial recognition, and an emerging consensus among stakeholders, the federal attorney-general should seize this moment and lead national reform. The first priority is to introduce a federal bill, which could be based on our model law. The attorney-general should also collaborate with the states and territories to harmonise Australian law on facial recognition.

This proposed reform is important on its own terms: we cannot allow facial recognition technologies to remain effectively unregulated. It would also demonstrate how Australia can use law to protect against harmful uses of new technology, while simultaneously incentivising innovation for public benefit.

More information about the model law can be found in our report Facial recognition technology: Towards a model law[23].

References

  1. ^ model law (www.uts.edu.au)
  2. ^ schools (www.theage.com.au)
  3. ^ airports (www.abf.gov.au)
  4. ^ retail stores (www.abc.net.au)
  5. ^ gambling venues (www.cbs.sa.gov.au)
  6. ^ law enforcement (www.police.nsw.gov.au)
  7. ^ estimated 20% (www.mordorintelligence.com)
  8. ^ arrest and detention of Robert Williams (www.nytimes.com)
  9. ^ exclusion of a young Black girl (www.cnet.com)
  10. ^ Facial recognition is on the rise – but the law is lagging a long way behind (theconversation.com)
  11. ^ Meta (about.fb.com)
  12. ^ Amazon (www.reuters.com)
  13. ^ Microsoft (azure.microsoft.com)
  14. ^ Australian police are using the Clearview AI facial recognition system with no accountability (theconversation.com)
  15. ^ Clearview AI (www.oaic.gov.au)
  16. ^ the AFP (www.oaic.gov.au)
  17. ^ Australian Human Rights Commission’s 2021 report (tech.humanrights.gov.au)
  18. ^ 2022 CHOICE investigation (www.choice.com.au)
  19. ^ model law (www.uts.edu.au)
  20. ^ Large-scale facial recognition is incompatible with a free society (theconversation.com)
  21. ^ far-fetched hypothetical (www.washingtonpost.com)
  22. ^ Bernard Hermant / Unsplash (unsplash.com)
  23. ^ Facial recognition technology: Towards a model law (www.uts.edu.au)

Read more https://theconversation.com/avoiding-a-surveillance-society-how-better-rules-can-rein-in-facial-recognition-tech-191075
