The Times Australia
AI is creating fake legal cases and making its way into real courtrooms, with disastrous results

  • Written by Michael Legg, Professor of Law, UNSW Sydney & Vicki McNamara, Senior Research Associate, Centre for the Future of the Legal Profession
We’ve seen deepfake explicit images of celebrities[1] created by artificial intelligence (AI). AI has also played a hand in creating music[2], powering driverless race cars[3] and spreading misinformation[4], among other things.

It’s hardly surprising, then, that AI also has a strong impact on our legal systems.

It’s well known that courts must decide disputes based on the law, which is presented by lawyers to the court as part of a client’s case. It’s therefore highly concerning that fake law, invented by AI, is being used in legal disputes.

Not only does this pose issues of legality and ethics, it also threatens to undermine faith and trust in global legal systems.

Read more: Lawyers are rapidly embracing AI: here's how to avoid an ethical disaster[5]

How do fake laws come about?

There is little doubt that generative AI is a powerful tool with transformative potential for society, including many aspects of the legal system. But its use comes with responsibilities and risks.

Lawyers are trained to carefully apply professional knowledge and experience, and are generally not big risk-takers. However, some unwary lawyers (and self-represented[6] litigants) have been caught out by artificial intelligence.

Generative AI tools, like ChatGPT, can provide incorrect information. Shutterstock[7]

AI models are trained on massive data sets. When prompted by a user, they can create new content (both text and audiovisual).

Although content generated this way can look very convincing, it can also be inaccurate. This is the result of the AI model attempting to “fill in the gaps” when its training data is inadequate or flawed, and is commonly referred to as “hallucination[8]”.

In some contexts, generative AI hallucination is not a problem. Indeed, it can be seen as an example of creativity.

But if AI hallucinated or created inaccurate content that is then used in legal processes, that’s a problem – particularly when combined with time pressures on lawyers and a lack of access to legal services for many.

This potent combination can result in carelessness and shortcuts in legal research and document preparation, potentially creating reputational issues for the legal profession and a lack of public trust in the administration of justice.

It’s happening already

The best-known generative AI “fake case” is the 2023 US case Mata v Avianca[9], in which lawyers submitted a brief containing fake extracts and case citations to a New York court. The brief had been researched using ChatGPT.

The lawyers, unaware that ChatGPT can hallucinate, failed to check that the cases actually existed. The consequences were disastrous. Once the error was uncovered, the court dismissed their client’s case, sanctioned the lawyers for acting in bad faith, fined them and their firm, and exposed their actions to public scrutiny.

Read more: AI is everywhere – including countless applications you've likely never heard of[10]

Despite adverse publicity, other fake case examples continue to surface. Michael Cohen, Donald Trump’s former lawyer, gave his own lawyer cases generated by Google Bard, another generative AI chatbot. He believed they were real (they were not) and that his lawyer would fact check them (he did not). His lawyer included the cases[11] in a brief filed with the US Federal Court.

Fake cases have also surfaced in recent matters in Canada[12] and the United Kingdom[13].

If this trend goes unchecked, how can we ensure that the careless use of generative AI does not undermine the public’s trust in the legal system? Consistent failures by lawyers to exercise due care when using these tools have the potential to mislead and congest the courts, harm clients’ interests, and generally undermine the rule of law.

Michael Cohen’s lawyer was caught up in a court case involving fake AI case law. Sarah Yenesel/EPA

What’s being done about it?

Around the world, legal regulators and courts have responded in various ways.

Several US state bars and courts have issued guidance, opinions or orders on generative AI use, ranging from responsible adoption to an outright ban.

Law societies in the UK and British Columbia, and the courts of New Zealand, have also developed guidelines.

In Australia, the NSW Bar Association has a generative AI guide[14] for barristers. The Law Society of NSW[15] and the Law Institute of Victoria[16] have released articles on responsible use in line with solicitors’ conduct rules.

Many lawyers and judges, like the public, will have some understanding of generative AI and can recognise both its limits and benefits. But there are others who may not be as aware. Guidance undoubtedly helps.

But a mandatory approach is needed. Lawyers who use generative AI tools cannot treat them as a substitute for exercising their own judgement and diligence, and must check the accuracy and reliability of the information they receive.
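The basic verification step this demands can be illustrated in code. Below is a minimal sketch, in Python, of flagging citations in a draft brief that cannot be matched against a trusted list. The regular expression, the hard-coded set and both case names are hypothetical stand-ins: a real check would query an authoritative source, such as official court records or a recognised legal research service.

```python
import re

# Illustrative stand-in for an authoritative citation database. The entry is
# fictional; in practice this would be a lookup against a primary source,
# not a hard-coded set.
VERIFIED_CITATIONS = {
    "Smith v. Jones, 123 F.3d 456 (9th Cir. 1997)",
}

# Rough pattern for "Party v. Party, <reporter citation>" style case citations.
CITATION_PATTERN = re.compile(r"[A-Z][\w.'&\- ]* v\.? [\w.'&\- ]+, [^;\n]+")

def flag_unverified_citations(brief_text: str) -> list[str]:
    """Return citations found in the brief that are absent from the trusted set."""
    matches = CITATION_PATTERN.findall(brief_text)
    return [m.strip() for m in matches if m.strip() not in VERIFIED_CITATIONS]
```

Even a naive check of this kind against an authoritative database would have revealed that the cases cited in Mata v Avianca did not exist. A real workflow would go further and confirm that any quoted passages actually appear in the retrieved judgments, since chatbots can also attach invented quotes to genuine citations.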

Read more: Do you trust AI to write the news? It already is – and not without issues[17]

In Australia, courts should adopt practice notes or rules that set out expectations when generative AI is used in litigation. Court rules can also guide self-represented litigants, and would communicate to the public that our courts are aware of the problem and are addressing it.

The legal profession could also adopt formal guidance to promote the responsible use of AI by lawyers. At the very least, technology competence should become a requirement of lawyers’ continuing legal education in Australia.

Setting clear requirements for the responsible and ethical use of generative AI by lawyers in Australia will encourage appropriate adoption and shore up public confidence in our lawyers, our courts, and the overall administration of justice in this country.

References

  1. ^ celebrities (www.nytimes.com)
  2. ^ creating music (theconversation.com)
  3. ^ driverless race cars (theconversation.com)
  4. ^ misinformation (theconversation.com)
  5. ^ Lawyers are rapidly embracing AI: here's how to avoid an ethical disaster (theconversation.com)
  6. ^ self-represented (reason.com)
  7. ^ Shutterstock (www.shutterstock.com)
  8. ^ hallucination (www.csiro.au)
  9. ^ Mata v Avianca (law.justia.com)
  10. ^ AI is everywhere – including countless applications you've likely never heard of (theconversation.com)
  11. ^ included the cases (www.reuters.com)
  12. ^ Canada (www.cbc.ca)
  13. ^ the United Kingdom (www.legalfutures.co.uk)
  14. ^ generative AI guide (inbrief.nswbar.asn.au)
  15. ^ Law Society of NSW (lsj.com.au)
  16. ^ Law Institute of Victoria (www.liv.asn.au)
  17. ^ Do you trust AI to write the news? It already is – and not without issues (theconversation.com)

Read more https://theconversation.com/ai-is-creating-fake-legal-cases-and-making-its-way-into-real-courtrooms-with-disastrous-results-225080
