Algorithms that predict crime are watching – and judging us by the cards we’ve been dealt

  • Written by Tatiana Dancy, Associate Professor, The University of Melbourne

Your money, postcode, friends and family can make all the difference to how the criminal justice system treats you.

The New South Wales police recently scrapped a widely condemned program known as the Suspect Targeting Management Plan[1]. It used algorithmic risk scores to single out “targets”, some as young as ten years old, for police surveillance.

But similar programs remain in place. For instance, Corrective Services NSW uses a statistical assessment tool called LSI-R[2] to predict whether prisoners will reoffend.

“High risk” prisoners receive “high intensity interventions”, and may be denied parole. The risk scores are calculated[3] from facts such as “criminal friends”, family involvement in crime or drugs, financial problems, living in a “high crime neighbourhood” and frequent changes of address.

A predictive algorithm is a set of rules for computers (and sometimes people) to follow, based on patterns in data. Lots has been written about how algorithms discriminate against us[4], from biased search engines to health databases.
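To make the idea concrete, here is a minimal, purely illustrative Python sketch of a points-based score built from circumstance-style factors like those listed above. The factor names, weights and cut-offs are invented for this example; they are not the actual LSI-R or OASys items, which are not reproduced here.

# Illustrative only: invented factors, weights and thresholds,
# not the real LSI-R or OASys scoring.
ILLUSTRATIVE_WEIGHTS = {
    "criminal_friends": 2,
    "family_involved_in_crime_or_drugs": 2,
    "financial_problems": 1,
    "high_crime_neighbourhood": 1,
    "frequent_address_changes": 1,
}

def risk_band(person: dict) -> str:
    """Sum the weights of the flagged factors and band the total."""
    total = sum(w for factor, w in ILLUSTRATIVE_WEIGHTS.items() if person.get(factor))
    return "high" if total >= 4 else "medium" if total >= 2 else "low"

# A person flagged only for their circumstances still lands in the "high" band.
print(risk_band({"criminal_friends": True, "financial_problems": True,
                 "high_crime_neighbourhood": True}))  # high

Note that every point in this toy score comes from the person's circumstances rather than from anything they have chosen to do.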

In my newly published book, Artificial Justice[5], I argue that the use of tools that predict our behaviour based on factors like poverty or family background should worry us, too. If we are punished at all, it should be only for what we have done wrong, not for the cards we have been dealt.

Read more: Biased AI can be bad for your health – here's how to promote algorithmic fairness[6]

Algorithms are watching us

Algorithms generate risk scores used in criminal justice systems all over the world. In the United Kingdom, the OASys[7] (Offender Assessment System) is used as part of the pre-sentence information given to judges – it shapes bail, parole and sentencing decisions. In the United States, a tool known as COMPAS[8] does something similar.

Risk scores are used beyond criminal justice, too, and they don’t always need computers to generate them. A short survey known as the Opioid Risk Tool[9] helps doctors in Australia and across the world decide whether to prescribe pain relief for acute and chronic illness, by predicting whether patients will misuse their medications.

Predictive algorithms literally save lives: they are used to allocate donor organs, triage patients and make urgent medical treatment decisions[10]. But they can also create and sustain unjustified inequalities.

Imagine that we develop an algorithm – “CrimeBuster” – to help police patrol crime “hot spots”. We use data that links crime to areas populated by lower income families. Since we cannot measure “crime” directly, we instead look at rates of arrest.

Yet the fact that arrest rates are high in these areas may just tell us that police spend more time patrolling them. If there is no justification for this practice of intensive policing, rolling out CrimeBuster would give these prejudices the status of policy.
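A tiny simulation makes this feedback loop concrete. The numbers below are invented: two neighbourhoods with identical offending, where one simply starts with more recorded arrests because it has been patrolled more heavily.

# Illustrative simulation of the CrimeBuster loop described above; all figures are invented.
arrests = {"A": 60, "B": 30}   # historical arrest counts: A was patrolled more heavily
TRUE_CRIME_RATE = 0.1          # identical underlying offending in both neighbourhoods
TOTAL_PATROLS = 100

for year in range(5):
    total_arrests = sum(arrests.values())
    # CrimeBuster: allocate patrols in proportion to past arrests.
    patrols = {n: TOTAL_PATROLS * arrests[n] / total_arrests for n in arrests}
    # Arrests track patrol presence, not any difference in offending.
    arrests = {n: patrols[n] * TRUE_CRIME_RATE * 10 for n in arrests}
    print(year, {n: round(p) for n, p in patrols.items()})

# Every year prints roughly {'A': 67, 'B': 33}: the initial imbalance is locked in
# as policy, even though offending is identical in both neighbourhoods.

Because the only signal is arrests, and arrests follow patrols, the algorithm can never discover that the two neighbourhoods are actually alike.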

More police patrols can lead to more arrests, but using that as a proxy for predicting ‘crime’ is a flawed tactic. ChameleonsEye/Shutterstock[11]

Read more: The evidence is in: you can't link imprisonment to crime rates[12]

Algorithms are judging us

The trouble deepens when we use statistics to make predictions about intentional action – the things that we choose to do.

This might be a prediction about whether someone will be a “toxic[13]” employee, commit crimes or abuse drugs.

The factors that influence these predictions are rarely publicised. For the British sentencing algorithm OASys, they include whether someone has been the victim of domestic violence[14].

The American COMPAS system captures parental divorce and childhood abuse[15]. The Opioid Risk Tool asks whether the patient’s family has a history of substance abuse, and whether the patient (if female) has a history of “preadolescent sexual abuse[16]”.

In each case, these facts make it more likely that someone will go to prison, miss out on medical treatment, and so on.

We all want to have the chance to make choices true to who we are, and meet our needs and goals. And we want to be afforded the same choices as other people, rather than be singled out as incapable of choosing well.

When we punish someone because of facts they can’t easily influence, we do just this: we treat that person as if they simply cannot help but make bad choices.

We can’t lock people up just in case

The problem isn’t the use of algorithms per se. In the 19th century, Italian physician Cesare Lombroso[17] argued we could identify “the born criminal” from physical characteristics – a misshapen skull, wide jaw, long limbs or big ears.

Not long after, British criminologist Charles Goring[18] ran with this idea and argued that certain “defective” mental characteristics made “the fate of imprisonment” inevitable.

Algorithms simply make it much harder to see what’s going on in the world of crime risk assessment.

But when we look, it turns out what’s going on is something pretty similar to the Lombroso-Goring vision: we treat people as if they are fated to do wrong, and lock them up (or keep them locked up) just in case.

Public bodies should be required to publish the facts that inform the predictions behind such decisions. Machine learning should only be used if and to the extent that these publication requirements can be met. This makes it easier to have meaningful conversations about where to draw the line.

In the context of criminal justice, that line is clear. We should only deal out harsher penalties for bad behaviour, not other physical, mental or social characteristics. There are plenty of guidelines[19] that take this approach, and this is the line that Australian institutions should toe.

Once penalties for their crime have been applied, prisoners should not be treated differently or locked up for longer because of their friends and family, their financial status or the way in which they’ve been treated at the hands of others.

References

  1. ^ Suspect Targeting Management Plan (piac.asn.au)
  2. ^ statistical assessment tool called LSI-R (criminaljustice.tooltrack.org)
  3. ^ risk scores are calculated (correctiveservices.dcj.nsw.gov.au)
  4. ^ discriminate against us (nyupress.org)
  5. ^ Artificial Justice (academic.oup.com)
  6. ^ Biased AI can be bad for your health – here's how to promote algorithmic fairness (theconversation.com)
  7. ^ OASys (theconversation.com)
  8. ^ tool known as COMPAS (www.propublica.org)
  9. ^ known as the Opioid Risk Tool (nida.nih.gov)
  10. ^ urgent medical treatment decisions (www.theverge.com)
  11. ^ ChameleonsEye/Shutterstock (www.shutterstock.com)
  12. ^ The evidence is in: you can't link imprisonment to crime rates (theconversation.com)
  13. ^ toxic (fama.io)
  14. ^ domestic violence (assets.publishing.service.gov.uk)
  15. ^ childhood abuse (www.michigan.gov)
  16. ^ preadolescent sexual abuse (www.health.vic.gov.au)
  17. ^ Cesare Lombroso (en.wikipedia.org)
  18. ^ Charles Goring (archive.org)
  19. ^ plenty of guidelines (advancingpretrial.org)

Read more https://theconversation.com/algorithms-that-predict-crime-are-watching-and-judging-us-by-the-cards-weve-been-dealt-225798
