Algorithms can decide your marks, your work prospects and your financial security. How do you know they're fair?

  • Written by Kalervo Gulson, Professor and ARC Future Fellow, Education & Social Work, Education Futures Studio, University of Sydney

Algorithms are becoming commonplace. They can determine employment prospects, financial security[1] and more. The use of algorithms can be controversial – for example, robodebt[2], as the Australian government’s flawed online welfare compliance system came to be known.

Algorithms are increasingly being used to make decisions that have a lasting impact on our current and future lives.

Some of the greatest impacts of algorithmic decision-making are in education. If you have anything to do with an Australian school or a university, at some stage an algorithm will make a decision that matters for you.

So what sort of decisions might involve algorithms? Some decisions will involve the next question for school students to answer on a test, such as the online provision of NAPLAN[3]. Some algorithms support human decision-making in universities[4], such as identifying students at risk of failing a subject. Others take the human out of the loop, like some forms of online exam supervision[5].

Read more: Unis are using artificial intelligence to keep students sitting exams honest. But this creates its own problems[6]

How do algorithms work?

Despite their pervasive impacts on our lives, it is often difficult to understand how algorithms work, why they have been designed, and why they are used. As algorithms become a key part of decision-making in education – and many other aspects of our lives – people need to know two things:

  1. how algorithms work

  2. the kinds of trade-offs that are made in decision-making using algorithms.

To explore these two issues, we developed an algorithm game[7] using participatory methodologies to involve diverse stakeholders. The process becomes a form of collective experimentation that encourages new perspectives and insights into an issue.

Our algorithm game is based on the UK exam controversy[8] in 2020. During COVID-19 lockdowns, an algorithm was used to determine grades[9] for students wishing to attend university. The algorithm predicted grades for some students that were far lower than expected. In the face of protests, the algorithm was eventually scrapped.
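
To give a sense of how such a system behaves, here is a minimal Python sketch of a rank-based standardisation approach, offered in the spirit of the 2020 model rather than as a reproduction of it. The function, grade bands and student data are all hypothetical.

# Hypothetical sketch of rank-based grade standardisation, loosely
# modelled on public descriptions of the 2020 UK approach.
# All names, grade bands and numbers are invented for illustration.

def standardise_grades(ranked_students, historical_distribution):
    """Map each student's rank within their school onto the school's
    historical grade distribution."""
    n = len(ranked_students)
    grades = {}
    position = 0
    for grade, proportion in historical_distribution.items():
        # Number of this year's students allotted to this grade band.
        count = round(proportion * n)
        for student in ranked_students[position:position + count]:
            grades[student] = grade
        position += count
    # Any students left over after rounding fall into the lowest band.
    lowest = list(historical_distribution)[-1]
    for student in ranked_students[position:]:
        grades[student] = lowest
    return grades

# A school whose past cohorts mostly earned Cs sees this year's students
# pushed towards Cs, whatever their teachers predicted.
print(standardise_grades(
    ["Asha", "Ben", "Chen", "Dana", "Eli"],
    {"A": 0.2, "B": 0.2, "C": 0.6},
))
# {'Asha': 'A', 'Ben': 'B', 'Chen': 'C', 'Dana': 'C', 'Eli': 'C'}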

Read more: Scotland's exam result crisis: assessment and social justice in a time of COVID-19[10]

Our interdisciplinary team[11] co-designed the UK exam algorithm game over two workshops and multiple meetings this year. Our workshops included students, data scientists, ethicists and social scientists. Such interdisciplinary perspectives are vital for understanding the range of social, ethical and technical implications of algorithms in education.

Algorithms make trade-offs, so transparency is needed

The UK example highlights key issues with using algorithms in society, including issues of transparency and bias in data. These issues matter everywhere, including Australia[12].

We designed the algorithm game to help people develop the tools to have more of a say in shaping the world algorithms are creating. Algorithm “games” invite people to play with and learn about the parameters of how an algorithm operates. Examples include games that show people how algorithms are used in criminal sentencing[13] or to help predict fire risk in buildings[14].

For the algorithm game, the authors used the analogy of making a cake – a product of choosing ingredients and mixing and baking them in a certain way – to explain how the inputs and the parameters of each step in the process affect the outcome. Diagram: ingredients, recipe and cake matched to input, procedure and output. Author provided (no reuse)
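
In programming terms, the analogy maps onto a function: the ingredients are inputs, the oven settings are parameters, and the cake is the output. The toy Python sketch below is ours and entirely hypothetical; it only shows that changing either the inputs or the parameters changes what comes out.

# Toy illustration of the cake analogy: the same procedure with
# different inputs or parameters gives a different outcome.
# Entirely hypothetical.

def bake(ingredients, oven_temp_c=180, minutes=40):
    """A 'procedure' that turns inputs plus parameters into an output."""
    if "baking powder" not in ingredients:
        return "flat cake"
    if oven_temp_c > 220 or minutes > 60:
        return "burnt cake"
    return "sponge cake"

print(bake(["flour", "eggs", "baking powder"]))           # sponge cake
print(bake(["flour", "eggs"]))                            # flat cake
print(bake(["flour", "eggs", "baking powder"], 240))      # burnt cake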

There is growing public awareness that algorithms, especially those used in forms of artificial intelligence, raise issues of fairness[15]. But while everyone may have an everyday understanding of what is fair or unfair, using algorithms involves numerous trade-offs.

Read more: From robodebt to racism: what can go wrong when governments let algorithms make the decisions[16]

In our algorithm game, we take people through a series of problems in which the solution to one fairness problem simply introduces a new one. For example, the UK algorithm did not predict grades well for students in schools where only small numbers of students took certain subjects. This was unfair for these students.

The solution meant the algorithm was not used for these often very privileged schools[17]. These students instead received the grades predicted by their teachers. But these grades were mostly higher than the algorithm-generated grades received by students in larger schools, which were more often government comprehensive schools. So the decision was fair for students in small schools but unfair for those in larger schools whose grades were allocated by the algorithm.
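
As a rough sketch of that trade-off, the exception can be written as a single branch: cohorts below a size threshold keep their (often more generous) teacher-predicted grades, while larger cohorts receive the standardised grade. The threshold, function and grades below are hypothetical, not the rules actually used in 2020.

# Hypothetical sketch of the small-cohort exception and the trade-off
# it creates. The threshold and grades are invented for illustration.

SMALL_COHORT_THRESHOLD = 5  # hypothetical cut-off

def final_grade(cohort_size, teacher_grade, algorithm_grade):
    """Small cohorts keep the teacher's prediction; larger cohorts
    receive the standardised (algorithm) grade."""
    if cohort_size < SMALL_COHORT_THRESHOLD:
        return teacher_grade
    return algorithm_grade

# A student in a small class keeps an optimistic "A"; an otherwise
# identical student in a large comprehensive class is pulled down to
# the algorithm's "B". Fixing one unfairness has created another.
print(final_grade(cohort_size=3, teacher_grade="A", algorithm_grade="B"))   # A
print(final_grade(cohort_size=30, teacher_grade="A", algorithm_grade="B"))  # B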

What we try to show in our game is that it is not possible to have a perfect outcome, and that neither humans nor algorithms will make a set of choices that are fair for everyone. This means we have to make decisions about which values matter when we use algorithms.
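
To make that concrete, the small and entirely hypothetical Python example below applies two reasonable fairness measures, equal acceptance rates across groups and equal accuracy within each group, to the same set of decisions, and shows that they can pull in different directions.

# Hypothetical illustration: two plausible fairness measures applied to
# the same decisions can disagree. All numbers are invented.

def rates(decisions):
    """Return (acceptance rate, accuracy) for a list of
    (accepted, truly_qualified) pairs."""
    accepted = sum(1 for a, _ in decisions if a)
    correct = sum(1 for a, q in decisions if a == q)
    return accepted / len(decisions), correct / len(decisions)

group_x = [(True, True), (True, False), (False, False), (False, False)]
group_y = [(True, True), (False, True), (False, False), (False, False)]

print(rates(group_x))  # (0.5, 0.75): half accepted, 3 of 4 decisions correct
print(rates(group_y))  # (0.25, 0.75): same accuracy, lower acceptance rate
# Equalising the acceptance rates here would mean making more wrong
# decisions for one group, so the two notions of fairness conflict.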

Public must have a say to balance the power of EdTech

While our algorithm game focuses on the use of an algorithm developed by a government, algorithms in education are commonly introduced as part of educational technology. The EdTech industry is expanding rapidly in Australia[18]. Companies are seeking to dominate all stages of education: enrolment, learning design, learning experience and lifelong learning.

Alongside these developments, COVID-19 has accelerated the use of algorithmic decision-making in education and beyond.

Read more: Artificial intelligence holds great potential for both students and teachers – but only if used wisely[19]

While these innovations open up amazing possibilities, algorithms also bring with them a set of challenges we must face as a society. Examples like the UK exam algorithm expose us to how such algorithms work and the kinds of decisions that have to be made when designing them. We are then forced to answer deep questions of which values we will choose to prioritise and what roadmap for research[20] we take forward.

Our choices will shape our future and the future of generations to come.

The following people were also involved in the research underpinning the algorithm game. From the Gradient Institute[21] for responsible AI: Simon O'Callaghan, Alistair Reid and Tiberio Caetano. From the Tech for Social Good[22] group: Vincent Zhang.

References

  1. ^ financial security (www.afr.com)
  2. ^ robodebt (www.innovationaus.com)
  3. ^ online provision of NAPLAN (nap.edu.au)
  4. ^ human decision-making in universities (theconversation.com)
  5. ^ online exam supervision (theconversation.com)
  6. ^ Unis are using artificial intelligence to keep students sitting exams honest. But this creates its own problems (theconversation.com)
  7. ^ an algorithm game (www.edufuturesstudio.com)
  8. ^ UK exam controversy (www.theverge.com)
  9. ^ algorithm was used to determine grades (blogs.lse.ac.uk)
  10. ^ Scotland's exam result crisis: assessment and social justice in a time of COVID-19 (theconversation.com)
  11. ^ Our interdisciplinary team (education-futures-studio.sydney.edu.au)
  12. ^ Australia (www.sbs.com.au)
  13. ^ criminal sentencing (www.technologyreview.com)
  14. ^ predict fire risk in buildings (automating.nyc)
  15. ^ issues of fairness (www.nature.com)
  16. ^ From robodebt to racism: what can go wrong when governments let algorithms make the decisions (theconversation.com)
  17. ^ very privileged schools (ffteducationdatalab.org.uk)
  18. ^ expanding rapidly in Australia (www.pwc.com.au)
  19. ^ Artificial intelligence holds great potential for both students and teachers – but only if used wisely (theconversation.com)
  20. ^ roadmap for research (www.nuffieldfoundation.org)
  21. ^ Gradient Institute (gradientinstitute.org)
  22. ^ Tech for Social Good (www.techforsocialgood.org)

Read more https://theconversation.com/algorithms-can-decide-your-marks-your-work-prospects-and-your-financial-security-how-do-you-know-theyre-fair-171590
