What will a robot make of your résumé? The bias problem with using AI in job recruitment

  • Written by Melika Soleimani, Senior Data Analyst, Massey University

The artificial intelligence (AI) revolution has begun[1], spreading to almost every facet of people’s professional and personal lives – including job recruitment.

While artists fear copyright breaches[2] or simply being replaced, businesses and managers are becoming increasingly alert to the possibilities of greater efficiencies in areas as diverse as supply chain management, customer service, product development and human resources (HR) management.

Soon all business areas and operations will be under pressure to adopt AI in some form or another. But the very nature of AI – and the data behind its processes and outputs – means human biases are being embedded in the technology.

Our research[3] looked at the use of AI in recruitment and hiring – a field that has already widely adopted AI to automate the screening of résumés and to rate video interviews by job applicants.

AI in recruitment promises greater objectivity and efficiency[4] during the hiring process by eliminating human biases and enhancing fairness and consistency in decision making.

But our research shows AI can subtly – and at times overtly – heighten biases. And the involvement of HR professionals may worsen rather than alleviate these effects. This challenges our belief that human oversight can contain and moderate AI.

Magnifying human bias

Although one of the reasons for using AI in recruitment is that it is meant to be more objective and consistent, multiple studies[5] have found the technology is, in fact, very likely to be biased[6]. This happens because AI learns from the datasets used to train it. If the data is flawed[7], the AI will be too.
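
To see how this can happen in practice, here is a minimal, hypothetical sketch in Python (not drawn from the research described here): a classifier is trained on synthetic "historical hiring" labels that favoured one group, and it then assigns different hiring probabilities to two equally skilled candidates. The data, features and model choice are all illustrative assumptions.

```python
# Illustrative sketch with synthetic data: a model trained on biased
# historical hiring decisions reproduces that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicants: one genuine skill score and a group label (0 or 1).
skill = rng.normal(0, 1, n)
group = rng.integers(0, 2, n)

# Biased historical decisions: equally skilled candidates from group 0
# were hired more often than those from group 1.
hired = (skill + 1.0 * (group == 0) + rng.normal(0, 0.5, n)) > 0.5

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Two equally skilled candidates, differing only in group membership.
print("P(hire | group 0):", model.predict_proba([[0.5, 0]])[0, 1])
print("P(hire | group 1):", model.predict_proba([[0.5, 1]])[0, 1])
# The learned probabilities differ, mirroring the bias in the training labels.
```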

Biases in data can be made worse by the human-created algorithms supporting AI, which often contain human biases in their design[8].

In interviews with 22 HR professionals, we identified two common biases in hiring: “stereotype bias” and “similar-to-me bias”.

Stereotype bias occurs when decisions are influenced by stereotypes about certain groups, such as preferring candidates of the same gender, leading to gender inequality.

“Similar-to-me” bias happens when recruiters favour candidates who share similar backgrounds or interests to them.

These biases, which can significantly affect the fairness of the hiring process, are embedded in the historical hiring data which are then used to train the AI systems. This leads to biased AI.

So, if past hiring practices favoured certain demographics, the AI will continue to do so. Mitigating these biases is challenging because algorithms can infer hidden personal information from other, correlated data.

For example, in countries with different lengths of military service for men and women, an AI might deduce gender based on service duration.
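
A minimal sketch of that kind of proxy inference, again using synthetic data and a hypothetical service-duration feature rather than anything from a real system: even when gender is never given to the model, one correlated feature is enough to recover it.

```python
# Illustrative sketch with synthetic data: gender is withheld from the model,
# but a correlated feature (a hypothetical service-duration field) reveals it.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 5000

gender = rng.integers(0, 2, n)  # 0 or 1, never shown to the model as a feature
# Suppose one group typically serves around 24 months and the other around 12.
service_months = np.where(gender == 0, 24, 12) + rng.normal(0, 2, n)

X = service_months.reshape(-1, 1)  # the only feature the model sees
clf = LogisticRegression().fit(X, gender)

print("Gender recovered from service duration alone:",
      accuracy_score(gender, clf.predict(X)))  # close to 1.0
```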

This persistence of bias underscores the need for careful planning and monitoring to ensure fairness in both human and AI-driven recruitment processes.

Can humans help?

As well as HR professionals, we also interviewed 17 AI developers. We wanted to investigate how an AI recruitment system could be developed that would mitigate rather than exacerbate hiring bias.

Based on the interviews, we developed a model wherein HR professionals and AI programmers would go back and forth in exchanging information and questioning preconceptions as they examined data sets and developed algorithms.

However, our findings reveal that the difficulty in implementing such a model lies in the educational, professional and demographic differences between HR professionals and AI developers.

These differences impede effective communication, cooperation and even the ability to understand each other. While HR professionals are traditionally trained in people management and organisational behaviour, AI developers are skilled in data science and technology.

These different backgrounds can lead to misunderstandings and misalignment when working together. This is particularly a problem in smaller countries such as New Zealand, where resources are limited and professional networks are less diverse.

Does HR know what AI programmers are doing, and vice versa? Getty Images

Connecting HR and AI

If companies and the HR profession want to address the issue of bias in AI-based recruitment, several changes need to be made.

Firstly, the implementation of a structured training programme for HR professionals focused on information system development and AI is crucial. This training should cover the fundamentals of AI, the identification of biases in AI systems, and strategies for mitigating these biases.

Additionally, fostering better collaboration between HR professionals and AI developers is important. Companies should look to create teams that include both HR and AI specialists. Such teams can help bridge the communication gap and better align their efforts.

Moreover, developing culturally relevant datasets is vital for reducing biases in AI systems. HR professionals and AI developers need to work together to ensure the data used in AI-driven recruitment processes are diverse and representative of different demographic groups. This will help create more equitable hiring practices.
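
As one hedged illustration of what such a joint data review might involve, the short Python sketch below audits a toy hiring dataset, with hypothetical column names, for group representation and historical selection rates. Any real audit would be far more extensive.

```python
# Illustrative sketch: a simple audit of a toy hiring dataset's group
# representation and historical selection rates (hypothetical columns).
import pandas as pd

df = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B", "B"],
    "hired": [1,   1,   0,   1,   0,   0,   0,   0],
})

# How well is each group represented in the training data?
print(df["group"].value_counts(normalize=True))

# How did historical selection rates differ by group?
selection_rates = df.groupby("group")["hired"].mean()
print(selection_rates)

# Ratio of lowest to highest selection rate; a large gap suggests the
# historical data would teach an AI system the same disparity.
print("Selection-rate ratio:", selection_rates.min() / selection_rates.max())
```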

Lastly, countries need guidelines and ethical standards for the use of AI in recruitment that can help build trust and ensure fairness. Organisations should implement policies that promote transparency and accountability in AI-driven decision-making processes.

By taking these steps, we can create a more inclusive and fair recruitment system that leverages the strengths of both HR professionals and AI developers.

References

  1. ^ revolution has begun (www.cbsnews.com)
  2. ^ copyright breaches (www.researchgate.net)
  3. ^ research (www.igi-global.com)
  4. ^ greater objectivity and efficiency (www.researchgate.net)
  5. ^ multiple studies (academic.oup.com)
  6. ^ very likely to be biased (www.mdpi.com)
  7. ^ data is flawed (www.mdpi.com)
  8. ^ often contain human biases in their design (www.vox.com)

Read more https://theconversation.com/what-will-a-robot-make-of-your-resume-the-bias-problem-with-using-ai-in-job-recruitment-231174
