4 December 2022

Use AI in recruitment with caution, public service told

By Chris Johnson

APS agencies have been cautioned against over-reliance on AI tools when recruiting. Image: File.

Public service agencies are being officially advised to take extra care when using AI tools in their recruitment processes.

The Merit Protection Commission has released new guidelines to stress the importance of upholding the merit principle when recruiting, and has cautioned against relying on artificial intelligence to get it right.

“AI-assisted and automated recruitment tools may enable agencies to increase the efficiency of their recruitment processes whilst mitigating some forms of recruitment bias,” Commissioner Linda Waugh states in the guidelines.

“However, there are also a number of risks associated with AI-assisted and automated recruitment tools which may impact on the ultimate fairness and effectiveness of a recruitment process.

“In particular, incorrect or negligent use of AI-assisted and automated recruitment tools can impede the operation of a merit-based recruitment process. Agencies should exercise care when engaging these tools, in order to uphold the merit principle.

“There should be a clear demonstrated connection between the candidate’s qualities being assessed and the qualities required to perform the duties of the job.”

The new guidelines have been published in response to the rising number of recruitment decisions being overturned when challenged.


This year, 66 Australian Public Service agencies were surveyed about their recruitment processes, with 15 agencies responding that they had used AI-assisted and automated tools in the past 12 months.

The commissioner noted that these recruitment tools are expected to become more prevalent in the future, and so providing guidance on how best to use them has become crucial.

Recruitment processes must continue to meet APS employment principles, particularly when it comes to merit.

AI-assisted and automated tools aim to minimise or remove direct human input from the recruitment process; they include resume scanners, and AI-reviewed video interviews and psychometric tests.
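To make the concept concrete, a naive keyword-based scanner, of the kind one commenter below says some agencies use for shortlisting, might look like the Python sketch here. This is purely illustrative and assumed, not taken from the guidelines, and the criteria phrases are invented.

# Hypothetical, simplified sketch of a keyword-based resume scanner.
# Commercial tools are more sophisticated, but the failure mode is the
# same: matching phrases verbatim is not the same as assessing merit.

CRITERIA_PHRASES = [  # invented selection-criteria wording
    "cultivates productive working relationships",
    "communicates with influence",
    "achieves results",
]

def keyword_score(application_text: str) -> int:
    """Count how many criteria phrases appear verbatim in the text."""
    text = application_text.lower()
    return sum(phrase in text for phrase in CRITERIA_PHRASES)

def shortlist(applications: dict[str, str], cutoff: int = 2) -> list[str]:
    """Shortlist applicants whose keyword score meets the cutoff."""
    return [name for name, text in applications.items()
            if keyword_score(text) >= cutoff]

A scanner like this rewards buzzword stuffing: any applicant who pastes the criteria phrases into their statement clears the cutoff, whatever their actual experience.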

Advances in technology and a tightening labour market have led APS agencies to make increasing use of AI-assisted and automated tools in their recruitment processes, the guidelines state.

The new guidelines also highlight three “AI-assisted recruitment myths” surrounding the use of the technology.

The first myth is that all AI-assisted tools on the market have been tested.

“There are limited national and international guidelines on the development of AI-assisted and automated recruitment tools, meaning that the quality of AI-assisted assessments can vary significantly,” the guidelines state.

The second myth is that the AI tools are guaranteed to be completely unbiased.


The commission notes that AI can reproduce the biases of its developers.

“For example, developers may only test the AI on certain population demographics, meaning that the tool may disadvantage diversity cohorts. AI-assisted tools can also contain algorithmic bias that does not reflect the true suitability of a candidate for the role.”
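By way of illustration (this check is not prescribed by the guidelines), one way an agency could probe a tool's output for this kind of bias is to compare shortlisting rates across demographic groups, as in the Python sketch below. It applies the widely used "four-fifths" rule of thumb: a group whose selection rate falls below 80 per cent of the highest group's rate is flagged as a possible adverse-impact signal.

# Illustrative sketch: flag demographic groups whose shortlisting rate
# falls below 80% of the best-performing group's rate (the common
# "four-fifths" rule of thumb for adverse impact).

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (shortlisted, total applicants)."""
    return {group: shortlisted / total
            for group, (shortlisted, total) in outcomes.items()}

def adverse_impact_groups(outcomes: dict[str, tuple[int, int]],
                          threshold: float = 0.8) -> list[str]:
    """Return groups whose rate is below threshold * the best rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return [group for group, rate in rates.items()
            if rate < threshold * best]

# Example: a tool shortlists 30 of 100 applicants from group A but only
# 10 of 100 from group B; 0.10 < 0.8 * 0.30, so group B is flagged.
print(adverse_impact_groups({"A": (30, 100), "B": (10, 100)}))  # ['B']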

And the third myth is that agencies are not accountable for the decisions that AI makes.

“Agencies are accountable for ensuring their recruitment processes follow the merit principle as outlined in section 10A of the Public Service Act 1999 (Public Service Act),” the guidelines state.

“Agencies must be able to demonstrate the effectiveness of the AI-assisted tool in assessing candidates in accordance with criteria relevant to the position description.

“Those criteria must reflect the work-related qualities genuinely required to perform the relevant duties of the role to meet the merit principle.”

Join the conversation

I was puzzled by the high number of job applications using “selection criteria words”, until an applicant told me that some agencies were using AI to shortlist applications, and they looked for those words (cultivates productive working relationships, communicates with influence etc etc, utter crap of course). So basically if you could pepper your statement with buzzwords, you got an interview. Merit principle, nope.

Another myth is that discrimination happens at the shortlisting stage. In my experience, it doesn’t. It’s about the statement and CV showing suitable skills and experience.
Where bias/discrimination happens is at the interview stage. Two biggies I’ve observed: people of CALD backgrounds whose speech is difficult to understand, to the point that the panel has to discuss what they think the person said. That’s not discrimination, that’s basic communication.
The second, though, is pervasive. Applicant looks great on paper, walks in and, oh no. Grey hair and wrinkles. The panel loses interest, and doesn’t even listen. And they don’t know they’re doing it. Sometimes they literally forget the applicant later, and are surprised to see the name on the selection report.

William Newby, 9:08 pm 05 Dec 22

Other than meeting basic application criteria, AI falls flat. Finding the right human for the job is highly subjective; computers can’t do this work.

Good to see that the myths about AI tools are mentioned. These same myths apply to a wide range of recruitment practices that the public service uses, including some of the psychometric tools applied. They need to be clear on the scientific basis of their methods and not just take it on trust that the salesperson / recruiter is telling the truth. Often they don’t understand the statistical or theoretical basis of their methods, nor do they care, as it’s all about winning the job. KPIs are sales & revenue, not about the quality of the data sold.

I’ve been watching an expensive executive recruitment person (in another state) using this psychometric snake oil in some executive recruitments. It’s utterly useless, but none of the panels ever question its worth. They don’t know how to use the results and misinterpret them… they often think the assessment is a psychologist conducting an evaluation, when in fact it is more like Myers-Briggs: the applicant’s self-reported preferences. The results contribute nothing to the recruitment; they’re just there to pad out the final report and help justify the eye-watering fee.

Unfortunately, many on government panels are unskilled or under-skilled for their role, and the expensive recruitment consultant is a CYA (cover your arse) exercise. They can blame the consultant for getting it wrong and not take any responsibility themselves. The reality is that no due diligence is done on the consultants, their skills, knowledge or qualifications. That should be part of the procurement & contract management.

Sadly, the public service has a very poor record in recruitment because they do not use scientifically sound methods. Not wanting to take responsibility for their decisions, they outsource the job to people and organisations without any scientific training in assessment and selection. Easily conned by slick salespeople, they continue to fall for the spiels of all the charlatans out there, whether recruitment companies, AI or technology companies.

If they bothered to do their due diligence, they would realise that recruiters are salespeople, hired by recruitment firms for their ability to sell the firm’s services, sell their products (candidates) to employers, and sell jobs to candidates. They even allow the recruiter to choose the candidates for them, trusting their judgement despite the lack of training. They seem not to notice the conflict of interest: the recruiter is paid not for the service they provide, but for selling an applicant the employer will accept, insisting they’re the best person for the job.

Tom Worthington, 4:59 pm 05 Dec 22

One way to try to prevent bias is to leave out identifying details of applicants, such as their name, address and gender. This applies to manual processing as much as to AI. Details of where the applicant obtained their qualifications can also lead to bias, so it would be best to just record whether the applicant has the required qualification, leaving out which university (or TAFE) they got it at.
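A minimal Python sketch of that suggestion (the field names are hypothetical, just to make the redaction concrete):

# Hypothetical sketch of blind screening: strip identifying fields
# before anyone (human or AI) scores the application, and record only
# whether the required qualification is held, not where it was obtained.

def blind_application(app: dict, required_qualification: str) -> dict:
    """Return a redacted copy of the application for assessment."""
    identifying = ("name", "address", "gender", "institution", "qualifications")
    redacted = {k: v for k, v in app.items() if k not in identifying}
    redacted["has_required_qualification"] = (
        required_qualification in app.get("qualifications", []))
    return redacted

app = {
    "name": "Jane Citizen",
    "address": "Canberra",
    "gender": "F",
    "institution": "Example University",
    "qualifications": ["Bachelor of Information Technology"],
    "experience_summary": "Five years in service delivery roles.",
}
print(blind_application(app, "Bachelor of Information Technology"))
# {'experience_summary': 'Five years in service delivery roles.',
#  'has_required_qualification': True}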

Blind screening won’t work. There’s little bias happening at the shortlisting stage; the bias happens at the interview stage. Words on a bit of paper are bland and boring, but at the interview, no one hires someone they don’t like.

The bias begins even before the advertising with the construction of the selection criteria, then in the advertising, followed by the reviews of the applications. People jump to conclusions even before interviews, when deciding who to call in and who to ignore.
