Public service agencies are being officially advised to take extra care when using AI tools in their recruitment processes.
The Merit Protection Commission has released new guidelines to stress the importance of upholding the merit principle when recruiting, and has cautioned against relying on artificial intelligence to get it right.
“AI-assisted and automated recruitment tools may enable agencies to increase the efficiency of their recruitment processes whilst mitigating some forms of recruitment bias,” Commissioner Linda Waugh states in the guidelines.
“However, there are also a number of risks associated with AI-assisted and automated recruitment tools which may impact on the ultimate fairness and effectiveness of a recruitment process.
“In particular, incorrect or negligent use of AI-assisted and automated recruitment tools can impede the operation of a merit-based recruitment process. Agencies should exercise care when engaging these tools, in order to uphold the merit principle.
“There should be a clear demonstrated connection between the candidate’s qualities being assessed and the qualities required to perform the duties of the job.”
The new guidelines have been published in response to the rising number of recruitment decisions being overturned when challenged.
This year, 66 Australian Public Service agencies were surveyed about their recruitment processes, with 15 agencies responding that they had used AI-assisted and automated tools in the past 12 months.
The commissioner noted that these recruitment tools are expected to become more prevalent in the future, and so providing guidance on how best to use them has become crucial.
Recruitment processes must continue to meet APS employment principles, particularly when it comes to merit.
AI-assisted and automated tools aim to minimise or remove direct human input from the recruitment process; they include resume scanners and AI-reviewed video interviews and psychometric tests.
Advances in technology and a tightening labour market have led APS agencies to increasingly utilise AI-assisted and automated recruitment tools in their recruitment processes, the guidelines state.
The new guidelines also highlight three “AI-assisted recruitment myths” surrounding the use of the technology.
The first myth is that all AI-assisted tools on the market have been tested.
“There are limited national and international guidelines on the development of AI-assisted and automated recruitment tools, meaning that the quality of AI-assisted assessments can vary significantly,” the guidelines state.
The second myth is that the AI tools are guaranteed to be completely unbiased.
The commission notes that AI can reproduce the biases of its developers.
“For example, developers may only test the AI on certain population demographics, meaning that the tool may disadvantage diversity cohorts. AI-assisted tools can also contain algorithmic bias that does not reflect the true suitability of a candidate for the role.”
The third myth is that agencies are not accountable for the decisions that AI makes.
“Agencies are accountable for ensuring their recruitment processes follow the merit principle as outlined in section 10A of the Public Service Act 1999 (Public Service Act),” the guidelines state.
“Agencies must be able to demonstrate the effectiveness of the AI-assisted tool in assessing candidates in accordance with criteria relevant to the position description.
“Those criteria must reflect the work-related qualities genuinely required to perform the relevant duties of the role to meet the merit principle.”