What recruiters can expect from AI regulation

The overseas experience of AI regulation has many implications for Australian recruiters, whether they're embracing AI tools, planning to, or unaware they're already using them, an expert says.

Mitigating the risk of systemic bias in hiring processes is not the responsibility of developers alone, says organisational psychologist Dr Matthew Neale – a VP at employment screening company Criteria Corp, which submitted expert testimony when New York City was developing its automated employment decision tools (AEDT) law.

The New York City law, which took effect in July, imposes significant compliance obligations on employers, Neale says. Organisations there must now inform job applicants when AI systems are being used to evaluate them, in case individuals want to opt out; they must also have an independent bias auditor examine their AI system and report on the extent to which it gives different classifications to people based on race and gender; and they must publish the results. If, for example, a system is recommending 60% of male applicants but only 40% of female applicants, the employer needs to disclose that on its careers website...
