AI and facial analysis used in job interviews for the "first time"
Privacy campaigners say it has "chilling implications" for job seekers as there's a "high likelihood" that it may be biased
Artificial intelligence and facial analysis technology are being applied to video interviews for UK job seekers for the first time, according to reports.
The move has been met with concern from privacy campaigners, who warn that biased algorithms risk discriminating against worthy applicants.
The software is reportedly being provided by US firm Hirevue, which analyses the tone of voice, vocabulary and facial expressions in video interviews to determine a candidate's suitability.
The firm works with a number of high-profile companies, such as Unilever, Honeywell and Intel, and claims that its technology speeds up the interview process by analysing candidates in the initial stages via video rather than relying on CVs alone.
On its website, Hirevue describes its service as "combining a video with game-based challenges". These can be completed in 30 minutes and use AI to collect tens of thousands of data points, which the company says provides a more reliable and objective indicator of future performance, free of human bias.
"Facial expressions indicate certain emotions, behaviours and personality traits," Nathan Mondragon, Hirevue's chief psychologist, said to The Telegraph.
"We get about 25,000 data points from 15 minutes of video per candidate. The text, the audio and the video come together to give us a very clear analysis and rich data set of how someone is responding, the emotions and cognitions they go through."
The company said its technology has already been used for 100,000 interviews in the UK, but questions are being raised about the ethics of the service and, in particular, the data sets the company relies on. Critics say it will discriminate against candidates from certain backgrounds and against unconventional applicants.
"Using a faceless artificial intelligence system to conduct tens of thousands of interviews has really chilling implications for job seekers," Griff Ferris, legal and policy officer for Big Brother Watch. "This algorithm will be attempting to recognise and measure the extreme complexities of human speech, body language and expression, which will inevitably have a detrimental effect on unconventional applicants.
"As with many of these systems, unless the algorithm has been trained on an extremely diverse dataset there's a very high likelihood that it may be biased in some way, resulting in candidates from certain backgrounds being unfairly excluded and discriminated against."
A number of cases back up these concerns, such as Amazon's attempt to automate recruitment using its current and former employees as a dataset. The system sifted through CVs in the hope of picking out top candidates, but it quickly became apparent that it was excluding female applicants. Amazon's workforce was 40% female, with very few women in managerial positions, and the software learned to treat that imbalance as a reason not to recommend them.
There is a touch of déjà vu with Hirevue, as it ranks applicants on a scale of one to 100 against a database of traits drawn from previous "successful" candidates.
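Hirevue has not disclosed how that ranking works, but the mechanism critics worry about can be illustrated with a simple, entirely hypothetical similarity score: if new applicants are graded by how closely they resemble past hires, the system reproduces whatever profile those hires happened to share.

```python
# Hypothetical illustration only: neither Hirevue's trait database nor its
# ranking formula is public. This shows how similarity-based ranking favours
# applicants who resemble previous hires.
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def rank_applicant(applicant_traits: list[float],
                   successful_hires: list[list[float]]) -> float:
    """Score an applicant from 0-100 by average similarity to past hires."""
    sims = [cosine_similarity(applicant_traits, hire) for hire in successful_hires]
    return 100 * sum(sims) / len(sims)


# Past hires with a narrow, shared trait profile pull down the score of any
# applicant who does not resemble them, regardless of actual ability.
past_hires = [[0.9, 0.8, 0.7], [0.85, 0.9, 0.75]]
print(rank_applicant([0.88, 0.85, 0.72], past_hires))  # close to the profile
print(rank_applicant([0.20, 0.90, 0.10], past_hires))  # unconventional profile
```

Under this toy model, the "unconventional" applicant scores markedly lower simply for differing from past hires, which is the pattern Ferris and others warn about.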
IT Pro has contacted both Intel and Unilever for comment.
Bobby Hellard is ITPro's Reviews Editor and has worked on CloudPro and ChannelPro since 2018. In his time at ITPro, Bobby has covered stories for all the major technology companies, such as Apple, Microsoft, Amazon and Facebook, and regularly attends industry-leading events such as AWS Re:Invent and Google Cloud Next.
Bobby mainly covers hardware reviews, but you will also recognize him as the face of many of our video reviews of laptops and smartphones.