April 15, 2024
A.I

New book exposes how 99% of Fortune 500 companies use technology to “watch” interviews and “read” resumes to make hiring decisions without human supervision

The book, titled ‘The Algorithm’, has highlighted how the world of recruitment is becoming a ‘Wild West’ where unregulated AI algorithms make decisions without human supervision.

AI has taken over the job market by reading resumes and watching interviews to provide human executives with the best candidates, a new book reveals.

Artificial intelligence is already deciding who gets hired and who gets fired, monitoring everything from what people post on social media to their tone of voice in interviews, the book’s author, Hilke Schellmann, told DailyMail.com.

Algorithms can now dictate not only who gets job interviews but, thanks to continuous on-the-job monitoring, who gets promoted or fired (and they could even warn your boss if you’re getting divorced).

Schellmann said ZipRecruiter’s CEO told her a few years ago that the technology was analyzing at least 75 percent of resumes.

‘That was in 2021; it’s probably 100 percent now. We know that 99 percent of Fortune 500 companies already use AI tools in recruiting,’ she said.

Schellmann said it’s inevitable that if you apply today, your resume will be vetted by AI long before it’s processed by a human, and she offers tips on how to be seen in an AI-powered workplace.

The AI tools used by recruiters are unreliable, and even the recruiters themselves do not know how they work, she explained, adding that the AI-based job application process is rife with discrimination and relies on ‘blatantly strange’ keywords.

AI tools are often “black boxes” where recruiters can’t see how they work.

The technology can develop unusual ideas about a candidate’s likelihood of success, or predict success based on factors such as church attendance or nationality, making the process rife with discrimination.

That means, for example, that women or disabled people could be discriminated against during the hiring process, but because people don’t know what AI tools have been used, they find it difficult to respond.

“Vendors who build AI tools don’t want to be scrutinized and are reluctant to talk about any problems,” Schellmann said.

‘They want to talk about it in enthusiastic marketing terms, right? How wonderful it is at finding the best people. But they don’t want to open the black box for testing or transparency.’

‘Companies that use these AI tools often don’t want to talk about it either, because they feel it damages their reputation when applicants are upset that AI is being used and that no human is looking at their job application.’

And she said machines do most of the rejections.

Matthew Scherer, a former labor lawyer with whom Schellmann spoke, said the tools used ‘are not ready for prime time.’

This is because the technology is “very basic” and cannot fully predict real-world outcomes, such as a person’s success at a job.

Schellmann described many technologies used to filter resumes as “snake oil.”

‘We know it saves money. We know it saves labor. But we haven’t seen evidence that it chooses the most qualified candidates,’ she said.

Many organizations are also using AI to evaluate video interview recordings, looking for problems such as the ‘wrong tone of voice,’ Schellmann said.

‘Unfortunately, this is largely legal,’ she continued.

‘The European Union is a little stricter with the General Data Protection Regulation [GDPR] and other laws, but America is still the Wild West in this, except for some local laws we see.

‘There’s one in Illinois where you have to tell people if you’re using artificial intelligence in video interviews. But in general there is still not much regulation in this regard.’

Artificial intelligence tools used in interviews extract “biomarkers” (such as tone of voice or movements) that supposedly correspond to emotions.

‘If you and I are talking, this tool can supposedly find out if you are anxious or depressed based on the tone of voice and facial expressions,’ Schellmann said.

‘What does a facial expression mean in a job interview?

‘It doesn’t make you good or bad at a job. We use these technological signals because we can, but often they don’t have much meaning.’

Employers now also routinely scan social networks like X and LinkedIn using artificial intelligence algorithms, looking for details like references to songs with violent lyrics.

“That could mean that you are labeled as a violent person and someone who should not be hired,” Schellmann said.

Many companies do this as a screening stage when hiring, but others run these kinds of scans on employees continuously.

“Some of these tools also detect things like whether you are prone to self-harm,” the author revealed.

‘In the United States, that could be illegal because you’re not allowed to ask people about their medical conditions before hiring them.

‘It’s also a question: why would a company want to know if you are prone to self-harm? Is that really useful? Are they helping these people, their employees? Or are they being punished?’

Companies use artificial intelligence algorithms to assess people’s personalities based on their social media posts, analyzing language to assess what people “really” are like, Schellmann said.

‘Companies want to look under people’s hoods, right? They want to know who you are before they hire you,’ Schellmann explained.

Schellmann said the ability to “look under the hood” of people is something organizations have longed for for decades, leading them to rely on unproven or bogus technologies such as handwriting analysis.

Artificial intelligence algorithms that analyze videos of job interviews are doing the same kind of work.

Organizations that want to hire someone who is a “fast learner” (so they can adapt to a changing technological world) often rely on such technologies to predict who might be a good fit, Schellmann said.

But relying on personality, and on untested AI algorithms to find people with that specific trait, is a mistake, Schellmann maintains.

Schellmann said: ‘What we know from science is that personality predicts about five percent of success at work. So that is very, very little.’

‘We often overcome our personality. I’m quite shy and I have to work on that. When I go to receptions and parties, I have to make an effort to approach strangers. We can overcome our personalities at work and elsewhere.

‘Actually, it’s questionable whether we should use it. But it is easy to use. It’s super cheap. It’s just an easy way to do it, and that’s why they do it.’

How to be successful when AI reads your resume (and probably watches your job interview)

Schellmann advises that applicants should match 60 to 80 percent of the keywords in the job description (not 100 percent, because AI tools can rule you out for simply copying the job description).

Schellmann said: ‘You want to have a super basic resume.’ The old advice was often to stand out to a human reviewer with interesting columns and graphics; now the advice is to drop anything a machine might not be able to read: no images, no columns.

Instead, Schellmann advises applicants to have bullet points and clear, machine-readable language that is brief and concise.

She said several website services (including JobScan) can help you see if your resume or application is machine readable.

Upload the job description and a resume to verify overlap.
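The overlap check itself is simple enough to sketch. The Python snippet below is only a rough, hypothetical illustration of the keyword-matching idea, not how JobScan or any vendor’s scanner actually works; the stopword list, keyword rule, and sample texts are invented for the example.

```python
import re

# Common words to ignore when pulling "keywords" out of a job description.
# (Invented list, purely for illustration.)
STOPWORDS = {
    "a", "an", "the", "and", "or", "to", "of", "in", "for", "with",
    "on", "at", "is", "are", "be", "will", "you", "we", "our", "your",
}

def keywords(text: str) -> set[str]:
    """Lowercase the text, split it into word-like tokens, and drop stopwords."""
    words = re.findall(r"[a-z][a-z+#.-]*", text.lower())
    return {w for w in words if w not in STOPWORDS and len(w) > 2}

def keyword_overlap(job_description: str, resume: str) -> float:
    """Return the share of job-description keywords that also appear in the resume."""
    jd_terms = keywords(job_description)
    cv_terms = keywords(resume)
    if not jd_terms:
        return 0.0
    return len(jd_terms & cv_terms) / len(jd_terms)

if __name__ == "__main__":
    jd = "Seeking a data analyst with SQL, Python and Tableau experience."
    cv = "Data analyst with five years of SQL and Python reporting work."
    share = keyword_overlap(jd, cv)
    # Per Schellmann's advice, aim for roughly 60-80 percent, not 100.
    print(f"Keyword overlap: {share:.0%}")
```

A real applicant-tracking system is far more opaque than this, which is exactly the book’s point; the sketch only shows why matching most, but not all, of the job description’s wording can matter.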

Today’s job market is a “cat and mouse” game in which applicants often use large AI language models, such as ChatGPT, to write cover letters, Schellmann said.

Schellmann said: ‘Humans don’t actually craft the text, and they no longer evaluate cover letters and resumes. It’s the machines against the machines.’

Schellmann advises using AI systems like ChatGPT to assess what questions you’re most likely to be asked, and in non-live interviews, ChatGPT can also help you find answers.

In interviews, if you are going to be evaluated by machines, give long answers that describe specific scenes, Schellmann advised, because the shorter the answer, the harder it will be for machines to understand it.

Schellmann said some also suggest looking at the camera to show the algorithms that you’re ‘engaged.’

Schellmann said the other key is to apply for as many jobs as possible, even ones you don’t feel qualified for.

‘There is a clear difference between women and men: women only apply when they are 100 percent qualified, while men apply when they are 50 percent qualified. But if machines are the ones judging whether you are qualified, apply when you think you are 60 percent qualified.’

Schellmann said the answer is to keep applying, even if you have to do it 150 or 200 times.

She said: ‘Don’t be discouraged. It’s like a numbers game and can be very frustrating for people. But it’s just the machines that read it on the other end and put you in a yes or no pile.

‘And there is very little control over that, so mass application is the only way to do it.’
