Local employment experts say that, when implemented responsibly, artificial intelligence in the hiring process might actually work to humanize the task.
Wauwatosa-based HR consulting firm Cielo uses AI to take on routine administrative tasks. This means employees can instead focus on strategic work that requires human insight.
[Photo: Kirsten Mayer]
“I think the way I would describe AI is infrastructure,” said Kirsten Mayer, chief human resource officer at Cielo. “It’s not about replacing people – it’s helping recruiters by freeing them up to do what they do best.”
Employers often deal with a high volume of applications. Sifting through them all without the support of technology can lead to delays in candidate response time, Mayer explained.
In addition to expediting the process for applicants, AI is helping Cielo screen the best candidates. The company has found value in introducing an AI chatbot to help answer common candidate questions.
Cielo is also using an AI agent to help interview candidates, a process a candidate would need to opt in to ahead of time. When trained properly, Mayer said the use of AI can help eliminate some instances of bias.
“When a bot is reading a CV, a candidate’s age, gender, ethnicity – anything from a diversity perspective – it doesn’t matter,” said Mayer. “The AI just compares what people’s experience looks like. In a human-driven process, you can’t avoid bias.”
Still, companies have the responsibility of considering data privacy when using an AI tool. Legal requirements vary by country, so companies can’t take their North American processes and simply replicate them globally.
Businesses should also have their technological and legal experts examine any third-party AI platform being considered for use. If a third-party platform is non-compliant with federal guidelines, the company using that platform can be held liable. This means humans will always need to be involved in the recruiting process, Mayer said.
“There is a lot of fear related to AI and hiring and bringing in those tools,” she said. “I would want companies to look at it from a different perspective and realize AI will support people in their roles.”
Finding the right balance
Brookfield-based staffing agency QPS Employment Group uses all kinds of technology to put people to work, including AI.
The trick in doing so is balancing three goals: giving each candidate a positive experience, increasing the overall efficiency of the hiring process and expanding the capacity of the QPS hiring team.
[Photo: Ryan Festerling]
“Sometimes those things fight each other, but the goal is to find the sweet spot between those three,” said Ryan Festerling, CEO of QPS Employment Group.
He also believes the best uses for AI are currently at the beginning of the hiring process. QPS is leveraging AI to match the 2 million people in its database to the most appropriate job description.
Additional AI tools, such as a chatbot or an AI recruiter, can further refine the candidate pool by asking applicants simple questions. This frees recruiters to work on more complex hiring issues and to forge more personal connections with candidates.
“Will you need recruiters in the future? Absolutely. Does a recruiter role look different in the next 12 to 18 months? It probably does,” said Festerling.
QPS does not allow AI to make any final decisions regarding employment, instead using it to point recruiters in the right direction. The company relies more heavily on its own internal AI tools because of the rich data set readily available, Festerling explained.
“We trust our own information more than public information,” he said. “If you’re a company that’s been around a while, you have a rich database to re-engage people who you may have been in contact with six months, 12 months, or even 12 years ago.”
Festerling emphasized the importance of vetting your data and verifying how an AI tool has arrived at a given recommendation. Feeding biased data into a large language model (LLM) could create an even bigger problem for a company.
“If you can create some not-so-great practices as a human being, you are now creating massive amounts of those same issues at scale and speed (using AI),” said Festerling. “That’s why we are very cautious.”
Low-risk tasks
Using AI when hiring continues to be a hot topic for Husch Blackwell’s clients, according to Laura Malugade, partner at the law firm’s Milwaukee office.
Recent legal disputes, including a class action lawsuit targeting Workday for its use of AI screening tools, have shined an even brighter spotlight on the practice.
[Photo: Laura Malugade]
“I think we’ve evolved from most companies staying away from the use of AI in hiring because of the risks, which still is probably advisable, to companies saying, ‘we’re going to do this, but how do we do it in the right way?’” said Malugade.
Using AI to rank, sort or recommend candidates – or to make any sort of employment decisions – will always carry a “fairly significant” amount of risk, she explained.
The two biggest issues with AI tools are that users can never truly be sure how a decision is being made, and the data used to train these tools could be biased if not properly vetted.
“Using it to weed out candidates, that could be a problem without human review,” said Malugade. “Even then, you must consider how it makes the recommendations.”
Companies looking to use AI in the hiring process must make sure a human is involved in the process. Malugade cautioned that popular tools like ChatGPT are programmed to give users the answers they want, so users must learn to be more adversarial and critical when using the technology.
Husch Blackwell helps companies complete vendor due diligence, so they can figure out what questions they might need to ask and what their potential risks are.
There are use cases for employers that have lower risks. Tasks like drafting a job description or an internal email are good ways to use AI to expedite the hiring process, Malugade explained.
From an employee’s perspective, AI tools have made the workforce smarter when it comes to hiring issues. Candidates can drop their exact hiring scenario into ChatGPT and ask if they have any potential legal claims.
Lawyers are seeing seldom-used legal terms pop up in claims, a sign that people are using AI tools to ask for help, according to Malugade.
“Certainly, more education within the workforce is a good thing, but it also means that our employers have to really step it up and be on top of things,” she said.
Eventually, AI could be a useful tool that helps improve the hiring experience for both candidates and companies.
AI is still the best way for companies and recruiters to address the sheer volume of applications coming in, Festerling explained.
With today’s technology, it’s easier than ever for people to apply for hundreds of jobs at a time. It’s a numbers game for both parties involved, which can cause friction.
“You might have a candidate who’s like, this is my dream job, and you may have a candidate right next to that person who applied for 250 jobs and will ghost you,” said Festerling. “Companies want more candidates, but efficiency and personalization will continue to be under fire.”