As Wall Street’s biggest firms tout the many ways artificial intelligence is making their employees better, from tellers helping customers with account issues to investment bankers arranging multibillion-dollar deals, there’s one group they don’t want using AI: job candidates.

During the pandemic, banks began using virtual interviews and online tests to screen applicants. That’s made the recruitment process faster and easier, but, with the rise of generative AI, it’s also opened the door to candidates using ChatGPT to improve their chances of being picked. Now firms are taking steps to stop such AI use, including deploying detection software, while some hopefuls say they’re merely using the same tools they’re expected to employ once hired.

“Goldman’s going to want those candidates to use GenAI in the job. Why not let them use it in the pre-hire application?” said Nathan Mondragon, chief innovation officer at Hirevue Inc., an AI-powered screening platform used by most major US banks, including Goldman Sachs Group Inc. “Their choice is to go down a different route and say it’s restricted, essentially.”

As AI threatens to eliminate entry-level jobs by automating menial tasks, the finance industry is grappling with how to screen for employees with the critical-thinking skills needed to get the best out of AI tools that are both imperfect and constantly evolving. Firms are changing their hiring processes to ensure that candidates are truly the best and brightest without AI, even if it means using AI themselves to do that.

The finance industry has a notoriously exhaustive vetting process. The typical interview process at a bank involves an initial screening on an online platform such as Hirevue, which uses AI to shrink the candidate pool. Applicants then undergo technical interviews, where they are quizzed on the practical skills and financial knowledge needed for the job. Finally, hopefuls endure a “superday” — a day-long event with multiple back-to-back behavioral and technical interviews — before they receive a final decision.

Still, with the industry set to pay record bonuses this year, young people are clamoring to get their foot in the door. For some, that means using AI to help them through the process.

TestGorilla, a software company that offers a library of personality and job-specific skills tests, is used by many financial firms to screen candidates before they are interviewed by an actual person. Of the more than 5 million candidates who have been assessed by TestGorilla across industries, around 15% are flagged for “potentially suspicious activity,” a category that includes AI assistance, according to Claudia Baijens, vice president of product. In the finance sector, the rate of suspicious behavior is a few percentage points higher, she said.

When Meridith Dennes, a recruiter at search firm Prospect Rock Partners, screens candidates to send to financial firms, she looks for indicators that they’re leaning on AI. If someone uses a chatbot to generate an answer to her questions, she can usually tell.

“Anything that sounds overly rehearsed or overly generic to me, I just keep asking deeper questions until I get to the real answer,” she said. If the answers stay generic, Dennes usually passes on the candidate.

Banks want employees who add value beyond the tools they use, she said. One candidate Dennes worked with used AI to research a boutique firm’s deals, and the chatbot gave him incorrect information. He was rejected for the role. “You have to use it as a tool to help you, but not totally rely on it,” Dennes said.

Aidan Swenson, a junior at Bentley University who just landed a finance internship, said he never considered using AI during an interview. Some of his friends who relied on chatbots during the interview process faced a steeper learning curve once they landed jobs, he said. Instead, Swenson used AI to sharpen his résumé bullet points so they better matched the job description and had ChatGPT quiz him on potential questions ahead of the interview.

“I’m not someone that is in the belief that it will fully replace jobs,” Swenson said. “Rather it will replace people that don’t know how to use it with people that do know how to use it.”

Software companies such as Hirevue and TestGorilla have built safeguards to detect AI use, like tracking when applicants switch browser tabs or take too long to respond. TestGorilla also added an “honesty agreement,” which asks applicants to vow not to use AI to help them in a test. Earlier this year, Goldman Sachs sent candidates, including those seeking investment-banking positions, a letter instructing them to steer clear of any digital assistance during interviews.
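To give a rough sense of what such safeguards can look like, here is a minimal browser-side sketch, assuming a web-based assessment: it flags tab switches via the Page Visibility API and unusually long response times. The event types, the three-minute threshold and the recordAnswer helper are illustrative assumptions, not HireVue's or TestGorilla's actual code.

```typescript
// Hypothetical sketch of client-side proctoring signals, not any vendor's
// real implementation: flag tab switches and slow responses for later review.

type SuspicionEvent = { kind: "tab_switch" | "slow_response"; at: number; detail?: string };

const events: SuspicionEvent[] = [];

// Record each time the assessment tab loses focus (the candidate may be
// consulting another window, such as a chatbot).
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    events.push({ kind: "tab_switch", at: Date.now() });
  }
});

// Assumed per-question time budget; anything beyond it gets flagged.
const MAX_EXPECTED_MS = 3 * 60 * 1000;

function recordAnswer(questionId: string, shownAt: number): void {
  const elapsed = Date.now() - shownAt;
  if (elapsed > MAX_EXPECTED_MS) {
    events.push({ kind: "slow_response", at: Date.now(), detail: questionId });
  }
  // In practice these events would be sent to a server for human review,
  // not used to reject a candidate automatically.
}
```

In a real product, signals like these would feed a reviewer dashboard rather than trigger automatic decisions, which is consistent with vendors describing flagged candidates as showing "potentially suspicious activity" rather than confirmed cheating.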

“This language is consistent with what we send to any of our campus applicants across all positions,” Jennifer Zuccarelli, a Goldman spokesperson, said in an emailed statement. “We want to hear from our applicants in their own voice.”

Even if an applicant passes the initial screening, banks have also reshaped their technical interviews to favor candidates who can perform without leaning heavily on AI.

Firms are giving candidates less time to complete case studies, a type of technical interview in which applicants answer questions about a specific business problem, said Jake Schneider, a recruiter at Selby Jennings. While banks once gave candidates days to provide answers, they’re now asking for them within hours. Those submissions are followed by conversations with interviewers, in which candidates walk through their reasoning to show the work was completed without assistance.

“These answers are not necessarily black and white, as they were in the past,” Dennes said.

To test whether an applicant really understands the technicals, firms are going beyond generic modeling or accounting questions, asking students specific questions about a financial model, the rationale behind a deal or where the industry is going, said Patrick Curtis, founder of Wall Street Oasis, a provider of interview- and skills-training courses. 

“They still want to be able to screen for intelligence and for common sense” beyond what large language models provide, Curtis said. “If an LLM spits out something that is nonsensical, that’s the whole point right now of having somebody who understands it at a fundamental level and can actually look at it with a critical eye.”

Superdays, meanwhile, are returning to the traditional in-person format as recruiters try to find out how candidates will fit into their firm’s culture.

“I don’t see anybody getting hired without being met face to face,” Dennes said. Firms will start emphasizing qualitative skills in candidates, she said, because banking at senior levels depends on developing close ties with clients — something AI can’t do. “You just can’t outsource relationships.”

Still, once applicants are hired, they’re expected to be adept at using AI immediately. Royal Bank of Canada Chief Executive Officer Dave McKay said he wants colleges and universities to train students to be skilled at manipulating the LLMs that underlie generative-AI tools. 

The next generation should “come into my firm and they’ll do a more senior role right off the bat, because there’s access to more information they can perform more efficiently and effectively, and they don’t have to close all of the learning gaps Day One,” McKay said in an interview. “They can have more time, because the LLM’s in there in support of them.”
