Employers have seen an alarming rise in fraudulent job applicants—individuals impersonating other people during the interview process. There are key steps that HR departments can take to filter out the fraudsters.
Exploiting a Crisis
As the COVID-19 pandemic forced many employers to shift to remote or hybrid work, job interviews also largely moved online. The practice has continued, with 44% of employers in a 2021 survey saying they have conducted virtual interviews. For white-collar jobs, online interviews are especially prevalent. This has created a window of opportunity for applicants who aren't quite what they seem.
In June 2022, the FBI issued a warning that scammers have been using a combination of deepfake technology and stolen personally identifiable information (PII) to impersonate real people in interviews. Deepfake technology uses artificial intelligence and deep learning to make someone appear as another person. These fake interviewees generally have a goal of landing a job for illicit purposes, such as stealing data or releasing ransomware.
One way that fraudsters have been able to access real job candidates' PII is by posting fake job ads. Candidates willingly hand over their PII, assuming the jobs are legitimate.
Perhaps most worrying is a 2022 warning issued by the State Department, Treasury Department and FBI that described North Korean workers impersonating non-North Korean nationals to gain employment in IT departments worldwide. The advisory said the North Korean government has been deploying these workers to sell stolen data, provide support for cyberattacks, and assist in the funding and procurement of weapons of mass destruction.
Less maliciously, a different type of fake applicant has also emerged. Job seekers have hired people to stand in for them during virtual interviews. Typically, candidates who aren't qualified find someone who has the necessary skills and get that person to do the interviews. It's worked in some instances—at least until the real applicant reports for work and is found out.
Many fake applicants are pursuing open tech positions. Employers therefore need to be especially vigilant given the recent layoffs in Big Tech: thousands of workers are now seeking new employment and could be targets for impersonation. HR teams should reassess their protocols and technology so they can identify these fake applicants.
Samanta Leonie, a Venezuela-based recruiter in the blockchain space, has observed these types of scams firsthand. She said her firm has likely been targeted because it deals with newer technology and because its entire workforce is remote. As a cover, many of these applicants claim to have previously worked for obscure nonfungible token platforms that can be difficult to get in touch with.
"We started to see an increase in the amount of people with these kinds of profiles around September 2021," Leonie said. "After that, I've seen at least one of these a week."
Filtering Out the Fraudsters
There are several ways that HR departments can identify fake applicants during the online interview process. While deepfake software can be convincing, it's not infallible. Brian Blauser, a supervisory special agent with the FBI, told HR Dive that something might feel "off" during the interview, such as lip movements and audio not matching up. Meanwhile, the Massachusetts Institute of Technology has created a website to train people to identify deepfakes.
Nevertheless, technology is always advancing. While deepfake Zoom calls might not be difficult to spot right now, there is a chance they could be indistinguishable from legitimate calls in the future.
Of course, not all scammers have access to the latest technology. Leonie noted that some fake applicants will refuse to appear on camera in interviews, which is a telltale sign. In these instances, a recruiter should listen for other people talking in the background who may be coaching the applicant.
Laura Mazzullo, owner of New York City-based recruitment firm East Side Staffing, recommends old-fashioned Internet sleuthing when checking out candidates.
"You can Google people's photos, and it's easy to look them up on LinkedIn," she said. "I have seen inconsistencies with resumes and LinkedIn profiles. A profile might say that you were with a company for two years, but your resume says you were there for three."
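The kind of inconsistency Mazzullo describes can also be checked mechanically when both sources are captured in structured form. Here is a minimal sketch, assuming a hypothetical format where each source is reduced to a mapping of company name to years of stated tenure (the field layout and tolerance are illustrative, not taken from any real screening tool):

```python
def tenure_mismatches(resume, linkedin, tolerance=0):
    """Compare stated years-at-company between a resume and a
    LinkedIn profile; return companies whose tenures disagree.
    Both inputs are {company: years} dicts (hypothetical format)."""
    mismatches = {}
    for company, years in resume.items():
        li_years = linkedin.get(company)
        if li_years is not None and abs(li_years - years) > tolerance:
            mismatches[company] = (years, li_years)
    return mismatches

# Example: resume claims three years, the profile says two.
resume = {"Acme Corp": 3, "Globex": 1}
linkedin = {"Acme Corp": 2, "Globex": 1}
print(tenure_mismatches(resume, linkedin))
```

A mismatch isn't proof of fraud, of course; people update one source and forget the other. The point is that discrepancies can be surfaced automatically for a recruiter to follow up on.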
Tim Sackett, SHRM-SCP, a recruiting executive and president of the engineering staffing firm HRU Technical Resources in Lansing, Mich., recommends using interview technology such as Filtered.ai, which tracks IP addresses, requires job candidates to appear on video for tech assessments, and blocks copying and pasting of code.
"Filtered actually found one guy in New Jersey who was taking the tech assessment for dozens of candidates through IP tracking," he said.
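The IP-tracking approach Sackett describes can be sketched simply: log the source IP of each assessment session, then flag any address associated with multiple candidate identities. A minimal illustration in Python (the session data and threshold are hypothetical, and real platforms would also account for shared corporate networks and VPNs):

```python
from collections import defaultdict

def flag_shared_ips(sessions, threshold=2):
    """Group assessment sessions by source IP and return any IP
    used by multiple distinct candidates (hypothetical data)."""
    by_ip = defaultdict(set)
    for candidate, ip in sessions:
        by_ip[ip].add(candidate)
    return {ip: sorted(c) for ip, c in by_ip.items() if len(c) >= threshold}

sessions = [
    ("alice", "198.51.100.7"),
    ("bob", "203.0.113.4"),
    ("carol", "198.51.100.7"),  # same IP as alice: worth a closer look
]
print(flag_shared_ips(sessions))
```

As with the resume check, a shared IP is a signal rather than a verdict, but it is exactly the pattern that exposed the New Jersey stand-in.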
Reference-checking software can also help, and other platforms can measure the specific attributes a company wants in a candidate. Instead of asking general questions about an applicant, these platforms gauge how proficient the applicant is at specific tasks.
The biggest problem, however, is that so many employees involved in the hiring process just trust what people tell them. Employers need to be thorough and not give anyone the benefit of the doubt. "Our gut lies to us more than it tells us the truth," Sackett said.
Be Proactive
There will always be people who attempt to game the system, and remote working environments have provided a new avenue for opportunists. HR teams need to pay close attention when hiring employees virtually. These schemes will only evolve, and employers need to implement the best protocols and technologies to stop fraudsters in their tracks.
Andrew Deichler is a freelance writer based in Maryland.