Employee Training on Deepfakes Should Become Standard Practice

By Allen Smith, J.D. August 8, 2019

Staff training on how to spot deepfakes—manipulated audio and video made to look as if it documents events that really happened—should become as standard as training on how to avoid e-mail phishing scams, experts say.

Here's an example of a successful deepfake attack: A worker responsible for financial transactions receives a call, apparently from the CEO, to transfer large sums into what turns out to be a cybercriminal's account. The voice on the call sounds like the CEO's but perhaps with some electronic noise, which is usually explained as background noise, said Saurabh Shintre, senior principal researcher with Symantec in San Francisco.

"Due to the interactive nature of this call, the employee trusts the authenticity of it and fulfills the request made by the attacker," Shintre said. The worker also is pressured to fulfill the request immediately, which Shintre described as "a classic scam technique."

For now, deepfake audio poses the most risk to companies, said Matt Price, principal research engineer at ZeroFOX in Baltimore. "Longer term, deepfake video is likely to pose the greater danger," he said.

Who Is Vulnerable to Deepfakes?

Corporate executives and other well-known individuals are likely targets of deepfake videos, noted Aaron Crews, chief data analytics officer with Littler in Sacramento, Calif.

"Because they live in the public eye, and there is therefore a significant amount of audio and video content of them out on the Web, there is ample material out there that can be used to create deepfakes," he said.

Women are particularly vulnerable to deepfakes, too, Price said. Several apps, including DeepNude, have been created that seemingly "undress" women, which could lead to serious reputational damage. DeepNude swaps clothing in a picture of a dressed woman with naked imagery, according to Vice.

Anyone with a significant online presence is vulnerable to deepfakes, Crews added. "If you're a person with a robust social media presence, a large number of speaking gigs that have been recorded and published over the years, or someone who runs an audio or video blog, for example, you have likely generated enough data such that you are at risk of being the victim of a deepfake," he said.

Potential Harm

Deepfakes may impact the workplace in several ways, Price noted. These include:

  • Company or employee reputational damage. For example, someone could make a deepfake of a customer's bad interaction with a company employee, or a C-suite executive could be shown saying or doing something inappropriate or illegal.
  • Loss of trust. Deepfakes could be used to destroy the trust customers have in a company.
  • Manipulation of stock prices through the release of false company information. For example, the day before an initial public offering, someone could release a deepfake of the company's CEO saying that the company has a product with a major defect.
  • Scams. Cybercriminals are expected to increasingly adopt deepfake techniques not only to fool workers into transferring money into the criminals' accounts, but also to gain access to a company's internal systems or trick employees into giving up personal or company information.
  • Blackmail. Criminals could try to extract money from firms by threatening to release a deepfake that would substantially hurt a firm's reputation.

Detection Can Be Difficult

Deepfake video is easier to detect than deepfake audio, according to Shintre. "Even deepfake audio snippets generated in a small amount of time often consist of artifacts not found in real audio and can be detected with some attention," he said. "That is why such attacks often require the employee to make a quick decision, clouding his or her judgment."

Detection researchers often will release methods to detect deepfakes that the deepfake creation community then quickly counters, Price said. In one instance, a detection technique that focused on the number of times a person blinked in a video had a high rate of success in detecting deepfakes, Price said. But just two weeks later, that technique no longer worked.

Although deepfake detection technology is improving, deepfakes are becoming easier to create and more widespread, increasing their potential for harm, according to Natalie Pierce, co-chair of the robotics, AI and automation practice group with Littler in San Francisco.


Employee Training Needed

 "We are entering a new era where employees and firms will need to be extra vigilant in their interactions," Price said. "If something seems off or not right, extra caution and verification should be exercised."

Employees must understand that seeing or hearing is no longer believing, Pierce said. "In addition to educating the workforce to the existence and use of deepfakes, a company must be prepared to train on proper response protocols." Those protocols should include telling employees whom to call when they encounter questionable content and how the company will quickly analyze it.

While a company can try to cryptographically "watermark" audio and video recordings, or take other steps to try to authenticate recordings of individuals' likenesses or voices, this isn't easy or particularly effective, Crews said.
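To make the watermarking idea concrete, here is a minimal sketch of one possible approach: signing each official recording with a keyed hash (HMAC) so that anyone holding the key can later check whether a clip matches what the company originally released. The key name and the file-level signing scheme are illustrative assumptions, not a method Crews describes.

```python
# Hypothetical sketch: authenticate official recordings with HMAC-SHA256.
# Assumes the company signs each file's raw bytes with a shared secret key
# and publishes the tag alongside the recording.
import hmac
import hashlib

SECRET_KEY = b"company-signing-key"  # illustrative placeholder, not a real key


def sign_recording(data: bytes) -> str:
    """Return an HMAC-SHA256 tag for the recording's raw bytes."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()


def verify_recording(data: bytes, tag: str) -> bool:
    """Check a recording against a previously published tag.

    compare_digest avoids timing side channels when comparing tags.
    """
    return hmac.compare_digest(sign_recording(data), tag)


original = b"...raw audio bytes of the official recording..."
tag = sign_recording(original)

print(verify_recording(original, tag))   # True for the untouched file
print(verify_recording(b"altered", tag))  # False for any modified copy
```

Note the limitation Crews alludes to: a signature only proves that a given file matches one the company signed. A deepfake is a brand-new file that was never signed, so the absence of a valid tag proves little to the public unless audiences routinely check signatures, which is why this approach is not particularly effective on its own.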

"Monitoring a firm's online presence will become increasingly important as deepfakes continue to proliferate," Price said. "Being able to quickly identify and take down false content online will help firms to minimize the risks that deepfakes pose."

More Ramifications

HR needs to rethink what it means to undertake due diligence when investigating or taking adverse actions against employees in the workplace based on a recording or a photo, Crews noted.

"It is now entirely plausible for a person seemingly caught doing or saying something problematic in a recording—audio or video—or photo to claim that they did not, in fact, do or say the things depicted," he said. "There is also the potential for fakes to emerge as evidence in a lawsuit or administrative action against an employer."





