
Predicting Misconduct: How AI Helps Head Off Harassment, Bias and Other Ethics Issues

Artificial intelligence, machine learning and natural language processing technology have added new dimensions to digital misconduct reporting platforms.

An illustration of a hand with icons on it.
It wasn’t long ago that misconduct reporting systems within organizations underwent a digital makeover. Yet another one is already on the way. 

In recent years, legacy 800-number hotlines, creaky case management systems and open-door policies had given way to mobile reporting apps and secure Web-based portals; chat functions that allow workers to ask questions before filing complaints; and new collaboration tools that help the HR, legal and compliance functions work together more seamlessly on investigating misconduct cases.

These new resources were designed to create more modern and secure reporting channels that would encourage employees fearful of retaliation—or of not being taken seriously—to report incidents of harassment, discrimination or other ethics abuses while providing faster, more-efficient ways to address such claims. 

Now, the arrival of artificial intelligence, machine learning and natural language processing technology has added a new dimension to digital reporting platforms. These tools allow HR and other leaders to be more proactive in identifying areas where misconduct or ethics offenses may be on the rise and to address them before they mushroom into lawsuits, scandals or unwanted news headlines.

Remote Work’s Impact on Misconduct

Common wisdom held that incidents of sexual harassment and discrimination would diminish as employees moved out of their offices and into their homes when the pandemic hit. Yet the opposite sometimes happened. Studies show that some remote workers, feeling emboldened by the absence of in-person contact and the lack of witnesses to their behavior, sent harassing or abusive texts or e-mails to colleagues.

A 2021 survey from TalentLMS and The Purple Campaign found that more than one-quarter of employees had experienced unwelcome sexual behavior online via Zoom or Google Hangouts, text messages, e-mails, or internal chats since the start of the COVID-19 pandemic.

Additionally, in Deloitte’s 2021 Women at Work report, 52 percent of women surveyed said they had experienced some form of harassment or noninclusive behavior, including disparaging or sexually explicit comments, while at work in the past year. 

Providers of third-party incident reporting platforms have added new features and tools to their systems to help organizations be more proactive in identifying and addressing both new and age-old forms of workplace misconduct. London-based Vault Platform’s incident reporting and investigation technology now uses AI to help identify patterns of harassment or discrimination, as well as abuses such as fraud and safety violations, within the organization. 

Proactive Technology

The new technology builds on Vault Platform’s unique GoTogether incident reporting feature, which uses the strength-in-numbers concept to encourage employees to report incidents of harassment or discrimination as part of a group. 

With the GoTogether tool, employee complaints are moved to the next stage of investigation only when multiple complaints are filed independently against the same person. This system lets employees who have reported incidents know they’re not alone and helps HR leaders identify recurring issues in the company. 
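The escrow logic described above—hold each complaint privately and escalate only once independent complaints name the same person—can be sketched as follows. This is a minimal illustration; the class name, method names and the threshold are assumptions, not Vault Platform's actual implementation.

```python
from collections import defaultdict

class IncidentEscrow:
    """Sketch of a GoTogether-style 'information escrow': reports are
    held privately and released for investigation only once a threshold
    of independent reports names the same subject. Illustrative only."""

    def __init__(self, threshold: int = 2):
        self.threshold = threshold
        self._held = defaultdict(list)  # subject -> reports held in escrow

    def file_report(self, subject: str, report: str) -> list[str]:
        """Hold the report. Once the independent-report threshold is
        met, return all held reports for escalation; otherwise []."""
        self._held[subject].append(report)
        if len(self._held[subject]) >= self.threshold:
            return list(self._held[subject])
        return []

escrow = IncidentEscrow(threshold=2)
# First report is held in escrow, nothing is escalated yet:
assert escrow.file_report("manager_x", "report A") == []
# A second independent report against the same person releases both:
assert escrow.file_report("manager_x", "report B") == ["report A", "report B"]
```

In practice such a system would also need to verify that the reports really are independent (e.g., filed by different accounts) before counting them toward the threshold.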

Vault Platform added AI to that process to enable leaders to spot “hot pockets” in internal company communications—not just from individuals but among entire work teams or departments—and identify situations where a toxic environment or fraudulent activity is being reported in disparate or remote parts of an organization. 

“Proactive is the name of the game with this technology,” says Neta Meidav, co-founder and CEO of Vault Platform. “It helps companies identify repeated patterns of abuse and intervene before they develop into bigger problems.”


Conducting Improved Investigations

Because third-party misconduct reporting platforms ease the burden of managing investigations, they are likely to appeal to HR leaders whose often-overstretched departments already have enough to handle.

Fogo de Chao, a chain of steakhouse restaurants based in Plano, Texas, chose to partner with third-party provider Work Shield to help manage and investigate complaints of harassment and discrimination. 

“From a resource standpoint, we didn’t have enough HR staff to handle internal investigations in the manner we’d like,” says Peter Bruni, senior director of total rewards for Fogo de Chao. “Partnering with a company like Work Shield led to improved intake of complaints as well as faster resolution of incident investigations. From intake to resolution, we’re now averaging less than six days, even for complex cases.”

That increased speed comes from having dedicated outside counsel assigned to cases and access to new technology tools, Bruni says. He adds that employees who report incidents have welcomed the shortened time frame. “We feel more confident our employees are having their issues heard and addressed quickly now,” he says.

New features on next-generation technology platforms benefit the reporting and investigation process in multiple ways, Bruni adds. For example, Fogo de Chao employees can now enter incident reports directly into a secure portal provided by Work Shield through one of three channels: a digital ID card; their smartphones, tablets or computers; or a call center staffed by people they can talk to. 

Reluctance to Report

Despite the technological advances, convincing employees to report workplace harassment or discrimination remains the biggest obstacle in the process. Experts say that’s because people often feel they’ll be retaliated against, ostracized or fired. 

“We know from EEOC [Equal Employment Opportunity Commission] data that 75 percent of employees say they didn’t report harassment they experienced at work,” says Claire Schmidt, founder and CEO of Los Angeles-based AllVoices, a technology platform and case management tool that allows employees to report harassment, bias and other culture-related issues. “The first step is making sure everyone who experiences harassment or discrimination is telling their company about it. If you aren’t aware, you can’t take action.”

AllVoices’ research found that fear of reporting extends beyond harassment and discrimination issues. The data shows that almost one-quarter of employees don’t feel comfortable voicing COVID-19-related safety concerns because they either fear retaliation or don’t think their concerns will be addressed.

Experts say that while technology can add efficiency and speed to the reporting and investigation process, maintaining the human component is critical. 

“No one wants to talk to a chatbot about a scary topic like being harassed or discriminated against,” says Jared Pope, founder and CEO of Work Shield. “In most cases, they want another human being to hear their voice when they report. The moment you remove the human voice from the process, you begin to lose your culture.”



Providing Better Communication

Understanding where technology can enhance the process and where humans should still play a central role is crucial to creating successful misconduct reporting systems. One area where technological innovations can add value is in communicating during the investigation process.

Work Shield’s platform provides automatic notifications at all stages of the investigation so stakeholders—HR, employees reporting incidents, legal—can stay apprised of progress in real time.

AllVoices is testing the use of AI on its platform and has added new features such as an investigation support tool for HR leaders and a two-way communication channel for employees who are considering reporting incidents.

“The support tool walks leaders step by step through the process of conducting an investigation, making sure no critical steps are missed and all relevant data is both stored securely in the cloud and easy to find,” Schmidt says. 

The two-way communication channel allows employees to stay anonymous but receive more information about the investigation process before formally committing to it.

“It’s a chat function between the employee and the company that’s designed to provide more comfort and information to the employee about how the investigation process would look,” Schmidt says. “People understand there is no way to come forward partway. Once you sit down in someone’s office and speak, you are committed. Allowing employees to ask questions in advance provides reassurance.”

Advanced Analytics Identify ‘Hot Spots’

Many next-generation reporting platforms also boast improved analytics tools, allowing HR leaders to more easily identify pockets of the company where misconduct or internal ethics concerns are more prevalent.

Analytics generated by Vault Platform can help HR leaders identify problem areas and “connect the dots” to reveal any repeated patterns of misconduct in the organization, Meidav says. “When you know you have a pattern of problems in certain areas, it allows you to target training and tailor remedial actions to those spots,” she says. “It becomes your map of the company’s ethical and cultural health.”

Dave Zielinski is a freelance business journalist in Minneapolis.

Illustration by Michael Korfhage for HR Magazine.


Complying with Whistleblower Laws

Human resource leaders who use third-party misconduct reporting platforms need to ensure those technologies comply with existing—and new—whistleblower regulations in both the United States and the European Union (EU).

In the U.S., platforms should conform with the whistleblower protection provision of the Sarbanes-Oxley Act (SOX), which prohibits a range of retaliatory actions against those who speak up about misconduct or ethics concerns.

Experts say reporting platforms used by companies operating in the U.S. should have SOX-compliant, encrypted whistleblower hotlines. Claire Schmidt, founder and CEO of reporting platform AllVoices, says that in addition to offering a Web-based platform, her company provides an option for a call-in line that guarantees anonymity.

 “More organizations are seeing the importance of having an anonymous channel for employees to speak up, not just about workplace harassment and discrimination but a variety of internal ethics issues like financial compliance, fraud and more,” Schmidt says. 

Companies with 250 or more employees that do business in the EU will also need to comply with the new EU Whistleblower Protection Directive by December 2021. Smaller organizations face a 2023 deadline. The directive requires organizations to establish secure internal channels for reporting violations of EU laws, and it protects from retaliation current employees, former employees and prospective employees such as job applicants, as well as contractors and unpaid volunteers. 

Under the EU directive, internal reporting channels should allow whistleblowers to submit reports either in writing or orally, and the whistleblower's identity must be kept confidential. —D.Z.
