Analytics can both raise and mitigate legal risk. It’s all in the way you use them.
Big data is everywhere—including the workplace—and can now be routinely accessed from devices that can be carried in a pocket. Types of data can include hours worked, inventory tracking, measures of customer interactions, social media usage, and performance and productivity assessments.
Once translated into discrete data sets, the information harvested from myriad sources can offer useful and measurable insights into your workforce. It can shape nearly all aspects of managing current and prospective employees, including recruiting, hiring, scheduling, benchmarking, succession planning and determining security risks.
But while analytics can help HR professionals and business leaders make better decisions and cut costs, using them is not without legal risk.
"Employers are deriving considerable benefits from analytics," says Kate Bischoff, SHRM-SCP, an attorney and HR consultant at tHRive Law and Consulting LLC in Minneapolis.
"As attorneys, if we were to tell employers that they can’t use new technology because of the potential for risk, our clients would fire us," she says. "So I promise all my clients that I won’t say ‘no’ to new technology that has a legitimate return on investment. We just need to install a metaphorical seatbelt or airbag on it."
In other words, you have an attorney’s blessing to mine big data. But you need to understand the risks and how to mitigate them.
Among the most pressing concerns inherent in relying on big data is that improperly used HR analytics can result in employment discrimination.
Consider this scenario: An employer wants to beef up its sales force. To tailor its recruitment efforts, HR professionals and hiring managers seek to identify the qualities in the company’s current sales team that correspond to high sales returns. So they plug in all the information they have about the activities each individual representative engages in: the number of customer calls made versus e-mails sent, the frequency of interaction with the sales manager and perhaps even how often the customer lead database is accessed. They also input demographic information: birth date, race, gender. The data reveal a surprising finding: Being white, male and 32 to 37 years old correlates to strong sales numbers.
But correlation is not causation. Perhaps the vast majority of the sales team is white—making it unlikely to detect a meaningful relationship between sales results and other races—or maybe performance is related to a third "confounding" factor that also correlates to a certain age range or ethnicity. If HR professionals and hiring managers were to ignore these possibilities and take the data at face value, they would risk making unwise hiring decisions based on erroneous—and biased—assumptions.
More important, they would also be acting unlawfully by using age, gender and race as hiring criteria.
"If you’re trying to define what makes the best employee based on historical data you already have, you potentially re-create the homogeneity, which is not what you want," Bischoff says. Because a poorly conceived algorithm can produce discriminatory outcomes, it’s important to make sure you validate all algorithms before acting on them. Consider whether data inputs fairly correspond to desired traits or whether the use of certain data sets skews the analysis.
"We need to figure out how to use the valuable data we have and then work against the bias that we unconsciously have," she notes. "We have to ensure we don’t use patterns that can create potential disparate impact"—an adverse effect based on race, color, sex, national origin, religion, age or disability.
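One common first check for the disparate impact Bischoff describes is the EEOC's "four-fifths" (80 percent) rule: if any group's selection rate falls below 80 percent of the highest group's rate, the practice warrants scrutiny. A minimal sketch of that check, with illustrative group names and counts that are not from the article:

```python
# Hedged sketch of the EEOC four-fifths (80%) rule, a common first-pass
# check for disparate impact. Groups and counts here are hypothetical.

def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected."""
    return selected / applicants

def adverse_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest.
    A ratio below 0.8 flags potential disparate impact."""
    return min(rates.values()) / max(rates.values())

# Hypothetical outcomes of an algorithm-driven screening step.
rates = {
    "group_a": selection_rate(48, 100),  # 0.48
    "group_b": selection_rate(30, 100),  # 0.30
}

ratio = adverse_impact_ratio(rates)
print(f"adverse impact ratio: {ratio:.2f}")  # 0.30 / 0.48 ~= 0.62
if ratio < 0.8:
    print("flag: below the four-fifths threshold; validate the algorithm")
```

Running this check after each scoring or screening step is one way to "validate all algorithms before acting on them," as the article urges, rather than discovering a skewed outcome after the fact.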
Analytics tools that collect personal health information can tread dangerously close to violating the robust privacy provisions of statutes such as the Americans with Disabilities Act, the Genetic Information Nondiscrimination Act (GINA), and the Health Insurance Portability and Accountability Act.
"Performance data, resume data—all of that is OK," Bischoff says. However, using "anything that relates to a person’s health, banking information and their most secure personally identifiable information" raises concerns.
For example, a vendor that makes individualized wellness recommendations based on employees’ genetic information or that tracks fitness data on a corporate dashboard would put companies on shaky legal ground. These activities "clearly create potential liability under GINA and spark other privacy concerns as well," Bischoff asserts.
On the other hand, the vendor’s practices may offer attractive business benefits. Let’s say the vendor estimates that its tool can save employers $1,400 per employee per year.
Before proceeding, employers must conduct a serious cost-benefit analysis to determine whether such tracking is worth the risk.
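One way to frame that cost-benefit analysis is as a simple expected-value comparison. The sketch below uses the hypothetical $1,400-per-employee estimate quoted above; the headcount, claim probability and claim cost are assumptions for illustration only, not figures from the article:

```python
# Illustrative expected-value framing of the cost-benefit decision.
# Only the $1,400/employee savings estimate comes from the article;
# every other figure here is a hypothetical assumption.
employees = 500
annual_savings = 1400 * employees        # vendor's estimated savings: 700,000

claim_probability = 0.02                 # assumed annual chance of a claim
expected_claim_cost = 250_000            # assumed defense/settlement cost
expected_risk = claim_probability * expected_claim_cost  # 5,000

net_benefit = annual_savings - expected_risk
print(f"expected net benefit: ${net_benefit:,.0f}")
```

The point is not the particular numbers but the discipline: an employer should be able to state the estimated savings, the probability of a claim and its likely cost before deciding that the tracking is worth the exposure.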
"There is a way to reduce risk and make it safer for employers," Bischoff says. "Have the vendor keep the information and never share it with you. Write it into the vendor contract that you are never to receive this information. Make it clear to employees that the data goes to third-party vendors and that you never handle any of it. Tell employees they are not to bring this Fitbit data to work."
Help Eliminate Bias
If used properly, analytics needn’t increase the risk of litigation. "Practitioners can reverse-engineer to constantly monitor against the algorithm to ensure it’s not creating a disparate impact," Bischoff notes.
In fact, big data can be harnessed to eliminate biases and minimize legal risk—and many employers are using it to that end. For example, HR professionals can aggregate employee data to identify hiring or pay disparities and then promptly act to correct them. In addition, they can call on data sets to disprove discrimination in cases in which employees do file suit. Using data, HR and business leaders can evaluate potential liability with mathematical precision, helping to inform decisions about whether to defend against a claim or to settle.
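The pay-disparity aggregation described above can be sketched in a few lines. The field names and salary figures below are illustrative assumptions, not data from the article:

```python
# Hedged sketch: aggregating pay by group to surface disparities,
# as the article describes. All records here are hypothetical.
from collections import defaultdict
from statistics import mean

employees = [
    {"group": "A", "pay": 72000},
    {"group": "A", "pay": 70000},
    {"group": "B", "pay": 64000},
    {"group": "B", "pay": 66000},
]

pay_by_group = defaultdict(list)
for e in employees:
    pay_by_group[e["group"]].append(e["pay"])

averages = {g: mean(p) for g, p in pay_by_group.items()}
gap = max(averages.values()) - min(averages.values())
print(averages)              # {'A': 71000, 'B': 65000}
print(f"pay gap: ${gap:,}")  # pay gap: $6,000
```

In practice this kind of summary would be run on real HRIS data and, as Bischoff suggests, re-run regularly so disparities are caught and corrected before they become litigation.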
Handled well, big data can clearly translate to big success.
Lisa Milam-Perez, J.D., is a legal editor/senior writer-consultant for Wolters Kluwer Legal and Regulatory U.S. in Riverwoods, Ill.