EEOC Panel Looks at Implications of Big Data in Workplace

By Kathy Gurchiek | October 17, 2016

The Equal Employment Opportunity Commission (EEOC) heard from Society for Human Resource Management member Eric Dunleavy, Ph.D., during a public meeting Thursday in Washington, D.C., on the use of big data in the workplace.

Big data relies on algorithms, "data scraping" of the Internet, and other means of evaluating tens of thousands of pieces of information about a person, and it is increasingly being used to make hiring and other employment decisions. It is an issue SHRM Online has previously reported on.

Dunleavy is director of the personnel selection and litigation support services group for DCI Consulting Group Inc. in Washington, D.C. He noted he is most familiar with big data tools used for recruitment and hiring. He was among a panel of experts appearing before the EEOC.

They discussed big data trends and technologies, the benefits and risks of big data analytics, current and potential uses of big data in employment, and how using big data may run afoul of equal employment opportunity laws. 

"HR has been successfully using data and HR metrics for decades," Dunleavy said in prepared comments submitted to the EEOC. "The ability to interpret information with which to make business decisions and recommendations is key to informing many workforce questions. SHRM HR certification materials point out that 'analytics have the potential to improve individual and organizational performance'" by:

  • Embedding workforce intelligence as a cornerstone in management decision making. This refers to using workforce data about employee preferences, habits, behaviors, and skills in management decision making.

  • Improving workforce planning and forecasting.

  • Shortening recruiting cycles.

  • Reducing recruiting and separation costs.

  • Retaining critical talent.

  • Driving succession planning.

  • Using on-demand insights to avoid costly mistakes regarding the workforce. These allow for real-time monitoring of important issues such as workload, time spent on tasks, and other job data.

  • Redirecting money from employee initiatives to more beneficial uses or programs. Data could show, for example, that HR is spending money on food at a monthly all-staff meeting when what employees really want are more comfortable chairs.

He acknowledged that little is known regarding how widely various big data practices are being used but pointed to a 2016 SHRM Foundation report that cited an Economist Intelligence Unit survey on the growing use of big data. The 2015 survey found that 82 percent of organizations plan to begin or increase their use of big data in HR over the next three years.

Dunleavy was among a panel of industrial psychologists, attorneys and labor economists discussing the use, and possible unintended consequences, of big data.

Panelists included:

  • Kelly Trindel, Ph.D., chief analyst, office of research, information and planning, EEOC.

  • Michael Housman, workforce scientist-in-residence, hiQ Labs, San Francisco.

  • Michal Kosinski, assistant professor, organizational behavior, Stanford Graduate School of Business (via remote connection), San Francisco Bay area.

  • Marko J. Mrkonich, shareholder, Littler Mendelson P.C., in the greater Minneapolis-St. Paul area.

  • Ifeoma Ajunwa, assistant professor, University of the District of Columbia School of Law.

  • Kathleen K. Lundquist, Ph.D., organizational psychologist, president and CEO, APTMetrics, Inc., a global HR consultancy in the greater New York City area.

Lundquist characterized big data, also known as predictive or talent analytics, as a "harvesting of a wide range of empirical data for HR decision-making" that presents a future she called both promising and scary.

"Algorithms used for recruiting often include data obtained by searching publicly available databases where the accuracy or completeness of the data may be questionable, leading to more missing and incorrect data in the selection process," she pointed out as one example in her prepared statement to the EEOC.

She expressed concern that an algorithm could match personal characteristics rather than job skills or requirements. A high-performing group, for example, may not be diverse, and using characteristics of that group to structure the algorithm "may more reflect their demographics than the skills or abilities needed to perform the job," she said in her prepared statement.

Others raised the possibility that big data could invade employee privacy, or run afoul of antidiscrimination regulations.

"It's not always foreseeable that a certain algorithm will access and deploy prohibited information in its decision-making processes," Ajunwa said. "[We] must remain alert for the potential for it to be used in ways that essentially violate the spirit of the law of antidiscrimination and in ways that could permanently erode worker privacy."

She urged putting safeguards in place so that the use of big data "doesn't become a shield for covert discrimination."

Describing himself as someone speaking "from the trenches, working with employers who spend millions of dollars to comply with [EEOC regulations]," Mrkonich said he sees big data making positive differences in employment.

"It expands the applicant pool beyond those who even apply, limiting the application barrier," he said. "Big data, used correctly, eliminates discrimination in many of its most egregious forms."

"For other employers, it's a way to eliminate the bias in the [hiring] decision process," he said. The anonymous nature of the data "makes it very safe and secure to use," he added.

Kosinski concurred.

Such tools "need to be used properly and be subject to the same principles of fair, valid and accurate assessment," he said. Kosinski encouraged the creation of guidelines for developing computational models that are accurate and free from bias.

Lundquist suggested the following best practices for employers:

  • Validate the predictive models' accuracy over time and with different employee segments.

  • Conduct a job analysis to ensure the algorithm is measuring the knowledge, skills and abilities related to job performance, rather than reflecting the demographic characteristics of current employees.

  • Examine the representativeness of the populations included and the accuracy and fairness of the data inputs on which the algorithm is based to ensure that all relevant data are both correct and inclusive.

  • Train and support managers on how to interpret and use these solutions to make decisions, and inform candidates about the use of information to avoid privacy concerns.
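Lundquist's first two recommendations, validating a predictive model across employee segments, often begin with a basic disparate-impact screen. A minimal sketch of one common screen, the EEOC's "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures (the group names and selection counts below are hypothetical):

```python
# Illustrative adverse-impact check using the four-fifths rule: a group's
# selection rate below 80% of the highest group's rate may indicate
# disparate impact. Group names and counts here are hypothetical.

def selection_rates(outcomes):
    """outcomes maps group name -> (number selected, total applicants)."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes):
    """Return True per group if its rate is at least 80% of the top rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (rate / top) >= 0.8 for g, rate in rates.items()}

outcomes = {"group_a": (48, 100), "group_b": (30, 100)}
print(four_fifths_check(outcomes))
# group_b's ratio is 0.30 / 0.48 = 0.625, below 0.8, so it is flagged.
```

A screen like this flags a possible problem but does not diagnose it; as the panelists noted, a job analysis and validation study are still needed to determine whether the model measures job-related skills.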

Commissioners and panelists agreed on the need for education in the creation of algorithms to avoid disparate impact on women and others, such as people with disabilities, and on the importance of including human resource professionals in the conversation.

"We need to give thought to what role does law play in this, given some of the more potentially complex issues on disparate impact," EEOC commissioner Chai R. Feldblum said. The EEOC is creating a working group, headed by Trindel, to study the issue.

The public may submit comments on this topic to the EEOC through Oct. 28. Comments may be mailed to Commission Meeting, EEOC Executive Officer, 131 M Street, N.E., Washington, D.C. 20507, or e-mailed to:


