Researchers Say New Study Method Catches Resume Bias

By Roy Maurer, February 17, 2020

Two professors from the University of Pennsylvania's Wharton School in Philadelphia believe they've significantly improved a common method for testing resume bias.

In previous studies, researchers typically sent employers fake resumes and drew conclusions based on how the employers reacted to them. But this approach can be problematic because the information is phony, employers may not respond to unsolicited resumes and recruiters will likely get upset when they discover that their time has been wasted.

Business economics and public policy professors Judd Kessler and Corinne Low, along with doctoral student Colin Sullivan, tested their method, called incentivized resume rating (IRR), in cooperation with employers.

"Rather than putting the interests of firms and researchers in conflict, IRR combines those interests," Kessler said. "We didn't have to trick employers into doing this," Low added, contrasting the team's study with typical audit methods.

The study was conducted between 2016 and 2018 and involved 72 employers of varying sizes and industries. Kessler and Low gathered real resumes from nearly 800 recent University of Pennsylvania graduates and designed an IRR diagnostic software tool that collected varied experiences, academic degrees and other characteristics from those resumes. The software then recombined the data to create thousands of fake resumes that clearly conveyed candidates' genders and ethnicities. Each participating recruiter (primarily female and white) knowingly reviewed 40 randomly assigned fake resumes, rating how much they liked each fictitious candidate and how likely that candidate would be to accept a job offer. Each recruiter was then matched with 10 real job seekers based on the preferences expressed in those ratings, the main innovation of the incentivized study method.
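The two-step pipeline described above, recombining components of real resumes into synthetic ones and then matching each recruiter to real candidates based on the recruiter's own ratings, can be sketched roughly as follows. This is a hypothetical simplification: the component pools, field names and similarity rule are illustrative, not the authors' actual software.

```python
import random

# Illustrative component pools; the real study drew these from
# nearly 800 actual University of Pennsylvania resumes.
MAJORS = ["computer science", "economics", "english"]
INTERNSHIPS = ["prestigious", "standard", "none"]
NAMES = ["Emily Walsh", "Greg Baker", "Lakisha Washington", "Jamal Jones"]

def make_synthetic_resume(rng):
    """Recombine real resume components into one realistic fake resume."""
    return {
        "name": rng.choice(NAMES),   # name signals gender/ethnicity
        "gpa": round(rng.uniform(3.0, 4.0), 2),
        "major": rng.choice(MAJORS),
        "internship": rng.choice(INTERNSHIPS),
    }

def match_recruiter(rated, real_candidates, k=10):
    """Return the k real candidates whose profiles best match the
    synthetic resumes this recruiter rated highest (the IRR incentive)."""
    def score(candidate):
        # Similarity here = mean rating the recruiter gave to synthetic
        # resumes sharing this candidate's major and internship tier.
        like = [r for res, r in rated
                if res["major"] == candidate["major"]
                and res["internship"] == candidate["internship"]]
        return sum(like) / len(like) if like else 0.0
    return sorted(real_candidates, key=score, reverse=True)[:k]

rng = random.Random(42)
fakes = [make_synthetic_resume(rng) for _ in range(40)]  # 40 per recruiter
ratings = [(r, rng.randint(1, 10)) for r in fakes]       # stand-in ratings
real_pool = [make_synthetic_resume(rng) for _ in range(100)]
matches = match_recruiter(ratings, real_pool, k=10)
```

Because the ratings directly determine which real candidates a recruiter is matched with, careless answers cost the recruiter, which is what aligns the firms' and researchers' incentives.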

"There's good reason for the recruiters to evaluate these hypothetical candidates carefully—their responses are used to match them with real candidates," Kessler said.

Working with employers also allowed Kessler and Low to collect data on which types of candidates employers truly found most desirable, rather than simply which candidates received a follow-up call from recruiters.

Call-backs alone are an imperfect gauge of employer preference because employers may be reluctant to pursue candidates who seem unlikely to accept a position. "Call-back rates may conflate an employer's interest in a candidate with the employer's expectation that the candidate would accept a job if offered one," Low said.

For example, according to research on resume selection studies, employers tend to call back recently unemployed candidates more than candidates who currently have jobs, she said. "I doubt the employers really want unemployed candidates more. Rather, it's about who they think is available and isn't going to be a dead-end."


Study Results

The results of the study were illuminating. Overall, participating employers were not found to be more or less interested in female and minority candidates, but evidence of discrimination against white women and minority men was found among employers looking to hire candidates with science, technology, engineering and mathematics (STEM) majors, Kessler said. "In addition, employers report that white female candidates are less likely to accept job offers than their white male counterparts, suggesting a novel channel for discrimination."

Some of the highlights of the study include:

  • Companies hiring for STEM roles rated candidates with female and minority names lower than candidates with white male names. On average, a female or minority candidate with a 4.0 GPA received the same rating as a white man with a 3.75 GPA. No race or gender disparity was found, on average, when employers were recruiting students in the humanities, social sciences and business fields, leading the authors to attribute the results to unconscious, or implicit, bias in STEM hiring.
  • Employers placed significant value on the quality of the internships candidates held prior to their senior year in college. Low noted that firms indicated that they would choose a candidate with a 3.6 GPA and a prestigious internship over a candidate with a 4.0 who didn't have that type of experience.
  • Female and minority candidates received less credit for prestigious internships in all fields. "It was quite a big effect," Low said. "Women and minorities only got about half the boost that a white man would have. One possible mechanism for this effect is that employers believe that other employers exhibit positive preferences for diversity, and so having a prestigious internship is a less strong signal of quality if one is from an underrepresented group."
  • Employers generally rated female and minority candidates lower in "get-ability," meaning they believed those candidates were less likely to accept a job offer. "Perhaps due to the prevalence of diversity initiatives, employers expect that desirable minority and female candidates will receive many offers from competing firms and thus will be less likely to accept any given offer," Low said. Or, employers may see female and minority candidates as less likely to fit in the culture of the firm, making these candidates less likely to accept an offer, she added.
  • Employers placed no value on students holding low-skilled summer jobs during their senior-year summer. "Students got no credit for being a lifeguard or working as a barista or being a cashier even though those jobs could actually build some really useful experience," Low said. "That tells us that it might be particularly challenging for students who come from lower socioeconomic backgrounds and need to work to earn money in the summers to get these top jobs."
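The "4.0 GPA rated like a 3.75 GPA" finding in the first bullet is a GPA-equivalent penalty: the rating gap attributed to the candidate's group, divided by how much one GPA point is worth in ratings. The sketch below illustrates that arithmetic on simulated ratings, not the study's data, and the single-variable linear fit is a hypothetical simplification of the authors' actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
gpa = rng.uniform(3.0, 4.0, n)
group = rng.integers(0, 2, n)  # 1 = female or minority name (simulated)

# Simulated ratings: each GPA point is worth 4 rating points, and the
# penalized group loses 1.0 rating point, i.e. the equivalent of 0.25 GPA.
rating = 2.0 + 4.0 * gpa - 1.0 * group + rng.normal(0, 0.5, n)

# Least-squares fit of: rating ~ intercept + gpa + group
X = np.column_stack([np.ones(n), gpa, group])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)

# Penalty in GPA units = (rating loss for the group) / (rating value of 1 GPA)
gpa_equivalent_penalty = -coef[2] / coef[1]
print(gpa_equivalent_penalty)  # recovers roughly the 0.25 GPA built in above
```

Expressing the gap in GPA units is what makes the result concrete for hiring managers: it translates an abstract rating difference into the extra academic performance an applicant would need to offset the name on the resume.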

Kessler said that the study showed that the IRR method can benefit organizations interested in measuring and improving their hiring practices. "Employers could have their hiring managers use the diagnostic tool and then work with researchers to analyze their data to identify whether bias is present. But they could also go beyond this and use the tool to help correct any bias," he said.

And as for thinking that the emerging array of artificial intelligence and machine learning tools being deployed among recruiting technology stacks will solve the problem of human bias, Low said to be wary. "Firms need to remember that if you have some of these biases, they're going to get hard-wired into the algorithm. You have to think very carefully about how to strip that out."
