Poor employee survey design may be stifling your response rate and limiting the effectiveness of this HR tool.
Increasingly, employers are surveying their workers in an effort to create more effective, productive workplaces. But many of these surveys may not be giving employers a true picture of employees’ experiences. As the use of surveys becomes more widespread, employers must focus on survey design and execution so their efforts will yield useful information that pinpoints problems and helps management address them.
Employers are using ineffective methods to survey employees, says Palmer Morrel-Samuels, former professor at the University of Michigan and president of EMPA Inc., a consulting firm based in Ann Arbor, Mich. The results can include shrinking response rates as well as useless information, making the exercise an expensive waste of time.
“Surveys as a resource are being inappropriately pressed into service to answer questions that are irrelevant, trivial or not suitable,” says Morrel-Samuels. He cites a survey that asked employees how they felt about the company’s cafeteria service. “Surveys should be used for making important business decisions and finding out what employees have to say. There are far more pressing issues than whether there are enough lunch trays,” he says. In fact, he adds, asking irrelevant or trivial questions “actually reduces response rates. It turns people off.”
Designing, administering and appropriately using the results of an employee survey is a complex process, but consultants and HR managers offer the following suggestions for achieving success.
Take a Step Back
Whether you need to tune up an existing survey or build a new survey from scratch, you should start by asking why you are doing the survey in the first place. “For some clients, it’s a big ‘aha’ just to step back and ask that question,” says Patricia Bayerlein, a consultant with Chicago-based Matha-Macdonald, a consulting firm that works with Fortune 500 companies. Determining the “why” can help you “ask the right questions so that you get something that’s prescriptive and not just descriptive,” she says.
For Roche Diagnostics Corp. in Indianapolis, a division of F. Hoffmann-La Roche of Basel, Switzerland, the focus of a recent employee survey was employee turnover. Roche’s HR managers noticed a spike in employee departures in 2000 during the height of the dot-com boom. “It seemed like an open door,” says HR manager Elizabeth Gruszczyk, SPHR. She and her colleagues decided to implement a survey to quantify employee loyalty.
Working with Indianapolis-based consultant Walker Information in 2000, Roche asked employees within the four business areas that were experiencing the most turnover about day-to-day satisfaction, sense of achievement, fairness at work, trust placed in employees and communication. Gruszczyk followed up that pilot with a survey of all 3,500 Roche workers in 2001.
As part of the survey process, HR worked with Walker to develop a “commitment index” that measures employees’ responses to statements about employee loyalty, such as “I have a strong personal attachment to Roche” and “When Roche has problems, I think of them as my problems.” The responses are scored on a scale of 1 to 5, with 4 or 5 considered a strong indicator of a committed workforce, Gruszczyk explains.
The results have been good, she says. In 2001, the employee commitment index was 3.6. In 2002, that figure jumped to 3.8, with individual organizations in the company hitting 4.0 or higher.
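An index like Roche's boils many Likert-scale answers down to one number by averaging the 1-to-5 agreement scores across items and respondents. The sketch below illustrates that arithmetic; the function name, item wording, and all response data are hypothetical, not Roche's actual methodology.

```python
# Illustrative sketch of how a "commitment index" might be computed
# from 1-5 agreement scores. All data here is hypothetical.

def commitment_index(responses):
    """Average agreement scores across all items and all respondents."""
    scores = [score for employee in responses for score in employee]
    return round(sum(scores) / len(scores), 1)

# Each inner list holds one employee's ratings of loyalty statements
# such as "I have a strong personal attachment to the company."
survey_2001 = [
    [4, 3, 4],
    [3, 4, 3],
    [4, 4, 3],
]
print(commitment_index(survey_2001))  # a single companywide score
```

On this toy data the index works out to 3.6; tracking the same average year over year is what lets a company say the figure "jumped to 3.8."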
Sometimes a change in the organization necessitates a survey or a change in survey procedures. For example, AstenJohnson, a specialty textile manufacturer with U.S. headquarters in Charleston, S.C., had worked over the years with two consulting firms that developed employee surveys. But when AstenJohnson underwent a merger several years ago, the company switched to an in-house survey, says Jim Gray, SPHR, former vice president of HR.
AstenJohnson generally got good responses with surveys developed by outside consultants, but executives decided an in-house poll would better address merger-related issues, says Gray, a member of the Society for Human Resource Management’s (SHRM) Employee and Labor Relations Committee. The in-house version also gives the company the agility to revamp questions frequently and the flexibility to allow various operations to develop site-specific questions, both of which enable continuous feedback.
Keep It Short and Simple
Another factor that can influence a survey’s response rate centers on the kinds of questions asked. Most experts agree that including too many items—as survey questions are known in research circles—and including items that are confusing or repetitious can wreck a survey.
Instead, survey questions should be simple and short, using terminology familiar to all employees, says John Milatzo, research director for SHRM in Alexandria, Va. Milatzo helped develop the U.S. Postal Service’s first employee opinion survey in the 1980s while working there as an industrial psychologist.
A general employee survey should take 20 to 30 minutes to complete, experts say. “If the survey is too long, then the response rate will be very low,” says Morrel-Samuels.
Another risk of a long survey is a potential distortion of the answers. For example, if a survey seems to be taking too long to complete, the respondent may rush to finish, and the answers may be less reliable, Milatzo notes.
You should not ask employees to respond to “double-barreled” items—two topics that, while possibly related, should be considered separately. An example: “The pay and benefits are excellent at this company.” Employees’ responses may not yield useful information because they may think pay is great but benefits are not so good, or vice versa, leaving HR managers with no clear follow-up plan.
“If you don’t have good items, you don’t have actionable data,” says Van Latham, practice leader with PathPoint Consulting Inc. of Hopkinton, Mass., and a former HR executive with PepsiCo Inc.
Keeping it short can be critical in large manufacturing facilities, says Bayerlein. Production employees under pressure usually will choose getting a product out the door over filling out a lengthy survey, she notes.
Involve Employees in Design and Analysis
A number of companies use employee focus groups to help shape their surveys before rolling them out companywide. This process can help survey designers identify items that don’t work.
At the Postal Service, “we were starting from scratch,” Milatzo says. Focus groups of targeted employees helped him trim down a potentially bulky survey.
Focus groups also can help once a survey is completed, says Latham. If you have good processes for examining and using the data collected, you can still get good information from a weak set of survey items, he says. That may require convening employee focus groups after the survey to get to the heart of the issue.
Ask the Right Questions
Morrel-Samuels advocates the use of items that seek responses based on a numerical scale, such as 1 to 5, with 1 meaning “strongly disagree” and 5 meaning “strongly agree.” Survey results using numbered scales are much easier to analyze than those using words alone, he says. “The difference between 3 and 4 will always be 1. But the difference between ‘somewhat satisfied’ and ‘fairly satisfied’ is not only hard to quantify, it’s hard to analyze.”
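Morrel-Samuels's point is that numeric responses support ordinary arithmetic, while verbal labels do not. A minimal illustration, using made-up ratings on a 1-to-5 agreement scale:

```python
# Numbered scales support arithmetic directly: means, spread, and the
# gap between any two ratings. The item and responses are hypothetical.
from statistics import mean, stdev

# 1 = strongly disagree ... 5 = strongly agree
ratings = [2, 3, 4, 4, 5, 3, 4]

print(f"mean agreement: {mean(ratings):.2f}")
print(f"spread (std dev): {stdev(ratings):.2f}")
print(f"gap between a 3 and a 4: {4 - 3}")  # always exactly 1
```

No comparable computation exists for responses recorded only as "somewhat satisfied" versus "fairly satisfied," which is why word-only scales are hard to analyze.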
Some experts recommend asking primarily closed questions—those with a finite number of answers—instead of a fill-in-the-blanks approach.
Problems with open-ended responses include the volume of data generated and the difficulty in grouping and analyzing it. A survey dominated by open-ended questions would probably have been useless at a company the size of PepsiCo, Latham says. But this type of survey works fine for a client of his who has only 40 employees “because we can get our arms around the data.”
AstenJohnson’s survey includes a number of open-ended questions. Gray says a standard survey question for several years has been “Do you feel your situation is better today than it was two years ago? If not, why not?” While the information may be difficult to manage, the company looks for common issues to address. Gray says the company considers this approach useful. “The more comments we get, the more active our people are in communicating. Comments are good,” he says. “No feedback means there’s probably something terribly wrong.”
Roche’s survey includes primarily closed items, but it provides an opportunity at the end of the survey for employees to comment on whether something at the company should be done differently. Gruszczyk says this question yielded about 200 individual responses last year. The information is analyzed and managed at the business unit level.
Questions to Limit or Avoid
Another key to getting employees to respond year after year is asking questions that will yield answers management can act on. Employers should be willing to do away with, or at least limit, “nice-to-know” questions, including some demographic information.
Morrel-Samuels notes many employers as a matter of course ask for such information “to get as rich a description of the organization as possible.” But he urges employers to use caution in asking for demographic data, such as gender, race or age.
“Demographic questions should be restricted to those questions where the employer can actually do something about the results,” Morrel-Samuels argues. Employees could interpret questions about race or gender as an indication that the employer plans to initiate specific programs targeting those populations, he explains.
However, Latham says that given the focus on diversity, the differences in how men and women view the workplace, and the issues faced by older workers, you should ask demographic questions and analyze the data for insights into group concerns and trends.
Keep in mind that asking for demographic information may raise fears about anonymity, which could lower your response rates.
Over the past two years, for example, Roche has tweaked the demographic information it solicits from employees to organize survey responses based on where the employees fit into the business organization, Gruszczyk says. During the first year the survey was administered, there was confusion among some employees when they were asked to report where they belonged, because Roche’s business is complex.
Gruszczyk and her team tried to address this by being more specific in a recent survey, but that caused a new problem.
“We probably went too deep on the demographics this year, and probably made some people a little hesitant to respond” because they feared loss of confidentiality, she says.
Morrel-Samuels also advocates designing surveys to address directly observable behavior. Asking employees to respond to statements about personalities, thoughts or attitudes is an expensive waste of time, he says.
The statement “My boss respects me as an individual” is open to interpretation. Morrel-Samuels says a better item would be “My boss humiliates me in public,” because the item is grounded in a specific event. “If a survey is designed to address directly observable behavior, there are clear grounds for discussion and clear paths for remedial action,” he states.
Surveys also should be sprinkled with negative statements, says Morrel-Samuels. If a survey is filled with positive statements such as “My boss is considerate” or “My team is helpful,” the results can be unrealistically rosy, he notes.
Milatzo also advocates the use of specific questions, i.e., those that ask about observable behavior, to avoid the “social desirability” syndrome, the tendency to give all positive responses to please the inquirer.
While designing a good survey is key to getting useful information, doing something with that information afterward is just as important. Companies should communicate results to employees as soon as possible, along with any plans to implement changes.
Both Gruszczyk, at Roche, and Gray, formerly with AstenJohnson, say their surveys have helped management underline the link between employee satisfaction and customer satisfaction.
Gruszczyk says she and her colleagues feel they’ve demonstrated their employee loyalty survey “isn’t an HR exercise. This is a business response to some things that we needed to address.” This has led to management support throughout the company, she says. “Those loyal behaviors are going to show up when [employees are] dealing with our customers, and that’s going to increase your business results.”
But follow-up action is key. “If you don’t have the time or the resources to give feedback to the employees about the results, and follow up and do something about it, you’re almost better off not doing the survey, because it raises expectations,” says Gruszczyk.
Latham agrees. “You can have a beautifully designed survey, but if you don’t do anything with the data, then you’re wasting everybody’s time,” he says.
Charlotte Garvey is a freelance writer, based in the Washington, D.C., area, who reports on business and environmental issues.