Missing survey data is not a lost cause. You just need to know what to look for, and then what to do with that information, or, quite frankly, that non-information.
Before analyzing and reporting employee survey data, it’s essential to understand the extent to which data is missing at the question, category, and survey levels. Let’s focus on the latter: for data to be missing at the survey level, an employee must not have answered a single survey question. Response rate, the percentage of invited employees who answered at least one question, is the best metric for understanding survey-level missing data. On average, large enterprises can expect a response rate between 72% and 88% for census surveys and between 55% and 81% for pulse surveys.
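As a minimal sketch, survey-level response rate can be computed from an invite list and a question-level response log; the employee IDs and question IDs below are hypothetical.

```python
# Survey-level response rate: the share of invited employees who
# answered at least one question. All IDs here are illustrative.
invited = {"e01", "e02", "e03", "e04", "e05"}

# One (employee, question) pair per answered question; e04 and e05
# answered nothing, so they are survey-level non-responders.
answers = [("e01", "q1"), ("e01", "q2"), ("e02", "q1"), ("e03", "q3")]

# An employee counts as a responder with even a single answer.
responders = {emp for emp, _ in answers}
response_rate = len(invited & responders) / len(invited)
print(f"Response rate: {response_rate:.0%}")  # 3 of 5 invited -> 60%
```

Note that the responder set is intersected with the invite list, so stray records from uninvited employees would not inflate the rate.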
Ideally, survey-level responses are missing at random; that is, there is no clear pattern distinguishing who responded from who didn’t. When a pattern does exist, however, it can be a sign of response rate bias: overall results become unrepresentative because subgroups differ in both their response rates and their favorability.
For one large healthcare system, our people analytics and insights team found an example of response rate bias: across three annual census surveys, black employees’ response rates averaged 12 percentage points lower than white employees’, and the gap persisted even after controlling for other factors such as job role.
The team also found that, among those who responded, black employees were significantly less likely to be highly engaged than white employees, and this “engagement gap” was widening over time. Such differences in response rates and favorability must be taken into account when making system-wide and demographic comparisons. The team next analyzed potential ways to increase response rates among black employees and discovered that receiving recognition (an e-card) prior to survey launch increased the likelihood of responding by five percentage points. This healthcare customer is now placing increased focus on employee recognition to mitigate response rate bias.
What we really want to know about non-responders is how they would have responded to our survey questions. This is extremely difficult, if not impossible, to predict, and we are left wondering about their reasons for not responding: Are they disengaged, apathetic, concerned about confidentiality, skeptical about how the business will use (or not use) responses, busy, out of the office? All is not lost with non-responders, however. Linkage analysis, which ties employee survey data to business outcome data, can provide additional insights for smarter business decisions.
All too often in linkage analysis, non-responders are wrongly excluded. Analyzing turnover data for a large media customer one year after survey administration, our people analytics and insights team found that the company retained 91% of highly engaged employees, 85% of somewhat engaged employees, and only 71% of least engaged employees. This by itself told a compelling story: if all employees had been highly engaged, the company would have reduced its annual turnover rate from 13.6% to 8.7%, a 36% reduction. But nearly 1 in 5 employees didn’t respond to the census survey. The organization retained 85% of non-responders, identical to the retention rate of its largest group, the somewhat engaged. Although one cannot draw direct conclusions about non-responders’ engagement, analyzing them against data external to the survey, such as performance, sales, or turnover, can provide greater clarity about their likely levels of engagement.
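The turnover arithmetic above can be verified in a few lines, using only the figures stated in the text; the variable names are my own labels for them.

```python
# Check of the relative turnover reduction implied by the case figures.
observed_turnover = 0.136   # actual annual turnover rate (13.6%)
projected_turnover = 0.087  # projected rate if all were highly engaged (8.7%)

reduction = (observed_turnover - projected_turnover) / observed_turnover
print(f"Relative reduction: {reduction:.0%}")  # -> 36%
```

The 36% figure is a relative reduction in the turnover rate, not a 36-point drop; keeping that distinction explicit avoids a common misreading of linkage results.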
Missing data is the rule, not the exception, in employee listening and people analytics research. And although missing data is frustrating to deal with, reducing your statistical power and potentially biasing your sample’s representativeness and your insights, all hope is not lost, especially when you analyze non-responders against metrics outside of engagement.
Wondering how your survey response rates compare to industry averages, or how you can minimize missing data from non-responders? Read our earlier post, “How to Encourage and Optimize Employee Survey Participation.”