Benchmark your B2B survey response rates
Within my account group at ICR, about 40% of my research portfolio is directed toward B2B audiences; that is, business owners and CEOs, IT directors, middle managers, and other "daytime" respondent sets. The other 60% is typically consumer audiences, contacted (if by telephone) in the evenings and on weekends.
Over the past few years, my once limited exposure to B2B research studies via the Internet has blossomed into quite a robust research function. Recently, I was working with a client of mine in the financial services industry, fielding a web-based satisfaction survey among their clients in the retirement plan services sector. The client asked, "Greg, it's been over 24 hours since our e-mail invitations went out, and we've gotten about a 9% response rate so far. How does this stack up against other studies' response rates, after the first day of activity?"
My reply noted that survey response rates are highly variable, so there are no hard-and-fast rules for benchmarking them. For example, our client on this occasion had mailed its contact list a pre-notification letter by U.S. Mail about a week before the e-mail invitation went out. That certainly reduced the e-mail deletion rate and helped the response rate. Not every client will take that important step, thinking that 39 cents per letter is too much to pay to increase response.
The retirement plan study included 950 outbound e-mails to the customer-side decision-makers and day-to-day managers of the plans. Sixty-nine of them were returned immediately as undeliverable. Thus, the net sample included 881 records.
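For readers who like to see the arithmetic spelled out, here's a minimal sketch in Python. The ~86 completes are back-solved from the client's 9% day-one figure (which I'm assuming was computed on the gross outbound count), so treat them as implied rather than reported:

```python
# Day-one arithmetic for the retirement plan study. The complete count
# is back-solved from the reported 9% rate, not tallied directly.
outbound = 950
undeliverable = 69
net_sample = outbound - undeliverable        # 881 deliverable records

day_one_completes = round(0.09 * outbound)   # ~86 implied completes
gross_rate = day_one_completes / outbound    # ~9.1% of all e-mails sent
net_rate = day_one_completes / net_sample    # ~9.8% of deliverable records

print(f"gross: {gross_rate:.1%}, net: {net_rate:.1%}")
```

The gross-versus-net distinction matters more as bounce rates climb; here it adds less than a point.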
We felt that response rates in that range were pretty much in line with other web-based studies we've conducted among B2B respondents.
For example, we had another study among casualty insurance underwriters who are members of a professional organization. Their 7,100 invites went out on a Monday. After 24 hours, we had about 470 completes; the second day saw another 340 interviews completed; then it dropped off to about 80 or 90 completes on Wednesday. After four days, 949 surveys had been completed on that sample: 13% overall, and 14% after discounting the undeliverable and "out of office" e-mail invitations.
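As a rough consistency check, the undeliverable and "out of office" count can be back-solved from those two reported rates. This hypothetical sketch does exactly that; the ~321 figure is implied, not something we tallied above:

```python
# Back-solving the bounce count for the underwriter study from the
# reported 13% gross and 14% net response rates.
invites = 7100
completes = 949
gross_rate = completes / invites                 # ~13.4%, the "13% overall"

implied_deliverable = round(completes / 0.14)    # ~6,779 good addresses
implied_bounces = invites - implied_deliverable  # ~321 undeliverable/"out of office"
net_rate = completes / implied_deliverable       # ~14.0% net of bounces
```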
One other study we did recently was with hotel managers who used our client's transaction-processing software and registers. There were only 107 pieces of sample, but respondents were heavily encouraged to participate by the hotel chain's senior management and by the software client, and two reminder notices followed the initial e-mails. Nine respondents completed in the first 24 hours (8%), and then completes trickled in (no more than 3 in any given day) for the rest of the three-week field period. The final tally was 39 completes, or a 36% overall response rate.
Our colleagues over at the e-Rewards Business panel say that when drawing up a cost estimate, they will assume a 10%-15% overall response rate on their B2B web surveys, and that's using a self-selected panel of participants!
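That 10%-15% planning assumption translates directly into how many invitations to budget for a given number of completes. A quick sketch, where the 400-complete target is purely hypothetical, for illustration:

```python
import math

def invites_needed(target_completes: int, assumed_rate: float) -> int:
    """Invitations to budget, given an assumed overall response rate."""
    return math.ceil(target_completes / assumed_rate)

# A hypothetical target of 400 completes at the ends of the 10%-15% range:
print(invites_needed(400, 0.10))  # 4000 invitations
print(invites_needed(400, 0.15))  # 2667 invitations
```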
Thus, I would say that our retirement services client's response rates so far are right in line with other B2B client surveys that we have conducted on the web. I'd be curious to learn what others in the industry tend to experience, so leave a Comment!
Tags: B2B, B2B surveys, response rates, surveys, web surveys