Effective Practices for Increased Survey Participation

There are several practical methods that can raise survey response rates by increasing participant “buy-in,” streamlining survey design and implementation, and formulating cost-effective survey participation incentives.


  • Survey participation and response rates have been declining for years.

  • To improve response rates, most researchers and commentators agree on the general suggestions of (1) shortening the survey, (2) increasing participant “buy-in” to the survey’s purpose, and (3) providing carefully crafted participant incentives.

  • Businesses can and should consider leveraging certain technologies and techniques to streamline survey design and administration, personalize content to targeted subgroups of customers, and formulate cost-effective incentive programs.

It is common knowledge that businesses frequently use customer surveys to gauge and enhance customer experience and, ultimately, profitability. Some empirical studies have concluded that it is five to ten times more profitable to sell to existing customers than to market to new ones, and that a mere 1% increase in customer satisfaction can translate into a 20% increase in profitability. Studies have also found that the converse is true -- bad customer experiences (and the dissatisfied customers they produce) can decrease profitability in a variety of ways. That is why customer feedback is so important to a company’s success.

Obtaining customer feedback through surveys is an obvious, but deceptively complicated, endeavor. Designing a quality survey with well-crafted, insightful questions is only the start. The other challenge is to maximize the number of customers who participate in (and complete) the survey. Maximizing this participant “response rate” ensures a statistically appropriate sample size and thus a more reliable, less biased result. Poor response rates can create statistical biases so great that they invalidate the survey, meaning that it will not provide reliable, actionable customer feedback.

An increasing problem in recent years is the marked decline in customer survey response rates. Busier work/life schedules, information overload (from junk mail, spam, and other requests for attention), and the fear of disguised “sales pitches” all contribute to this issue. To combat declining survey response rates, researchers and commentators often suggest broad goals: (1) communicating a sense of common purpose with survey participants, (2) keeping surveys short and simple, and (3) providing survey participation incentives. Unfortunately, these same researchers often stop short of explaining “how” to accomplish those laudable goals. This whitepaper discusses some specific, practical tools that can be used to increase participation rates through increased “buy-in,” streamlined survey design, and the development of effective survey participation incentive programs.

One often-cited cause of low survey response rates is that participants typically don’t know why they are being asked to share their feedback. Customers do not respond to survey invitations because they are never given an answer to the age-old question, “What’s in it for me?” While companies typically seek data-driven customer feedback to achieve higher profitability, survey participants do not always understand how the survey will benefit them. Ironically, consumers are not always informed that by helping a business through survey participation, they are directly helping improve their own experience.

There are several easy ways to obtain survey participant “buy-in.” The most obvious are informing the survey participants about the purpose and goals of the survey, accurately framing how much effort survey participation will take, and explaining how the participant will benefit. Research suggests that consumers are often willing to help a business improve its customer experience by participating in a survey, simply because they clearly understand its purpose. A clear message in the invitation to participate (whether presented in person or online) about the specific purpose and goals of the survey can, by itself, meaningfully increase response rates.

In addition, studies have shown that participation rates in customer surveys can be increased by simply offering to share the findings of the survey upon its conclusion. Allowing limited access to the results of the survey, updating participants on the changes implemented because of the survey, or both can produce a sense of “common purpose” between the business and its responding customers. Under this approach, responding consumers are more likely to perceive the survey process as something like voting - a voice to make a positive difference - rather than as just another distraction requesting their time and attention with no perceived personal benefit.

In a perfect world, customer surveys could be extensive and provide a wide array of in-depth insights about every aspect of the customer experience. Unfortunately, research shows that longer surveys significantly increase both non-participation rates and abandonment rates among participants. This places survey questionnaire brevity and simplicity at a premium. When it comes to effective survey design, one is reminded of the famous quip, often attributed to Mark Twain but originally Blaise Pascal’s: “If I had more time, I would have written a shorter letter.” Designing and administering a succinct, simple, and effective survey questionnaire is key to increasing participation rates, but it is also a difficult and time-consuming task.

Fortunately, in the era of “big data,” it has become easier to mine important customer information without burdening a survey participant with basic “who, what, where, and when” questions.

A highly effective tool for shortening survey length is integrating “point of sale” (POS) data into the survey process. These analytical systems marry point of sale data with the critical questions that form the core purpose of the survey itself, relieving participants of the burden of answering basic contextual questions. For example, point of sale data (rather than survey participants) can answer basic contextual questions about the date and time of the customer’s interaction, the method and amount paid for a good or service, and even the identity of the specific employee facilitating the transaction. This allows the survey questionnaire to focus attention on only those questions key to improving the customer experience.
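As a sketch, the integration described above can be as simple as copying contextual fields from a POS transaction record into the survey payload, so the questionnaire itself asks only experience questions. All field names below are illustrative assumptions, not the API of any specific POS system:

```python
# Hypothetical sketch: pre-fill survey context from a POS transaction record
# so participants never see basic "who, what, where, when" questions.

def build_survey_context(pos_record: dict) -> dict:
    """Extract contextual answers from a POS transaction record."""
    return {
        "visit_datetime": pos_record["timestamp"],       # when the interaction happened
        "payment_method": pos_record["payment_method"],  # how the customer paid
        "amount_paid": pos_record["total"],              # how much was paid
        "employee_id": pos_record["cashier_id"],         # who handled the transaction
    }

# The questionnaire itself is then limited to experience-focused questions.
EXPERIENCE_QUESTIONS = [
    "How satisfied were you with your visit?",
    "How likely are you to recommend us?",
]

record = {"timestamp": "2024-05-01T12:30", "payment_method": "card",
          "total": 27.50, "cashier_id": "E-104"}
survey = {"context": build_survey_context(record),
          "questions": EXPERIENCE_QUESTIONS}
```

The point of the sketch is the division of labor: the POS record answers the contextual questions automatically, and the participant only sees the two experience questions.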

Further, the use of POS data to augment traditional surveys can also streamline the process of administering the survey. A point of sale survey invitation can (and should) be carefully designed and consistently delivered to facilitate customer “buy-in.” POS integrated survey systems can allow the automation of survey administration, ensuring consistent messaging during the invitation phase. Again, several studies clearly indicate that response rates are increased by simply including a clear message during the survey invitation conveying the purpose of the survey, how long (or, better yet, short) it will take the participant to complete it, and explaining the “value proposition” to the customer of participating in the survey.

Researchers have concluded that the timeliness of survey administration is another key driver of increased response rates. The decline in response rates during the two-week period following a customer’s interaction, as demonstrated by various studies, is staggering. POS integrated surveys can be conducted during an optimized range of times proximate to the customer’s experience, and through a variety of different media. For example, POS integrated surveys can be taken onsite at the time the customer transaction is concluded (using an iPad, for example), at a time chosen by the customer through a web-based survey link on a receipt (using, as is increasingly common, a mobile device), or even later through an e-mailed survey, if preferable (assuming an email address was provided by the customer).

Another advantage is that proper analysis and integration of POS data with customer surveys can lead to an increased level of survey “personalization.” With proper systems, POS data can be quickly and effectively analyzed to delineate specific target groups (e.g., only customers who purchased a certain product) or to exclude certain groups (e.g., prior survey participants). Targeting carefully customized, well-designed questions to specific sub-groups of customers can significantly boost response rates and data quality. This is due, in part, to the concept of “salience,” whereby participants perceive survey questions as more important or more relevant to their experience. Conversely, if survey participants do not perceive the contents of a survey as germane to them and their experience, they are far less likely to participate in and complete the survey. In addition to providing data used to improve future experiences, leveraging POS data in customer surveys can allow a company to quickly “flag” poor customer experiences and route them through a more customized complaint mitigation process.
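A minimal sketch of this kind of segmentation, assuming hypothetical POS records with `customer_id` and `items` fields (both names are illustrative, not from any particular system):

```python
# Illustrative sketch: use POS purchase history to target a sub-group
# (buyers of a given product) while excluding prior survey participants.

def select_participants(pos_records, target_product, already_surveyed):
    """Return customers who bought target_product and were not surveyed before."""
    return [
        r["customer_id"]
        for r in pos_records
        if target_product in r["items"] and r["customer_id"] not in already_surveyed
    ]

records = [
    {"customer_id": "C1", "items": ["espresso machine"]},
    {"customer_id": "C2", "items": ["coffee beans"]},
    {"customer_id": "C3", "items": ["espresso machine"]},
]
invitees = select_participants(records, "espresso machine",
                               already_surveyed={"C3"})
# invitees -> ["C1"]  (C2 bought a different product; C3 was already surveyed)
```

The same filter structure can express the “flagging” idea from the paragraph above, e.g. selecting transactions whose satisfaction score fell below a threshold for routing into a complaint-mitigation process.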

It is not news that participation incentives increase survey response rates. There are plenty of experiments, studies, and research projects on the subject. Most studies indicate incentives can increase response rates by over 15%, assuming the target participants perceive value in the incentive offered. Commentators generally break incentives down into two primary types: financial (such as cash, gift cards, and coupons) and material (such as gifts and prizes other than money). These studies tend to further characterize incentives by when they are offered: prepaid incentives (paid to participants before, and regardless of, whether they participate in or complete the survey), promised incentives (provided to participants upon completion of the survey), and raffles and lotteries (typically involving participants being entered in a drawing for a larger prize). The majority of research reveals that prepaid and promised financial incentives are the most effective in raising response rates, while lottery and raffle incentives raise response rates to a lesser degree. Compared to monetary incentives, material incentives tend to be less effective regardless of when they are offered, unless the non-monetary incentive is highly valued by the target group of participants. In general, researchers also find that when provided an incentive first, participants feel an obligation to reciprocate by participating in the survey.

While it is clear that incentives increase response rates to varying degrees, the more practical questions governing incentives are (1) what, and how much, incentive should be offered, and (2) will the incentive create bias or “bad data”? A good place to start is carefully considering the target audience and the nature of the survey. Is the survey salient to your targeted participants? Is it burdensome? What will participants value as an incentive? These kinds of factors will influence the effectiveness of a given incentive program. For instance, motivating a group of well-paid professionals to respond to a long survey will likely require a much more valuable incentive than would a short survey of the general population.

The budget for the survey must also be considered as a limiting factor in the type, timing, and amount of a survey participation incentive. Some have suggested that for small business surveys (with fewer than 500 participants), a higher incentive (e.g., $10 to $50) is required to achieve a response rate high enough (e.g., 40% or more) to ensure an adequately representative sample, while for larger survey populations a lower incentive may suffice. Response rates for individuals, as contrasted with companies, have been shown to increase significantly with even a small prepaid monetary incentive (such as including a dollar bill in a mailed survey). This does not mean, however, that offering huge prepaid incentives is the easy answer. When designing incentive programs, the dual objectives are cost-efficiency and effectiveness.
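One hedged way to weigh these trade-offs is to compare incentive options on expected incentive cost per completed response. The figures below are purely illustrative assumptions, not drawn from any cited study:

```python
# Sketch: expected incentive cost per completed survey response.
# Prepaid incentives are paid to every invitee; promised incentives
# are paid only to those who actually complete the survey.

def cost_per_complete(incentive_cost, response_rate, invitations, prepaid=False):
    """Expected incentive spend divided by expected completed responses."""
    completes = invitations * response_rate
    paid_to = invitations if prepaid else completes
    return (incentive_cost * paid_to) / completes

# Promised $10 gift card at an assumed 40% response rate over 500 invites:
promised = cost_per_complete(10.0, 0.40, 500, prepaid=False)   # $10.00 per complete
# Prepaid $2 incentive at an assumed 25% response rate over 500 invites:
prepaid = cost_per_complete(2.0, 0.25, 500, prepaid=True)      # $8.00 per complete
```

Under these made-up numbers the small prepaid incentive is cheaper per completed response despite paying every invitee, which illustrates why the research cited above finds modest prepaid incentives surprisingly competitive - though the comparison always depends on the response rates each option actually produces.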

A participation incentive program should be designed to generate the highest increase in response rates for the lowest marginal cost while also maintaining data quality. For example, if a monetary participation incentive is disproportionately large relative to the burden of the survey, many will participate merely for the incentive, without providing thoughtful answers. At some point, large incentives can be considered “coercive” and can taint data quality. Likewise, if the perceived value of an incentive is greater for one sub-group of participants than for others, it can create biases that call the survey’s data quality into question. An easy example is the expected demographic bias in favor of female participants if a non-monetary incentive of lipstick is offered in exchange for survey participation.

There is no single answer for calibrating survey participation incentives. While financial and material incentives both work, the target audience, the nature and design of the survey, the logistics of delivering the incentive to participants, the burden the survey places on participants, and budgetary limitations must all be weighed.

If you are interested in learning more about the creation of effective customer experience surveys please contact the team at AfterWords.