8 MIN READ

What is an Average Survey Response Rate?

Emily James

    A survey response rate of 80% or over is considered the gold standard for survey research, but researchers rarely achieve it, given the nature of market research and its respondents. It is therefore a researcher’s job to make the survey as easy and engaging as possible to complete, so that respondents can become thoroughly immersed in the task at hand.

    But what is an average response rate, and what factors determine the response rate of a survey?

    Defining the Average Survey Response Rate

    An average survey response rate is pretty much what it says on the tin: the average proportion of invited participants who complete a given survey in full.

    This rate is calculated from two variables: the number of invitees and the number of responses. The response rate is the number of responses divided by the number of invitees, and it is typically reported as a percentage by multiplying the result by 100.
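    As a rough illustration, the calculation can be sketched in a few lines of Python (the function name and figures below are hypothetical, purely for demonstration):

```python
def response_rate(num_responses: int, num_invitees: int) -> float:
    """Return the survey response rate as a percentage."""
    if num_invitees <= 0:
        raise ValueError("The number of invitees must be greater than zero.")
    return (num_responses / num_invitees) * 100

# Hypothetical example: 150 completed responses from 500 invitees.
print(response_rate(150, 500))  # 30.0
```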

    Determining Factors

    I was once asked to complete a 120-question survey, mostly made up of multiple-choice questions, but there were so many open-ended questions to get through that it took me an hour of picking it up and setting it down for breaks to finish it. Not the best survey I’ve taken, but unfortunately not the worst either. Many factors determine the response rate of a survey: the length, the tone, the phrasing, and the answer options, among others, can all affect the resulting response rate.

    As my little anecdote above makes obvious, length is a very important factor in a survey. If participants are part of a long-term study, or even a short-term but intense one, the number of surveys they receive and complete will affect their willingness to take part and the effort they put in. The types of question within a survey also determine how long it can reasonably be: a survey composed of closed or multiple-choice questions can stand to be a little longer than one composed purely of open-ended questions that require long, written answers.

    The phrasing of the questions and the tone of the survey are also very important. Clear, concise questions are easy to understand and far better than long-winded or flowery ones, in which the meaning or the instructions can easily be lost.

    Once the survey has been created, there are a few ways it can be delivered, and choosing the best one for the sample will increase the response rate. There is no single best time to send out surveys, but the gap between them will affect the response rate of those delivered later. Intense, data-collection-heavy projects tend to include a large number of research tasks designed with quantity over quality in mind, and respondents will see that immediately.

    Respondents can tire easily of research, especially if they are fitting it around a busy schedule. Leaving a reasonable gap between research tasks provides a rest period and helps respondents feel comfortable rather than pressured when an email or notification comes through with yet another survey.

    Improving a Response Rate

    Luckily, there are ways in which researchers can improve survey response rates. After identifying the factors within a survey that prevent a participant from completing it, researchers can fix any issues they find by focusing more on the participant experience.

    The best place to start is the initial design phase of the survey. Consider how the survey will provide data that answers the research objectives, and then what the survey needs to do to achieve this. Some factors to remember at this stage are:

    1. Only collect the data the client needs. Collecting more data than necessary poses a very real data security risk and turns the survey into something lengthier than it needs to be.
    2. Think about any surveys the participants have completed recently: how can you make this one different and engaging?
    3. On what device will participants be completing this survey? Optimising surveys for mobile, tablet, and desktop ensures that however a participant completes the survey, they have the same effective user experience as everyone else.
    4. Are all possible answers available to the respondent? If respondents cannot find an answer they feel is accurate or relevant to them, they will quickly lose interest and pick any answer, which results in inaccurate data.

    Once the survey is complete, there are further ways researchers can encourage responses. Incentives, for example, are a researcher’s not-so-secret weapon. The type of incentive needed depends on the survey’s audience, but generic incentives that work fairly well include monetary rewards and free products or services; gift cards are a safe way of delivering money to winning respondents.

    There is a misconception that the more participants you send a survey to, the more responses you are likely to get, but that only holds if the survey is well designed in the first place. The more respondents you send a badly designed survey to, the larger the drop-out rate will be and the fewer insights you are likely to gain, no matter how good the incentive may be.

    In Summary

    An average survey will generally receive around a 30% response rate, depending on the method of delivery. The source behind this statistic also states that internal surveys receive higher response rates than external surveys, and that the rate will increase by 14% within 3-7 days of the survey being sent.

    However, each survey’s response rate will differ depending on the factors outlined above, and it is a researcher’s responsibility to make sure that every survey sent out is the best it can be, in order to gather enough accurate, high-quality data to draw insights that meet the research objectives.
