Response Bias: Types and Prevention Methods


What is Response Bias?

Response bias (also known as survey bias) is the tendency of respondents to answer questions untruthfully or inaccurately. It often occurs when participants are asked to self-report on their behavior, but it can also be caused by poor survey design.

Respondents may or may not be conscious of the way in which they’re answering, but your data collection will be affected just the same.

Unfortunately, it’s not always possible to prevent or detect bias. But this article will help you identify the most common types when you create a survey.

Types of response bias

It’s important to be aware of the different types of response bias, as they can be pervasive in your research, regardless of your sample size.

1. Social response bias

Also known as social desirability bias, respondents affected by this will often over-report on good behaviors and under-report on bad behaviors.

Here are a few things that may be misreported:

  • Abilities and skills
  • Personality
  • Sexual behavior
  • Religion and spirituality
  • Financial earning
  • Unlawful behavior

Respondents answer this way to appear more socially desirable, e.g. by reporting a higher salary.

How to avoid social response bias:

The best way to address this form of bias is to assure participants that their responses will be anonymous. In doing so, you’ll encourage more open and honest feedback.

KwikSurveys is a privacy-first survey maker, meaning all responses are anonymous by default.


2. Non-response bias

Also known as participation bias, this occurs when a survey sample is not representative of the target population. In these cases, the opinions shared by respondents do not reflect those of the larger population.

This results in a biased data set and may skew your research outcomes.

How to avoid non-response bias:

To reduce the risk of this bias occurring, share your survey across a range of platforms and to as diverse a group as possible, e.g. social media, your website, and email invitations.

3. Prestige bias

Many use the term ‘Prestige Bias’ interchangeably with Social Desirability Bias, but we believe it to be its own form of survey bias. It arises when survey respondents are asked about their social, educational, and financial status, which tends to elicit inflated responses.

Respondents may want to be perceived to have:

  • More social power
  • An advanced level of education
  • A more favorable financial situation

Respondents affected by prestige bias will differ in their responses, as what is considered prestigious changes from culture to culture.

How to avoid prestige bias:

Neutrally worded survey questions will reduce the occurrence of Prestige Bias, but not eliminate it altogether. Wanting to be thought of as prestigious is a staple of the human condition.

4. Order effects

Response bias can be caused by the order of your questions. 

Here’s an example for employee satisfaction surveys:

If you ask employees about issues with their line manager before you ask how happy they are in their role, their answer to the second question will be influenced by their first response. If you reversed the order of those questions, their answer may well change.

There are two kinds of effects this type of bias has:

  • Contrast Effects: The order of questions results in greater differences between respondent answers.
  • Assimilation Effects: The order of questions results in answer selections becoming more similar between respondents.

How to avoid order effects:

This is one of the most difficult forms of response bias to avoid. Ordering your questions perfectly takes time, and not all survey creators have time to spare. 

However, if you run a pilot or test survey, you may be able to identify Order Effects and address the issues before you launch.

5. Recency bias

Recency Bias concerns those respondents who simply pick the last answer they read. Most often, respondents who show this tendency have already disengaged from your survey.

How to avoid recency bias:

The best way to avoid this is to ensure your survey design is up to scratch. But, occasionally respondents will disengage regardless of your hard work.

You can address this type of response bias by randomizing the order of your answer options. This won’t stop it from occurring, but it will distribute the bias evenly between answers.
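One way to implement this is a per-respondent shuffle of the option list. Here’s a minimal Python sketch (the helper name and option labels are illustrative, not part of any specific survey tool):

```python
import random

def randomized_options(options, rng=None):
    """Return a per-respondent shuffled copy of the answer options.

    Shuffling doesn't stop respondents from picking the last option
    they read, but it spreads that tendency evenly across all answers.
    """
    rng = rng or random.Random()
    shuffled = options[:]  # copy, so the master list is untouched
    rng.shuffle(shuffled)
    return shuffled

options = ["Red", "Green", "Blue", "Other"]
per_respondent = randomized_options(options, random.Random(42))
```

The same options appear for every respondent; only their display order changes, so the aggregate results stay comparable.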

6. Hostility bias

Asking respondents about unpleasant memories or negative experiences can provoke them into becoming hostile. Examples of this would be questions concerning divorce, debt, and death.

How to avoid hostility bias:

If you can, avoid sensitive subjects (or those which may elicit negative or hostile responses).

Alternatively, you could explain why you’re asking those questions and how you’ll use the survey data. This will prepare respondents for your line of questioning, reducing the shock factor of sensitive questions.

7. Satisficing

Satisficing is a blend of ‘satisfy’ and ‘suffice’, meaning ‘doing just enough to obtain a satisfactory outcome’.

In surveys, it describes respondents who meet only the minimum requirement: submitting a response.

Respondents who display this form of response bias are likely to leave questions unanswered or to answer dishonestly.

Types of satisficing

There are a few types of satisficing that may occur in surveys; we’ll go through them below:

7.1. Speed runs

Some respondents speed through surveys without paying attention to your questions.

These instances tend to occur when an incentive is being offered for participating in research. Respondents may rush through your questions just for a chance to win the prize.

However, it could also be a result of survey fatigue, where participants tire of a long survey and rush through to finish.

How to avoid speed runs in surveys:

Think carefully about whether to offer an incentive and ensure your survey design is up to scratch.
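If your survey tool records completion times, you can also flag likely speed runs after the fact. A minimal sketch, assuming total completion times in seconds; the two-seconds-per-question floor is an illustrative assumption, not a standard:

```python
def flag_speed_runs(durations_sec, n_questions, min_sec_per_question=2.0):
    """Return indices of responses completed implausibly fast.

    A response is flagged when its total duration falls below
    n_questions * min_sec_per_question (the assumed minimum time
    needed to actually read and answer each question).
    """
    floor = n_questions * min_sec_per_question
    return [i for i, d in enumerate(durations_sec) if d < floor]

# Four respondents' total completion times for a 20-question survey.
times = [180.0, 25.0, 95.0, 12.5]
suspect = flag_speed_runs(times, n_questions=20)  # floor is 40 seconds
```

Flagged responses are candidates for manual review or exclusion, not automatic proof of satisficing.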

7.2. Self-selection bias

This concerns respondents who intentionally insert themselves into a study. It’s similar to non-response bias in that the self-selected respondent pool gives a different set of responses than those who chose not to respond.

How to avoid self-selection bias:

Restrict access to your survey by using a password or distributing directly to your desired respondents.

7.3. Non-differentiation (straight-lining)

Where scaled questions are concerned, there’s a risk of respondents failing to differentiate between answer choices. In these cases, they may give identical, or similar, responses to all questions using the same scale.

E.g. If all your scales are measured ‘Very Bad’ to ‘Very Good’, they may choose ‘Good’ for every question.

How to avoid straight-lining:

If you need to use scaled questions repeatedly, reverse the scale for each question. This will force respondents to read the answers carefully each time, or at least evenly distribute the bias.


Scale 1:

  • Strongly Disagree – Strongly Agree

Scale 2:

  • Strongly Agree – Strongly Disagree

Or you could mix up the question types being used for each question. This will keep respondents on their toes.
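If your tool lets you script how scales are presented, the reversal above (and the recoding you’ll need before analysis) can be sketched as follows. This assumes a 5-point agreement scale coded 1–5; all function names here are illustrative:

```python
SCALE = ["Strongly Disagree", "Disagree", "Neutral", "Agree", "Strongly Agree"]

def displayed_scale(question_index):
    """Show the normal scale on even questions, the reversed one on odd."""
    return SCALE if question_index % 2 == 0 else SCALE[::-1]

def recode(raw_score, question_index, n_points=5):
    """Map a raw 1..n answer back onto the canonical direction.

    Reversed questions must be recoded before analysis so that a high
    score always means the same thing across the whole survey.
    """
    return raw_score if question_index % 2 == 0 else n_points + 1 - raw_score

# A '2' given on a reversed scale corresponds to a canonical '4'.
recoded = recode(2, question_index=1)
```

The key point is the recoding step: if you reverse scales in the survey but forget to reverse the scores afterwards, you’ll corrupt your own data.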

7.4. Acquiescence Bias

This is where respondents only select positive answer choices, e.g. ‘Yes’ or ‘Strongly Agree’. It is also known as ‘yea-saying’.

How to avoid acquiescence bias:

Acquiescence bias is very similar to non-differentiation bias, and so is the solution to prevent it.

Flip the extreme points of the scale for every other scaled question and randomize the order of answer options for multiple-choice questions.

7.5. Neutral answer selection

This concerns respondents who continually select the neutral answer options. These include options such as ‘Don’t Know’, ‘N/A’, and ‘No opinion’.

How to avoid neutral answer selection:

It’s best practice to include neutral or opt-out answers (‘N/A’ or ‘No Opinion’) in survey questions, so removing them isn’t an option (and neither is flipping the scale).

However, respondents may be selecting neutral answer options because your answer options don’t cover their opinion.

Maybe your scales lean too heavily towards positive or negative attitudes. For example:

  • Extremely Good
  • Very good
  • Good
  • Okay
  • Extremely bad

Above, respondents are presented with more positive choices than negative choices. They might not believe the subject of your question is extremely bad, but they have no other option to express a negative opinion.

Alternatively, your scales may have options that are too similar for respondents to distinguish between. For example:

  • Excellent
  • Brilliant
  • Okay
  • Awful
  • Terrible

This is a rudimentary example, but it highlights a common issue: if respondents can’t identify a clear difference between answer options, they’ll express a neutral opinion rather than risk answering incorrectly.

To avoid cases of neutral answer selection, ensure that the survey answers you provide are balanced and clear.

7.6. Extreme responding

This mostly concerns scaled questions (e.g. Likert scales/matrices), where options such as ‘Strongly Agree’ and ‘Strongly Disagree’ are available.

Respondents may simply provide extremity responses for each scale, rather than consider all the options.

It’s often a result of leading questions or loaded words, making people feel they need to fully agree or disagree.

How to avoid extreme responding:

Ensure your question wording is as neutral as possible so respondents can make an informed choice. If you’re still concerned about this type of response bias, reverse the scales on alternating questions to balance out the effect.

7.7. Primacy effects

This pertains to respondents that select the first available answer option for each question.

How to avoid primacy effects:

Randomizing the order of your answer options won’t stop respondents from picking the first option they see, but it spreads that tendency evenly, stopping your results from being unfairly weighted towards any one answer.

8. Sponsorship bias

When respondents are aware of survey sponsorship or branding, their perception of that organization can influence their survey responses.

How to avoid sponsorship bias:

If you have to include your sponsor’s branding, then save it for the thank you page. This way, respondents will still see the branding, but it won’t influence their choices.

9. Stereotype bias

Personal questions (such as those about technical skill, nationality, gender, or age) may reinforce or trigger stereotypes. This can lead respondents to fulfill that stereotype in their responses.

E.g. If a question implies that younger people are better with technology, that idea will influence their responses.

How to avoid stereotype bias:

Frame all questions and answers as neutrally as possible. Allow participants to explore their own ideas or opinions on the subject.

10. Recall bias

How we remember things isn’t always reliable or accurate. As the way we feel or think about something changes over time, our recollection of the past can become warped and we tend to align those memories with our current thoughts and beliefs.

E.g. If asking for event feedback too long after the event took place, attendees may not be able to remember what aspects were most useful.

How to avoid recall bias:

The best way to combat recall bias is to send surveys asking about an event or experience as soon as possible after it happens.

For example, if you’re running an event survey, you’ll want to contact your attendees within a few days of the event ending.

This way it will be fresh in their minds and is unlikely to be affected by recall bias.

If the event occurs more frequently, e.g. daily meetings, the timing of the survey matters less, because repetition improves recall.

11. Demand characteristics

This is where respondents alter their behavior to align with how they believe the ideal research participant should respond.

This is a large topic in itself, so you may want to read more about demand characteristics.

How to avoid it:

Distance participants from the hypothesis or aims of your research if you believe it could elicit this type of bias.

You can still share the aim of your research, but don’t tell respondents anything that could influence their answer choices.

Wrapping up

Response bias (survey bias) is a pervasive issue, but we’ve provided some great methods for avoiding it above.

You can further reduce response bias by following the best practices for conducting survey research.

We also highly recommend testing your survey with a pilot audience. This gives you the opportunity to review your test data set and identify any biases incurred by your survey design.
