Are Bad Choice Options Biasing Your Research Results?
One of the important things researchers need to keep in mind when creating a questionnaire is how to frame the choice options, because the way options are worded can inadvertently affect responses.
Consider this scenario:
Imagine that the U.S. is preparing for an outbreak of an unusual virus, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed.
And your job as a researcher is to poll the public’s opinion.
You learn that one of the programs comes with certainty (i.e., a no-risk option): it will save 200 people for sure. The other program comes with risk: there is a 1/3 chance that everybody will be saved and a 2/3 chance that no one will be saved.
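(A quick back-of-the-envelope check: the no-risk program saves 200 of the 600 for certain, while the risky program saves, on average, 1/3 × 600 + 2/3 × 0 = 200 of the 600. In expectation the two programs are equivalent; the only difference is the uncertainty.)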
How would you describe these two options?
One researcher drafted the following options and asked people to choose between Programs A and B.
Program A: 200 people will be saved.
Program B: There is a 1/3 chance that all 600 people will be saved, and a 2/3 chance that no one will be saved.
The majority (68%) of participants chose Program A. Now you can report to your clients that most people would want to go with the no-risk program. Simple, right? Not so fast.
The same researcher described the programs in a slightly different way and gave them to a different group of people:
Program A: 400 people will die.
Program B: There is a 1/3 chance that no one will die, and a 2/3 chance that all 600 people will die.
When participants were presented with this revised set of programs, the result was the complete opposite: the majority of people (73%) chose Program B.
What might have happened here?
Program A in the first set and Program A in the second set are identical; they were just framed differently. Yet a majority of people chose A in the first set, and only about a quarter chose A in the second set.
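To spell out the arithmetic: with 600 lives at stake, "200 people will be saved" and "400 people will die" describe exactly the same outcome, since 600 - 200 = 400. The risky program is likewise unchanged: a 1/3 chance that everyone is saved is the same as a 1/3 chance that nobody dies.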
The way each program is framed, or worded, shifts people's attention to different aspects of the option. In the first set, people focused on the lives they could save; in the second set, people focused on the lives they would lose.
As a researcher, what could you have done differently?
My suggestion is to present both sets of information:
Program A: 200 people will be saved and 400 people will die.
Program B: There is a 1/3 chance that all 600 people will be saved (no one will die), and a 2/3 chance that no one will be saved (all 600 people will die).
Based on past research, 44% chose Program A when both sides of the information were provided. This seems like a more reasonable result. Decades of research on decision making tell us that people are, on average, somewhat risk averse (i.e., they dislike uncertainty), and therefore it makes sense that the choices were split roughly evenly, with close to half of the people choosing the option with certainty.
So, what does this mean to you?
Research professionals:
If you are a research professional, then you know it is extremely important not to bias participants' responses. However, that's much easier said than done. As I demonstrated above, a small change in wording can lead to dramatically different results. I encourage you to take a look at your last project and brainstorm how you might have framed some of the choices differently. Do you think you would have gotten different results with different wording? If so, which framing do you think would produce more accurate responses?
For the rest of the folks:
If you do not conduct survey work extensively, surveys might appear very simple and easy to run. However, as I have demonstrated above, it's very easy to end up with biased results. Here are a couple of tips to ensure you have unbiased data to rely on.
1. Be sure to know exactly how the data was collected.
This is important because knowing the context in which the data was collected helps you spot potential sources of bias.
2. Be sure to work with a research professional whenever you are conducting survey research, or even running a quick poll on your team.
I know it’s tempting to say “I’ll just put together a quick survey”. But if it’s not done right, it might lead you to rely on inaccurate data. As I always say to my clients, bad data is worse than no data.
Is there any survey work you’d like us to review? If so, contact us at namika@sagaraconsulting.com. We'll be happy to take a look for you.
Interested in learning more about how to apply behavioral economics to market research projects? Check out our online course on behavioral economics for market researchers (PRC eligible), offered in partnership with Research Rockstar.
References:
- Tversky, Amos, and Daniel Kahneman. "The framing of decisions and the psychology of choice." Science 211.4481 (1981): 453-458.
- Druckman, James N. "Evaluating framing effects." Journal of Economic Psychology 22.1 (2001): 91-101.