Why journalists should look at question order when covering survey and poll results

We explain how question order bias can affect the way people answer survey questions. We also offer five tips to help journalists spot the problem.


Researchers conduct surveys and polls to collect a wide variety of information, including how many Americans own guns, how substitute teachers feel about working at certain schools and which presidential candidate U.S. voters prefer. Not only does this type of data help researchers answer pressing societal questions, but government agencies also often rely on it when designing policies or prioritizing projects.

However, polls and surveys vary in quality, sometimes significantly. Because biased data is bad data, journalists need to be aware of factors that can influence results. A key source of bias is question order — the sequence in which researchers ask questions can elicit different responses, a phenomenon experts refer to as question order effect or question order bias.

An example: When researchers asked a nationally representative sample of U.S. adults their opinions on corporal punishment of children, people responded differently depending on whether they were first asked a question about domestic violence.

Political psychologist David C. Wilson analyzed the results of the survey, which Princeton Survey Research Associates International and Princeton Data Source conducted in late 2014. Nine hundred participants answered questions that varied in order.

When participants were asked whether it is sometimes acceptable for a parent to spank a child, 86% agreed or strongly agreed, Wilson writes in a 2018 paper. But when other participants answered a question about the acceptability of a man hitting a woman before answering one about corporal punishment, 70% agreed or strongly agreed it is sometimes acceptable for a parent to spank a child.

People responded differently because thinking about domestic violence first “affects how individuals think about corporal punishment by priming the similarities between the two,” writes Wilson, the dean and a professor at the Goldman School of Public Policy at the University of California, Berkeley.
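
To make the mechanics of a split-ballot comparison like this concrete, here is a minimal sketch in Python. The response counts are illustrative placeholders chosen to echo the published percentages, not Wilson’s actual data, and the helper function is hypothetical.

    # Minimal split-ballot comparison using only Python's standard library.
    # The counts below are placeholders echoing the published percentages,
    # not the actual responses from Wilson's study.

    def agreement_rate(responses: list[str]) -> float:
        """Share of respondents who agreed or strongly agreed."""
        agree = sum(r in ("agree", "strongly agree") for r in responses)
        return agree / len(responses)

    # One group saw the spanking question first; the other answered a
    # domestic-violence question beforehand.
    spanking_first = ["agree"] * 86 + ["disagree"] * 14
    domestic_violence_first = ["agree"] * 70 + ["disagree"] * 30

    gap = agreement_rate(spanking_first) - agreement_rate(domestic_violence_first)
    print(f"Question order gap: {gap:.0%}")  # prints: Question order gap: 16%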

Wilson has written several academic papers investigating question order bias. In an earlier paper, he and three colleagues examined the results of a Gallup poll about affirmative action, conducted with 1,385 U.S. adults. They found that participants were more likely to support affirmative action programs for racial minorities if they were first asked a question about affirmative action programs for women.

The paper also reveals that people were less likely to support affirmative action programs for women if they were first asked about affirmative action for racial minorities.

“The results suggest that for the American public as a whole, support for one type of AA [affirmative action] program is indeed affected by whether that program is considered by itself or in the context of both types of AA programs,” Wilson and his coauthors write.

Tips for journalists

Journalists covering polls and surveys need to scrutinize the questions asked and look for evidence of question order bias. Below, we’ve outlined five tips to help journalists avoid reporting on problematic surveys and flawed results.

Two experts helped us create this tip sheet: Scott Keeter, a senior survey advisor at Pew Research Center, and Chase Harrison, associate director of the Program on Survey Research at Harvard University’s Institute for Quantitative Social Science.

1. If the results of a poll or survey differ significantly from what others have found, examine its methodology — including its question order.

Public opinion can change quite a bit over a short period, but it usually doesn’t, Harrison says. If a poll or survey shows a large, sudden shift in people’s feelings, attitudes or habits, that could be a signal that something in the questions prompted people to answer them differently than they ordinarily would.

“Quite often, when I hear or see a result that doesn’t match what my general understanding of the issue is, I go back to look at the questions and there often is a question order effect,” Harrison says.

If the results differ substantially from what other surveys have found, Harrison recommends journalists contact researchers with expertise in the survey subject. They will know whether the findings are surprising or unlikely. Meanwhile, experts in survey research methods can help journalists evaluate the questions.

2. Get a copy of the questions in the order they were asked.

Many researchers and research organizations provide the list of questions they asked along with their results. For example, the Pew Research Center recently conducted a survey to determine the percentage of U.S. adults who regularly go to TikTok for news. Pew included a link to its survey questions and participants’ answers at the bottom of a Sept. 17 report that explains the survey results.

If you have trouble finding the questions used in a survey or poll, ask for them. They often will be provided.

“Good researchers will do this,” Harrison says. “All the reputable researchers do this.”

The American Association for Public Opinion Research launched its Transparency Initiative in 2014 to encourage researchers to disclose how they do their work. To become a member, research organizations commit to publishing key details about their methodology at the same time they release results.

One item members must share is their questionnaire, showing the questions asked in the order they were presented as well as the answer options participants were given for each question.

Keeter suggests news outlets avoid covering research from entities that do not support and follow open science practices. Almost 100 organizations have joined the Transparency Initiative, including Pew, ABC News, Cornell University’s Survey Research Institute, Google Surveys, The Huffington Post and The Washington Post.

"There are things good polls do,” Keeter adds. “And if you can’t find out whether a poll does those things or not, then maybe you should not write about it.”

3. Familiarize yourself with the characteristics of polls and surveys that are most likely to have question order bias.

Get in the habit of asking about question order bias when interviewing researchers about their work. If researchers assert there is no evidence of bias, ask them to explain how they know there is not.

Keep in mind that surveys and polls with certain characteristics often have question order issues. For example:

  • Surveys and polls that cover a single subject are more likely to have a bias than those that cover various subjects. “If the entire topic of the questionnaire is about a certain thing, you really want to pay attention to the earlier questions,” Harrison warns.
  • If the goal of a survey or poll is to gauge support for a particular political candidate, it should start with a question asking which candidate the participant favors. Otherwise, a person’s choice could be influenced by an earlier question. “It’s true most people know who they’re going to support, but not everyone does,” Keeter says. “Some people are tilting or undecided. Those individuals are more susceptible to the context in which you ask the question about candidate support.”
  • Surveys and polls conducted by groups with little experience doing this type of research probably are more likely to have question order issues than those conducted by experienced researchers at organizations that have joined the Transparency Initiative.

Keeter says journalists should be leery of survey results promoted by researchers from advocacy groups such as political action committees and entities that might misrepresent their findings to attract media attention.

“The barriers to entry into the profession have become lower and lower and lower over the past couple decades as online polling has become more accessible to everyone,” he notes. “Journalists need to be on guard not to allow themselves to be used by people fresh on the market who don’t have a track record and don’t have the same set of incentives in terms of professional credibility.”

4. Ask researchers what steps they took to avoid question order problems.

Because question order bias compromises data quality, researchers have developed a range of strategies to help them avoid it or limit its impact. Common strategies include:

  • Asking the most important question first. People answering this question first cannot be influenced by an earlier question.
  • Randomizing the order of questions. While the most important question will not always appear first, presenting survey questions in random order will dilute the effect of any question order bias that is present (a minimal sketch of this approach follows this list).
  • Doing a trial run of the survey with a small group of people. This way, researchers can see how people respond to questions and make changes, if needed, before presenting them to a larger group.
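
To illustrate the randomization strategy, here is a minimal sketch in Python. The question list and the per-respondent seeding scheme are assumptions made for the example, not part of any particular polling tool.

    import random

    # Placeholder questionnaire; a real one would come from the survey instrument.
    QUESTIONS = [
        "Is it sometimes acceptable for a parent to spank a child?",
        "Is it sometimes acceptable for a man to hit a woman?",
        "Do you support affirmative action programs for women?",
    ]

    def questions_for_respondent(respondent_id: int) -> list[str]:
        """Return the questionnaire in a per-respondent random order.

        Seeding with the respondent ID keeps each person's order reproducible
        while spreading any order effect evenly across the sample.
        """
        rng = random.Random(respondent_id)
        order = QUESTIONS[:]  # copy, so the master list stays untouched
        rng.shuffle(order)
        return order

    # Two respondents will very likely see the questions in different orders.
    print(questions_for_respondent(1))
    print(questions_for_respondent(2))

Note that randomization does not remove the order effect for any single respondent; it averages the effect out across the sample, which is why researchers often pair it with a trial run like the one described above.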

5. Don’t rely on press releases to describe survey and poll results.

Harrison urges journalists to review the details of a survey or poll themselves and not rely on descriptions provided in press releases. To accurately characterize people’s responses, journalists need to know how questions were worded and the context of questions asked earlier in the survey or poll.

“Press release writers and survey analysts sometimes get things wrong and misdescribe what the data actually says,” he explains. “That’s something a journalist can easily figure out for themselves by reading the questions and trying to write about people’s opinions in a way that matches the questions as closely as possible.”


This article first appeared on The Journalist’s Resource and is republished here under a Creative Commons license.
