Wednesday, May 3, 2023

How polls can mislead about polarization
We probably don’t really know how divided we are, thanks to the way many surveys structure their questions.

By Natalie Jackson
By definition, a political pollster is not a “normal” or “average” person. Political pollsters, like most political creatures, are usually deeply embedded in the divisive partisan atmosphere. We take in all the information, and then we try to design questions to probe the “normal” people about this atmosphere, of which they may only be tangentially aware.

Public polls have been asking more questions about so-called culture-war issues—particularly as red-state and local governments pass legislation on abortion, transgender medical care, and what children learn about race and gender. I’m not sure we’re learning much from some of these questions, though, because they are most often framed in the language of the debate among political creatures rather than in terms of what “normal” people actually experience.

An example popped up in a recent NBC News poll: “Which should be a more important goal for our society these days: promoting greater respect for traditional social and moral values or encouraging greater tolerance of people with different lifestyles and backgrounds?” Half of the respondents chose traditional values, and 42 percent of the sample chose tolerance. Five percent of respondents went off-script and said that both matter equally, and 3 percent couldn’t pick.

The juxtaposition of “greater tolerance” and “traditional values” breaks key rules of survey question writing: response options should cover the full range of plausible opinions (“exhaustive”), and a respondent should not logically be able to choose more than one option to match their opinion (“mutually exclusive”). That’s because when we conduct a survey using “closed-ended” questions—meaning that respondents are supposed to pick from pre-selected options—we are inherently restricting the opinions people can express, but we still want to capture those opinions accurately.

In the case of the “traditional values” vs. “tolerance” question, the answer could easily be both—and 5 percent of respondents did say so. Importantly, though, “both” was not offered as an option; the survey was conducted by telephone, so interviewers were able to note when a respondent volunteered it. Such off-script answers are rare when the option is not explicitly given. Decades of survey responses show that people seldom go off-script, and if there is no interviewer conducting the survey—as in all web surveys—there is no way to register an off-script response at all.

“Both” and “don’t know” options are often not offered because the conventional wisdom is that if you give respondents those options, they will take them as a cop-out instead of giving their real opinion. Of course, that assumes respondents (1) have an opinion and (2) can accurately express it in one of the answer options. We’re not representing their opinions very well if they hold views the answer options don’t cover, if more than one answer option could describe their opinion, or if they don’t have an opinion at all and we force them to choose.
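
To illustrate the point, here is a minimal, purely hypothetical simulation (the population shares below are invented for the sketch, not taken from any poll) showing how a forced binary choice can make a group whose most common view is “both matter” look sharply divided:

import random

random.seed(0)

# Hypothetical "true" views; the shares are illustrative assumptions, not poll data.
# 40% genuinely think both goals matter, 30% lean toward "traditional values,"
# and 30% lean toward "tolerance."
TRUE_VIEWS = ["both"] * 40 + ["traditional"] * 30 + ["tolerance"] * 30

def forced_choice(view):
    # A closed-ended question with no "both" option: respondents whose real
    # view is "both" rarely go off-script, so here they split between the
    # two sides at random instead.
    if view == "both":
        return random.choice(["traditional", "tolerance"])
    return view

responses = [forced_choice(random.choice(TRUE_VIEWS)) for _ in range(1000)]

for option in ("traditional", "tolerance"):
    share = 100 * responses.count(option) / len(responses)
    print(f"{option}: {share:.0f}%")

# Prints something close to a 50/50 split, even though the single most common
# view in this toy population is "both matter equally."

The reported numbers look like a deeply divided public, even though the survey never let the largest group say what it actually thinks.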

By not providing a “both” or “don’t know” option, the question was explicitly designed to force respondents to pick a side in culture-war messaging. The two options are drawn from partisan talking points, and not surprisingly, the question got exactly the divisive results it was designed to elicit: Most Democrats chose “tolerance” and most Republicans chose “traditional values.” But “traditional values” and “tolerance” are not mutually exclusive choices, and they lack specificity: What is a “traditional value”? What are “different lifestyles”?

A lot of political polling follows this pattern of forcing choices between extremes, and it means that what we’re doing is less about finding out what people think and more about finding out which extreme they find more palatable.

That is perfectly fine—there is value in knowing what is more palatable, and respondents will typically fall in line and pick one side or the other. The problem is that their answers don’t necessarily mean the same thing to respondents as they do to the twitterati and political junkies. The results are reported and interpreted as “Look! This is what the American public wants!” or “Look how divided we are!” In reality, we didn’t ask what people actually wanted, and we didn’t give them a chance not to appear divided.

There is a way to ask these questions and still learn what people think. A recent USA Today/Ipsos survey asked about the term “woke” by first asking whether people had heard much about the word, then posing an open-ended question about what the word means to them, and finally asking whether they would consider it an insult or a compliment. Notably, the latter question included a “don’t know” option, which a full quarter of the sample selected. Only then came the binary, forced-choice question with partisan valence. That question still forces an opinion where there might not be one—but at least we have first uncovered what respondents think the word means.

At some point pollsters have to grapple with the fact that if we write divisive questions, we will get divisive answers, and we feed more divisiveness into the cycle. That’s useful for generating headlines and attention. It’s not necessarily very useful for finding out what people really think.
