Two Polls on Jeff Sessions and Russia: Misreading the Public

Two new polls about Attorney General Jeff Sessions provide a prime illustration of how polls can create the illusion of public opinion.

According to a new Quinnipiac poll, a majority of American voters believe that Attorney General Jeff Sessions lied under oath during his confirmation hearings, and by a 51% to 42% margin believe he should resign.

According to a new HuffPollster/YouGov poll, a plurality of American adults, 39% to 30%, believe that Sessions should resign.

Both polls find a 9-point margin in favor of a Sessions resignation, but there is a significant difference in the percentage of respondents who express no opinion – 31% in the HuffPollster/YouGov poll, just 7% in the Quinnipiac poll.
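The arithmetic behind these figures can be checked directly from the reported percentages. As a minimal sketch (the poll names and numbers are taken from the article; the code itself is illustrative, not part of either pollster's methodology):

```python
# Reported shares favoring resignation vs. opposing it, per poll.
polls = {
    "Quinnipiac (phone, no 'no opinion' option)": {"resign": 51, "stay": 42},
    "HuffPollster/YouGov (online, 'no opinion' offered)": {"resign": 39, "stay": 30},
}

# Pro-resignation margin: resign share minus stay share.
margins = {name: p["resign"] - p["stay"] for name, p in polls.items()}

# "No opinion" share: whatever percentage is left over.
no_opinion = {name: 100 - p["resign"] - p["stay"] for name, p in polls.items()}

for name in polls:
    print(f"{name}: margin = {margins[name]} pts, no opinion = {no_opinion[name]}%")
```

Both polls yield the same 9-point margin, but the residual no-opinion share is 7% for Quinnipiac and 31% for HuffPollster/YouGov, which is the difference the next paragraph explains.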

Why such a large difference? The Quinnipiac poll was conducted by phone and did not offer the respondents a “no opinion” option, thus pressuring them to come up with an opinion. In contrast, the HuffPollster/YouGov poll, conducted online, did offer a “no opinion” option. Many people will admit ignorance if given the chance.

Neither poll, however, measures how strongly people feel about the issue. Typically, many respondents in a poll will express a top-of-mind opinion to the interviewer, but if asked a follow-up question, will admit they really don’t care one way or another. Thus, the percentage of Americans actually clamoring for Sessions to resign is likely to be significantly below even the 39% reported by HuffPollster.


Feeding Respondents Information

The problem of measuring a realistic public opinion is complicated further by the fact that both polls fed respondents information and then asked them to offer an immediate opinion on the issue. Quinnipiac’s approach was the more imprudent.

First, it told respondents that Sessions had stated in his confirmation hearings that he did not have any communication with Russian officials while working with the Trump campaign, and that later it was discovered that he had met with the Russian ambassador during that period. Then the poll asked, “Do you think he lied under oath about this issue, or do you think he made an unclear statement without lying?”

For most respondents who had not been paying attention to the news, this formulation of the issue would no doubt have led them to the conclusion that Sessions was lying. They would not have heard Sessions’ defense – that he had been confused by the question asked him by Senator Al Franken during the hearing.

By providing limited and negative information about Sessions, Quinnipiac tainted its sample, so that it no longer represented the larger population of voters. Then, having just implied that Sessions had lied, the poll asked whether respondents felt Sessions should resign. It was not surprising that a majority said yes.

The HuffPollster/YouGov poll was not as explicit in its bias against Sessions, though it too provided limited, negative information. To its credit, the poll did use the information to ask how closely people had been following the issue, reporting that about four in ten were paying little to no attention, and fewer than three in ten were following the news about that story “very closely.”

The problem with the question wording is that it gives information to the respondents, who are supposed to be a representative sample of Americans across the country. But not all Americans across the country have heard of the controversy over Sessions. As this poll question reveals, at least four in ten respondents had not heard about Sessions prior to the poll, and probably even among those who said “somewhat closely,” there were many who hadn’t known the details about the controversy.

Thus, like the Quinnipiac poll, by giving information to respondents, the sample in this poll was tainted – it no longer represented the larger population.

Is there a way to ask how much people have heard of the Sessions controversy without tainting the sample? The fact is that any poll, especially a long one with many questions, has the potential to taint the sample. Nevertheless, there are better and worse ways to minimize the problem.

In this case, one could have asked respondents how closely they had been following the controversy over Jeff Sessions’ testimony during his hearing, without mentioning what the controversy was about.

The next HuffPollster/YouGov question asked how serious Jeff Sessions’ conduct was. Everyone was asked that question, including the 41% who said they had heard little to nothing about the issue. This tactic means the pollster was interested in eliciting an opinion even from people who didn’t know about the issue and whose only information would be what HuffPollster/YouGov provided.

Again, this means the sample no longer represented the larger population of American adults. The sample would base its response on what the poll interviewer just told them, while the American people at large – if theoretically they could be asked that question – would have to base their responses on what they personally knew. Some would know about the Sessions controversy, and would have heard both positive and negative interpretations, while many would not have heard anything. In any case, the pattern of what people at large would have heard would be different from what the sample of respondents was told.

The third question once again informed the sample that Jeff Sessions had recused himself from participating in any investigations related to the 2016 campaign – and immediately asked for respondents’ reactions.

Finally, the poll asked if Sessions should or should not resign. For many respondents, the only information they would have had about Sessions is what HuffPollster/YouGov just fed them. Yes, even then, 31% opted for the “unsure” answer, but among the 69% who responded, many would have been influenced in their decision solely or principally by the limited and biased information provided in the poll itself.

This is not “public opinion” as it actually exists among the American public.

The Quinnipiac results are clearly more tainted than the HuffPollster/YouGov results, but both present “manufactured” public opinion – based on giving respondents limited information so they can come up with an immediate “opinion” the pollster can report.

The truth is that on many issues of the day, most Americans simply don’t have the time to get engaged and formulate meaningful opinions. Pollsters and the media are generally unwilling to acknowledge this reality. Instead, even the best of them continually bombard us with poll results that have little relevance to what the public is – and is not – thinking about.

Such polls may provide fodder for the daily “news,” but they mislead us as to what public opinion really is.