VIDEO: How Accurate Is Political Polling? (with Lesson Plan)


It's here. It's finally here.

When it comes down to it, Election Day is the ultimate political poll -- the definitive survey -- when millions of campaign-weary Americans finally get to cast their ballots and (hopefully) lay to rest this bitter, exhausting presidential election.


But as the results filter in, it's also a good moment to consider the accuracy and influence of the tsunami of polling data we've been inundated with for the last year.

Polls generally refer to surveys of public opinion and forecasts of election results. And since the 1990s, when major news organizations began conducting their own polls, the polling business has been booming. Today it's a billion-dollar industry with an army of polling firms cranking out thousands of surveys each year. Political candidates and elected officials also now typically commission their own polls to gauge approval ratings and messaging impact.

But it wasn't always like this.

Modern polling was pioneered in the 1930s by George Gallup, a statistician who began conducting surveys using a statistical model he called "quota sampling" to predict election outcomes and measure public opinion. As Harvard historian Jill Lepore explains in her New Yorker article, the relatively small group of respondents that Gallup selected for each poll reflected a mini-electorate, demographically proportionate and representative of the larger voting population (same percentages of men, women, black, white, young, old, conservative, liberal, etc.). Lepore says that Gallup believed polling was essential to democracy as a tool to gauge the will of the people. And for decades, Gallup's organization, and a small group of others, were among the only firms producing these kinds of polls.

Lepore notes that back then, the response rate among those surveyed was remarkably high, at roughly 90 percent. Today, however, the average poll response rate is in the single digits.

Among the biggest factors at play, she says, is the widespread adoption of mobile phones and the move away from landlines. The majority of polls are still conducted by phone. And because federal law prevents auto-dialing to cell phones, it's become significantly harder and costlier to reach the number of respondents necessary to generate a representative sample of the electorate.

As a result, a growing number of polls are conducted through websites and social media platforms. These are usually opt-in polls, in which site visitors actively choose to participate (as opposed to being randomly called on), and are generally considered less reliable. Those who choose to respond to online polls are rarely representative of the larger electorate, and so results can be biased and misleading.

You can think of a "sample" as a small model of the larger population. The goal in sampling is to use that smaller subset to represent the larger whole. Random sampling simply means that each member of the larger population has an equal chance of being included in the sample. Generally (although not always), the larger the sample size, the more accurate the poll. The average poll has a sample size of 1000 adults, according to the National Council on Public Polls.
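The idea can be demonstrated with a short simulation. This is a hypothetical sketch, not real polling data: we invent an electorate of one million voters in which 52 percent support a fictional "Candidate A," then draw random samples of different sizes and see how close the sample comes to the true figure.

```python
import random

# Illustrative numbers only: a made-up electorate where 52% back Candidate A.
random.seed(0)
true_support = 0.52

def sample_support(sample_size: int) -> float:
    """Draw a random sample and return the observed share backing Candidate A.

    Each simulated voter has an equal chance of being selected -- the
    defining property of random sampling.
    """
    supporters = sum(1 for _ in range(sample_size) if random.random() < true_support)
    return supporters / sample_size

# A typical poll-sized sample of 1,000 usually lands within a few points of
# the true 52%; a much larger sample tends to land closer still.
print(f"n = 1,000:   {sample_support(1_000):.1%}")
print(f"n = 100,000: {sample_support(100_000):.1%}")
```

Run it a few times with different seeds and you'll see the small-sample estimates bounce around more than the large-sample ones, which is exactly the sample-size effect described above.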

Most polls also include a margin of error, a +/- figure that's a measure of the pollster's confidence that the sample accurately represents the whole population. The larger that margin of error, the less precise the poll.

Today's electorate is also more diverse than ever before, and many polls don't reach the sample populations that reflect this diversity, especially if there are language barriers.

Additionally, it's important to remember that pollsters with specific agendas can easily manipulate how they conduct polls in order to produce outcomes favorable to their interests (a conservative polling firm typically produces results that skew conservative, and vice versa for a liberal polling firm). Doing so can make candidates or issues appear more popular than they actually are, and ultimately influence voter decisions. Which is why, as a consumer of polling data, it's so important to pay attention to who conducted the poll, why they conducted it, how they conducted it and what questions they asked.

This list of "20 questions a journalist should ask about poll results," published by the NCPP, is a good guide for deciding whether a poll is worth taking seriously.

The data news site FiveThirtyEight continuously updates its election predictions by aggregating hundreds of poll results. Each poll is weighted based on a rating system that considers the pollster's methodology and track record.  Check out its rankings of some of the major polling firms here.
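A weighted aggregation like FiveThirtyEight's can be sketched as a weighted average. The polls and weights below are invented for illustration only; real aggregators derive weights from much richer inputs, such as pollster ratings, sample size and recency.

```python
# Hypothetical polls for "Candidate A" with made-up quality weights:
# better-rated pollsters count for more in the average.
polls = [
    {"candidate_a": 0.49, "weight": 1.0},  # high-rated phone pollster
    {"candidate_a": 0.52, "weight": 0.6},  # middling track record
    {"candidate_a": 0.55, "weight": 0.2},  # opt-in online poll, weighted down
]

total_weight = sum(p["weight"] for p in polls)
weighted_avg = sum(p["candidate_a"] * p["weight"] for p in polls) / total_weight
print(f"Weighted average: {weighted_avg:.1%}")
```

Notice that the opt-in online poll, despite showing the highest support, barely moves the average: down-weighting less reliable methodologies is the whole point of the rating system.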