The Power of the Polls

by MEREDITH O'BRIEN
The Quill, 87 (November 1999): 22 
       

It all started with the Clinton-Lewinsky scandal in the fall of 1998. When opinion poll after opinion poll said Americans thought President Bill Clinton was doing a good job despite the fact that he lied under oath about his affair with White House intern Monica Lewinsky, there was a great hue and cry on the radio and television talk show circuit. "It couldn't possibly be true," incredulous callers would say. "Who are those people polling? Just Democrats? They can't be calling anyone I've talked to."

A number of political pundits and columnists, including syndicated columnist Arianna Huffington, began questioning whether these polls were accurate. Huffington - who has been conducting a "hang up on the pollsters" campaign in her columns and on her web site for months - criticized both the politicians' and the media's reliance on polls as a substitute for real leadership and real news.

In November 1998, Huffington complained that it is politically destructive for the media to gauge candidates' strength through public opinion polls. "Lengthy articles are written about such horse race polls, which are then circulated by handlers and fund raisers to convince donors and [Political Action Committees] that the other candidates are already out of [the race]," she wrote. "Snapshots harden into portraits; predictions become coronations."

Huffington used a lot of ink analyzing the polls about the Clinton-Lewinsky scandal and concluded: "When the history books are written, the Clinton crisis will be the first political crisis to be so entirely driven and shaped by polls. ... If the polls are going to be the instrument by which one will judge the fate of this president, it becomes all the more important to answer the key question: who is talking to the pollsters and who isn't?"

Former Vice President Dan Quayle said he didn't think pollsters were talking to the "right" people about the Clinton sex scandal. "The public is more upset with President Clinton's behavior than polls indicate," he told the Associated Press in the fall of 1998. "I don't buy these polls that somehow, people don't care. They really do care."

Questions about the sample size and who wasn't participating abounded. Given the rise in annoying telemarketing calls, prompting more people to screen their calls, weren't certain groups simply selecting themselves out of the process, people asked. Didn't that mean the polls weren't reflecting the full spectrum of opinions?

Syndicated columnist Daniel S. Greenberg jumped aboard the bandwagon, taking a shot at pollsters and the media's promotion of polls in a Washington Post piece, saying, "Political polling is a profit-seeking business that depends on a gullible public for the benefit of its paying clients of politicians and media - the former sniffing for votes and the latter peddling poll results in the same way that they do baseball scores."

On the cusp of the 2000 presidential election and the impending plethora of media polls gauging the horse race aspect of the contest, pollsters are saying that while an increasing number of people refuse to participate in polls, polling data are still solid and the media shouldn't worry about reporting them. But is there a problem? Considering that major political and public policy decisions hinge on polling results, should the media, which commission and report the data, be concerned?

The Elusive Democratic Pulse

Modern scientific polling - interviewing a relatively small number of randomly selected respondents to reflect the opinions of the general population - is thought to have begun in 1935 with George H. Gallup, the founder of one of the best known American polling firms. He issued a challenge to Literary Digest, which normally conducted a mail-in poll of its readers, that he could more accurately predict the winner of the 1936 presidential election with his "new" method. After talking to a randomly selected pool of Americans, Gallup made his prediction and it paid off big time. He picked Franklin Delano Roosevelt to win with 54 percent, while the vaunted Literary Digest, using its distinctly unscientific poll of two million readers, predicted Alf Landon would win with 57 percent to FDR's 42 percent. (Literary Digest was far off the mark. FDR garnered 61 percent.)

While pre-election polling has existed in one form or another for at least 100 years, "scientific" polling forever changed political campaigns, according to David W. Moore, author of "The Superpollsters: How They Measure and Manipulate Public Opinion in America."

"The 1970s witnessed the rise of the campaign pollsters, and nowhere has their influence been more pronounced than in presidential campaigns," Moore wrote. "They have helped to transform the electoral process, although not by most accounts for the better. They have enhanced the ability of candidates to manipulate public opinion and the press, and to shape messages for maximum effect."

In addition to the increase in the number of campaign-generated polls, there has been a similar spike in the media's use of its own polls - largely to break news and garner publicity, Moore says. Though media polls have yielded "a better understanding of the dynamics of public opinion," he wrote, "There is still a tendency for media polls to create the illusion of public opinion, by asking forced-choice questions on some topics that are unfamiliar to most people."

Despite the scientific nature of telephone surveys, any number of factors can taint a poll, guaranteeing that none will ever be completely accurate, Moore cautioned, which is why margins of sampling error are reported alongside results. "The views that people express in polls are very much influenced by the polling process itself, by the way questions are worded, their location in the interview, and the characteristics of interviewers," he wrote. "The pulse of democracy, it turns out, is much more elusive than Gallup had surmised."
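One technical point worth unpacking for readers: the sampling error Moore mentions depends only on the size of the sample, under the idealized assumption of a simple random sample - an assumption that real telephone polls, with their refusals and call screeners, only approximate. A minimal Python sketch (an illustration, not part of the original article) of the standard formula:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95 percent margin of error for a reported proportion p,
    assuming a simple random sample of n respondents (z = 1.96 for 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 1,000-person national poll reporting 50 percent approval:
moe = margin_of_error(0.5, 1000)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3 points
```

Note that this figure covers only random sampling error; the wording, question order and interviewer effects Moore describes - and systematic bias from who refuses to answer at all - are not captured by it.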

Refusal Rates on the Rise

And that pulse is even more difficult to take when record numbers of people refuse to participate. The "average" refusal rate for telephone polls depends upon whom you ask. Some pollsters say a refusal rate of 30 percent is not unusual, while others say that 40 percent is about right.

However, a four-year-old non-response study assessing the public's cooperation with pollsters found that about 60 percent of the 1,920 people contacted refused to participate. The Council for Marketing and Opinion Research (CMOR), a lobbying group that represents pollsters and surveyors, conducted its telephone poll in April 1995 and found that the refusal rate had risen 13 percent since 1990, in what it called "a slow but steady upward climb."

In 1997, CMOR also did a refusal rate audit of 385 national surveys and found that 46 percent of those contacted would not participate, according to Jane Sheppard, CMOR's director of respondent cooperation. Compared to a similar audit in 1988 that had a 40 percent rate, refusals had increased 15 percent. "If we decline one percent a year, there's concern about the validity of the research we're doing," Sheppard said.

The real culprit behind the rising refusal rate, pollsters say, is the telemarketing industry. Telemarketers hound people day and night, the argument goes, driving them to buy call-screening devices. It's the telemarketers people are trying to avoid, opinion surveyors say - but the screening makes it harder for pollsters to get through as well.

Diane Bowers, executive director of the Council of American Survey Research Organizations (CASRO), a trade association, said that while call screening has had a "negative impact" on the polling industry because people are no longer sure who is or isn't a salesperson, it is the up-tick in telemarketing calls that really concerns her. "It has undermined the legitimacy of research," Bowers said.

"Telemarketers are going to kill the whole polling industry," said Allan Baldinger, vice president of the NPD Group and one of the authors of CMOR's refusal rate study.

While pollsters complain about problematic telemarketers, they argue that their data are fine. The real cost of the upwardly spiraling refusal rates is monetary; pollsters have to call more people in order to get a well-rounded sample of public opinion. "I think there is a slow erosion in cooperation, but there's very little evidence that you can't do a good job of getting a representative sample," Baldinger said.

John Gorman, president of Opinion Dynamics, a polling firm that does polls for Fox News, echoed Baldinger's sentiments. "At least so far, and this is a constant concern, the non-response rate problem does not threaten the accuracy of the data," he said.

Frank Newport, the editor-in-chief of The Gallup Poll, which works with CNN, said though the decreasing participation rates are worrisome, once pollsters get through the call screeners, eliciting opinions is relatively easy. "Telemarketers have made it more difficult," Newport said. "But people love to talk about [politics], if you can get them for 15 seconds."

But Peter Tuckel, a sociology professor at Hunter College at the City University of New York, has done numerous studies on the impact of call screening mechanisms on polling and says telemarketers aren't the only reason refusal rates have been climbing. "There is a general decline of civic-mindedness," Tuckel said. "We're not as participatory as we used to be." As voter turnout continues to plummet, so does the public's willingness to proffer opinions on political issues, he said. "I'm always amazed ... why political polls are as accurate as they are," he said.

Harry O'Neill, vice chairman of Roper Starch Worldwide, a market and opinion research group, and former pollster for President Richard Nixon, attributed some of the refusal rate spike to overburdened, two-income families with less time. "When you ask people why they don't respond, the reason they come up with, usually, is that it's an inconvenient time," he said.

Who is hanging up?

But when Quill asked members of the polling industry to paint a portrait of those who refuse to participate, the picture differed strikingly depending on whom one asked. Though pollsters may agree on how to make sure everyone is represented in the polls - by making more phone calls - they don't agree on whose opinions are most often missed.

According to a spring 1995 study in Marketing Research coauthored by Tuckel and O'Neill, non-participants tended to be elderly, male and less educated, and to engage in fewer political and social activities. Their study indicated that people who screen their calls aren't necessarily less willing to participate in polls but that reaching them is becoming tougher.

But the 1995 CMOR non-response study found that those participants who reported refusing previous polls were largely young, well-educated, wealthier and white. Two-thirds of the participants said they owned answering machines - again mostly younger, well-educated, white and more well-off respondents - and at least half used them as call screeners. The study's authors did acknowledge that it's difficult to determine who isn't participating in polls and why with - of course - another poll.

Opinion Dynamics' Gorman describes the typical non-respondents to surveys differently. "A certain number of people are hiding behind caller ID and answering machines," he said, but the non-responders "are the same people who won't participate in anything," including elections. Younger males, the very wealthy and the very poor tend to be missed, he said. The young men are tough because they tend to have caller ID and answering machines, and because, "They're too busy drinking beer and playing softball," Gorman said. The very wealthy screen everything because they're targets for telemarketers. Meanwhile, the very poor tend to be "suspicious" of surveying calls.

Even though it's hard to reach these groups, Gorman said it's the job of a responsible pollster to tap them, even if it takes a few more phone calls to do it. If the polling firm makes enough calls and gets a representative sample, the data will be fine, he said. "So far, [increased refusal rates] do not appear to have an impact," Gorman added.

In yet another opinion, Baldinger said he has yet to find specific groups underrepresented in public opinion polls because they refuse to cooperate. "There may be demographic pockets that refuse to participate," he said, adding that he feels comfortable with the data because they so closely mirror election results. Pointing to polls before the 1998 congressional elections, he said, "The numbers were amazingly accurate."

Tuckel's theory on how the surveys remain reliable despite the fact that certain segments of society aren't necessarily being reached by pollsters? The people who refuse to partake in polls are the same ones who don't show up to vote. If one compares the 1998 pre-election polls to the election results, he said, the numbers are similar.

Newport agreed. "There's no demonstrable bias," the Gallup editor said. "... To date, we still have confidence in the telephone survey."

But critics don't buy it.

"It doesn't matter how much you increase your sample by" Huffington said, adding that certain people are simply selecting themselves out of the process. Only "lonely Americans" with nothing better to do talk to pollsters, she said.

She pointed to Jesse Ventura's gubernatorial victory in Minnesota last year as a case in point of how the polls can miss large segments of the population: "Every poll had him losing. What does that say about scientific polling?" Some candidates, like Ventura, can galvanize groups who typically don't participate in the political process, but their viewpoints aren't reflected in the polls. "How is their opinion going to be captured?" she asked.

More media analysis required

Humphrey Taylor, chairman of the Louis Harris & Associates polling group, said because the media do not analyze polls well - by failing to explain things like the refusal rate, what attempts were made to pursue non-responders and other factors that might have skewed the results - the public can be ill-informed.

"For most of the media, a poll is a poll," Taylor wrote in the March 29 issue of The Polling Report. "All that matters is that it be 'newsworthy'- which tends to mean sensational. Quality and reliability are non-issues. Badly conducted polls (with 'wrong' results) can often be more surprising and therefore more newsworthy than a well-conducted survey."

If the media are not careful in vetting the polls they commission and report on, the public will be getting a warped view of what it supposedly thinks. "If we wanted to commit polling mayhem, nobody would notice," Taylor said.

The Wall Street Journal's editorial page also ominously warned in a March 3 piece that the media need to be wary of presenting the public misleading polling data without explaining the complicating factors. "Polls have their informational uses, to be sure," the editors wrote. "But it's remarkable that they're so little analyzed for the public. ... The time has arrived to assign a knowledgeable critic to make more sense than we've got now of the 800-pound gorilla in the room - the opinion polls."

When confronted with accusations that the polls don't really provide a true glimpse into American public opinion, pollsters scoff that the ones leveling those attacks are simply ideologues who didn't like the polling results.

"Polling per se, [has come] to be somewhat of an emotional thing" Newport said, attributing much of the criticism to anger at the data.

O'Neill agreed, saying that pollsters can't argue with critics like Huffington - whom he called "an idiot" - because their commentary is based on promoting a particular political viewpoint. "If you present a logical argument to them, they say, 'This is just an excuse you are making,'" he said. "Their criticisms are ideological. [They say,] 'These polls don't represent my point of view, so the polls are wrong.'"

The national polls conducted during the fall of 1998 showed "virtually the same results" regarding President Clinton's high approval ratings despite the impending impeachment vote, O'Neill said. If there was a consistent group of people whose opinions weren't registered, that exact same group would have to be missed in every other telephone poll taken during that time. "And if you believe that, I've got a bridge I can sell you," O'Neill said.

But Huffington said her critique of polls, particularly regarding the voices that are left out, has nothing to do with ideology. "I think it's absurd to say it is ideological," she said, noting that she lambastes both Democrats and Republicans.

A major concern expressed about the accuracy question stems from how much rides on polling data. Politicians' fates rest on the media's reporting of polling results, as do many public policy decisions and the credibility of the media outlets themselves. Former Clinton aide George Stephanopoulos' recent memoir, "All Too Human," drives the point home. "Taking a poll, to me, was like taking our temperature, and I had advocated the relatively inconsequential middle-class tax cut in 1992 not only because it symbolized whose side we were on, but also because it scored well," he wrote. "But during the [Political Consultant Dick] Morris era, it seemed more and more as if we were polling first, proposing later."

When it came time for the president to decide how he should respond to the Lewinsky allegations, Stephanopoulos said: "At his moment of maximum peril, the president chose to follow the pattern of his past. He called Dick Morris. Dick took a poll. The poll said lie. It was out of Clinton's hands."

Huffington said she has heard people muse both publicly and privately about the impact of what she sees as unrepresentative polls on real political decisions. "We should not allow our politics to be taken over by pollsters," she said. "Leadership is being replaced by polling."

And with the media increasing the amount of money and prominence given to polls, Huffington said editors and producers should be wary.

Patti Boesch, survey coordinator for The Dallas Morning News, which commissions four to six polls annually, said her newspaper is aware of the refusal rate escalation nationwide but doesn't know "specifically if it has affected our research."

Carol Young, the deputy executive editor of The Providence Journal, which typically publishes two of its own polls before every general election, was surprised when she learned about the average refusal rates. "We're very sensitive to the power and impact of polling," Young said. "You need to pick your pollsters carefully."

Other news organizations - The New York Times, The Washington Post, CBS News and ABC News - didn't return phone calls about their polling procedures. CNN Spokeswoman Kelly Keane referred comments to the Gallup Organization.

To do a better job of analyzing polls, pollster Taylor offers several suggestions: Ask over how many days the poll was conducted (the longer the better), how many attempts were made to reach randomly selected participants and what steps were taken to minimize refusals. With the right questions and responsible pollsters, Taylor said, the media can do a good service. "... The better polls have achieved remarkably accurate results in presidential and other elections with real response rates of less than 40 percent," he wrote.

Gallup's Newport, who said he has complete confidence in the accuracy of polls, said he's pleased with the increase in media polling. "I think that's great," he said. "... In a large democracy, 200 million Americans, the objective is well-done polls that are fairly measuring [opinions]. It affects the course of democracy."

Even polling critic Moore tends to agree that, by and large, polls do reflect the sentiments of the public. "... Gallup's vision of polling as an instrument for improving the practice of democracy seems largely vindicated," he wrote. Despite its faults, Moore said, "polling can, indeed, provide a continuous monitoring of the elusive pulse of democracy. More or less."