Harry McGee: Eight out of 10 politicians have mixed views on polls

Opinion polls may be unreliable but that does not stop us all from relying on them

Eight out of 10 cats prefer a certain brand of pet food, according to the famous advertising campaign. The survey was ludicrous, of course, but that hasn’t stopped the preposterous finding from becoming a stock phrase in the arsenal of popular culture.

Surely you could not compare the premise of that “survey” with a political opinion poll, now could you? We are talking white coats and laboratory conditions here, aren’t we? Polling companies interview a large sample of people (typically about 1,000), who are representative of the population in terms of age, sex, occupation, location, and social class. That allows them to calibrate public opinion to within plus or minus 3 per cent. Sounds impressive. And they could never produce findings that were ludicrous or rogue. Or could they?
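The arithmetic behind that plus-or-minus 3 per cent is simple enough. As a back-of-envelope sketch (assuming a simple random sample, which real quota-based polls are not), the 95 per cent margin of error for a sample of 1,000 works out at roughly three points:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95 per cent margin of error for a simple random sample of size n.

    p=0.5 is the worst case: it maximises the variance p*(1-p),
    so this is the widest the margin can be for a given n.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 people: margin of about 3.1 percentage points.
print(round(margin_of_error(1000) * 100, 1))
```

Note that this is the textbook formula; real polls use quota sampling and weighting, which complicate the picture considerably.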

Well, erm, they do from time to time, if truth be told, and much more frequently now than in the past.

We are now entering election season and there will be a glut of polls between now and the end of February. To the media they are catnip: eight out of 10 political reporters prefer opinion polls to any other form of political fodder. Politicians are also obsessed by them despite protests to the contrary.


The public, too, rely on them more and more in an increasingly distracted age. It is easier to digest politics as a sports league table or an X Factor judging decision than to wade through the swamp of boring – albeit important – policy differences.

Powerful tool

They have become so powerful a tool that they can determine, rather than merely reflect, public opinion. Back in 2001, a Fianna Fáil-led government tried to ban opinion polls in the last week of an election campaign after its candidate was defeated in the Tipperary South byelection. Fianna Fáil blamed the loss on an opinion poll published days before polling day.

As so many powerful players buy into them, there is an acceptance of accuracy that is not borne out by the evidence. The media and pollsters have not always been forthcoming in the way they have presented findings. The flaws of polls have been downplayed and the findings overplayed.

A headline will proclaim: "Fine Gael support increases in latest opinion poll." When you look to see by how much, you discover it is by 2 per cent. That is within the margin of error of 3 per cent. Statistically, the result has not varied from the previous one. "Fine Gael support is more or less static as is that of all other parties" might be a more accurate headline, but it is a lot less compelling.
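The statistics are, if anything, even less kind to that headline than the margin of error suggests. When comparing two independent polls, the sampling noise of both combines, so the margin of error on the *change* between them is wider than the headline margin by a factor of the square root of two. A sketch, again under the illustrative assumption of simple random samples of 1,000:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95 per cent margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Margin on a single poll of 1,000: about 3.1 points.
single = margin_of_error(1000)

# Margin on the CHANGE between two such polls: the variances add,
# so the margin grows by a factor of sqrt(2) - about 4.4 points.
change = math.sqrt(2) * single

print(round(single * 100, 1), round(change * 100, 1))
```

On these assumptions, a reported 2-point rise is comfortably inside the noise: the polls cannot distinguish it from no movement at all.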

In the 1930s, American journalist and academic George Gallup pioneered the notion of a scientific sample. His company interviewed a segment of the population that was a microcosm of society, reflecting the full demographic and geographic spread. That has been the staple method ever since.

Initially, the public were supportive. But over time, and increasingly in the past decade, pollsters have found it hard to get people to respond (that is, to agree to take the poll). Once the response rate was about 80 per cent; in the US it has now typically fallen to less than 10 per cent. That has increased the amount of leg-work, but it has also forced pollsters into weighting for non-responsive cohorts. That is an art, not a science, and an unreliable one at that.

There are two other problems that are insuperable for pollsters. People tell lies to researchers, be it face-to-face, on the phone or in internet polls. It is also impossible to differentiate with any accuracy those in the sample who will carry through on their opinion and those who will not.

It had been a habit of the Referendum Commission to carry out polling surveys after the final result of referendums to determine what influenced voters. This year, after the same-sex marriage referendum, it declined. "Past experience has shown that public responses after a referendum frequently result in those polled providing inaccurate data as to the level of claimed turnout and how the respondents claimed they voted. Accordingly, post-referendum polling may be misleading," said its chair, High Court judge Kevin Cross.

In other words, people who have not voted say they have voted. Typically, more than 90 per cent say they voted when the turnout is two-thirds of that, at most.

One should be especially wary of mid-term polls. Not only do many respondents not know the answer, they don’t even know the question. They have not thought about it, so the answer is meaningless.

The proof of the pudding is when the real poll takes place. There has been much soul-searching by British polling companies after they got the 2015 general election result so wrong. The finger has been pointed at “shy Tories”; at “herding” (where polling companies adjust their figures to align with others); at the late “plumpers” (who decide only when they go to the polling station); and at those who said they would vote but did not.

Crude instrument

In Ireland, some things should be borne in mind. For one, opinion polls are a crude instrument: it is like being asked to make out what's behind an opaque window that's also cracked. That said, the closer you get to polling day, the more accurate they become, as people are more engaged. The Ipsos MRBI poll in the final days before the 2011 election was right on the money. But scroll back to the previous September and Labour's support, according to the same pollster, stood at a dizzying 33 per cent, which was never a true reflection of its strength.

Distortions are caused by lower response rates; by flawed guesswork about which groups won’t vote; and by getting people to respond to a question on which they have no view. So why do we still rely on them so much? Seemingly, eight out of 10 cats still prefer them.

Harry McGee is Political Correspondent of The Irish Times. Stephen Collins is on leave