NZ public poll methodologies
Andrew at Grumpollie has put together this very useful table showing the different methodologies of the five public pollsters in NZ (I don’t count Horizon).
Russell Brown at Public Address noted:
One big advantage for the political Left of John Banks’ sorry experience with the courts last week is that it meant people weren’t talking about the Left’s really awful result in the latest Roy Morgan poll.
Morgan has National up seven points to 52.5% support, and Labour and the Greens both down to a combined 38%. The Greens shed 4.5 points to slump to 9% support, their lowest level since 2011. This will hurt at The Standard and the Daily Blog, where Roy Morgan polls and their inclusion of mobile phones are something of an article of faith.
It’s possible that this is an outlier poll — it does, after all, show Act doubling its support — but while Gary Morgan’s commentary on the results is typically bonkers, there’s nothing in particular wrong with the company’s methodology. And, significantly, the swing is reflected in the regular Government Confidence Rating (whether New Zealand is “heading in the right direction” or not.) It simply looks like a very healthy post-Budget poll for National.
But a friend put another interpretation to me on Friday: that the public has had a look at Internet-Mana and decided a potential centre-left coalition is really not to its taste. Perhaps Labour has internal polling to similar effect, explaining the spluttering reaction of a number of Labour MPs to the prospect of cooperating with the party of Kim Dotcom and Laila Harre.
The commentary on the Roy Morgan polls is generally hilarious, and somewhat removed from reality. This doesn’t mean their polls are inaccurate.
However, what Andrew’s table shows is that we know very little about how Roy Morgan conducts its polls – information that would help people make a judgement on reliability.
The other four pollsters have signed up to the NZ Political Polling Code, which requires signatories to publicly release significant aspects of their methodologies. This is an important step for transparency. Roy Morgan has not signed up to the code, and we don’t even know whether they weight their samples to the NZ adult population.
This doesn’t mean their polls are wrong, just as it doesn’t mean pollsters who have signed up will always get it right. For example, a poll I did on attitudes to smoking and lung cancer found a lower smoking prevalence rate than the census. Now the census figure is almost certainly the more accurate, so the difference may be down to how people respond to a phone poll versus a census, or it may be that even with weighting we under-surveyed current smokers. Good pollsters will always be critiquing their own methodology and considering how to enhance or revise it.
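For readers unfamiliar with what “weighting to the adult population” actually involves, here is a minimal sketch of simple post-stratification weighting in Python. The age groups and figures are entirely hypothetical and are not drawn from any pollster’s actual methodology – the point is only to show how under- and over-represented groups get adjusted.

```python
# Minimal sketch of post-stratification weighting (hypothetical figures only).
# Each respondent is assigned a weight so that the weighted sample matches
# known population shares for a demographic variable (here, age group).

# Hypothetical population shares for NZ adults by age group
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical unweighted counts from a poll of 1,000 respondents
sample_counts = {"18-34": 220, "35-54": 380, "55+": 400}
sample_size = sum(sample_counts.values())

# Weight for each group = population share / sample share
weights = {
    group: population_share[group] / (count / sample_size)
    for group, count in sample_counts.items()
}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")

# Under-represented groups (18-34 in this made-up sample) get weights above 1,
# over-represented groups get weights below 1. Real pollsters typically weight
# on several variables at once (age, gender, region, etc.), often via raking.
```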
It would be a very good thing if Roy Morgan released more information on their methodology, so people can interpret their results in the right context.