On April 17 the UK House of Lords Committee on Political Polling and Digital Media released a report on polling in the United Kingdom. The Committee took action in response to what it perceived as polling errors in two UK general elections and in the Brexit referendum.
While you can read the full report here, we asked Adam Phillips, who has chaired the ESOMAR Professional Standards and the Legal and Government Affairs Committees, to give us his assessment of what the report means.
The House of Lords decided to investigate the state of political polling in the UK after what was widely believed to have been a serious failure of the polls to predict the outcomes of the last two general elections and the referendum vote on whether to leave the European Union. Their conclusion was that:
“Analysis of political polls conducted since the 1940s does not show that polling has become more inaccurate over time. However, the three high-profile failures of polling in the UK in the last three years…raises the possibility that things might have taken a turn for the worse.”
The Committee came up with a number of recommendations for reducing the risk of this happening. However, they saw no good reason to restrict the publication of polls before election day because they decided that imposing such a restriction would increase the risks of misinformation and leaks of private or foreign polls. This could create fake news that might mislead the electorate.
As is usual with House of Lords committees, the investigation was more thoughtful and less focused on sound bites and political point scoring than committees of the House of Commons. The Committee of 12 Lords was chaired by Lord Lipsey, a former deputy editor of the Times newspaper; it held 23 evidence sessions, received 31 submissions of written evidence and heard from 40 witnesses.
Its terms of reference were to consider the following themes:
- Polling methods and accuracy
- Regulation of political opinion polling
- Social and digital media coverage of polling—quality and impact
- The influence of social and digital media on political debate
During the inquiry, which reported just as the Cambridge Analytica/Facebook scandal was breaking, the Committee wisely decided that the fourth theme was too large and complex a topic to cover as part of this investigation.
At the outset the Committee decided to define what it meant by a political poll:
“…ambiguity in terminology is a particular problem for members of the public and the media, who have no obvious way of checking the quality of polls or surveys, and this is an issue which we return to in Chapter 3. However, for the purposes of this report, we use the following terms:
- Voting intention poll: This refers to pre-election polls or surveys which aim to gauge how people intend to vote at any one time or in a particular election. For example, such polls might ask which party the respondent intends to vote for in a General Election, or which option they intend to choose in a referendum.
- Policy issues poll: This refers to polling or surveying undertaken to assess people’s views on issues that might relate to social policy or politics, such as views on the NHS, fox hunting or Brexit, but which do not involve estimating voting intention. This can include opinion polling and surveys conducted on behalf of advocacy groups and are often aimed at influencing public policy.
- Exit poll: This is a poll conducted of voters as they leave the polling station.
- Private poll: This refers to the polling or surveying undertaken by political parties, individuals, or private and public companies, where the results are only selectively released to the public.
- Informal poll (sometimes called a ‘snap poll’ or ‘straw poll’): This refers to a poll which has been conducted without using robust sampling techniques and where the representativeness of the sample is questionable. An example of this would be a newspaper running a limited poll of their own readers on an issue. There is nothing inherently wrong with this approach unless the poll findings are presented as being representative of the wider population.
- Social survey: This term refers to more comprehensive, longer-running exercises conducted by governments, independent research agencies, academics and think tanks to measure public attitudes on social and policy issues (for example, the Ipsos MORI/Economist’s Issues Index that monitors the public’s perception of the big issues facing the UK every month, or NatCen’s British Social Attitudes survey that asks the public what it is like to live in Britain). We did not examine this type of polling as part of our inquiry.
These definitions are not intended to be exhaustive, or to describe the quality of polls. Issues such as the representativeness of samples, and the presentation of poll findings, are explored in some detail later in this report.”
Inevitably, the Committee focused most of its attention on voting intention polls and the pressures that shape them: the desire for speed, low cost and an exciting horse race to report. It spent some time grappling with the issue of who finances polls and the motives they may have when designing the questions and reporting the results. The Committee wants to see a lot more transparency in this area.
The Lords were sympathetic to the problems researchers face today: falling response rates, and the changes in society making it difficult to stratify and weight samples. The report states:
“socio-economic class is no longer a good predictor of voting intention and voter volatility”.
“UK voters have become more willing to switch between parties”.
The trend of declining turnout in elections has been a growing problem, since this makes identifying the people who will actually vote a lot more difficult. The Committee did not come to a formal conclusion about the impact of the internet, other than to say that the reduction in cost per interview which the internet has introduced has made it possible for a lot more polling to take place.
The Lords were presented with a robust defence by the pollsters, by WAPOR (the World Association for Public Opinion Research) and by the Market Research Society who said (among other things) that all the evidence points to a very limited impact of polls on voting behaviour, an outcome the Lords worried about.
The Committee concluded that, although analysis of the accuracy of voting intention polls since the 1940s does not show that polling has become more inaccurate over time, the three recent failures in the last three years have dented public confidence. It noted that it was possible that this is the beginning of a trend for increasing failures in the future. The Lords expect polling organisations to continue to innovate and want the industry and others to continue to conduct critical enquiries so that the reasons for failure can be better understood.
The Committee was critical of media reporting of polls. It wants the polling industry and others to introduce training for journalists, better provision of technical information about polls in the media, and more guidance about the reliability of the results. It also wants the British Polling Council’s Journalist’s Guide to Opinion Polls, which is based on ESOMAR’s FAQs about polls, extended to include:
“…an authoritative definition of what constitutes a properly conducted poll (as opposed to a small unrepresentative survey), and a list of criteria which must be met for a survey to be recognised as a poll.”
The Committee considered and rejected the idea of statutory regulation of polls based on the model of the CNIL (Commission Nationale de l’Informatique et des Libertés) in France. However:
“if the media reporting of such polls continues to influence public and political discourse in a misleading way, then arguments by supporters of a ban would be strengthened.”
Moving beyond voting intention polls to consider the impact and regulation of other published polls the Committee concluded:
“There are other types of polls which affect political discourse in the UK, such as those that measure public opinion on political and social issues. We found that some of the key problems we identified for polling, particularly the use of leading questions and misleading presentation of results, were more pronounced for policy issues polls. We feel that there is a clear need for more oversight of the conduct and reporting of such policy issues polls.”
The Lords asked the British Polling Council to publish examples of particularly poor media reporting on polls and asked the polling companies to state publicly where they think their polls have been misused or misreported. This latter suggestion is likely to be ignored unless enforced by professional associations because of the commercial pressure that can be exerted on polling companies by the media. Publicly questioning the validity of poor polls is a vital role for research associations, like the MRS and the BPC, cooperating with existing independent bodies that regulate the media, like IPSO (Independent Press Standards Organisation) and Ofcom.
“Numerous polls are conducted every week which affect political discourse in the UK. In some cases, there is a failure by those who publicise such polls to communicate all of the relevant details about the selection and framing of questions to obtain a desired answer. We believe that most of these examples are deliberate attempts to manipulate polling findings, in order to distort evidence around public policy issues. We conclude that there is a case for the British Polling Council to play a greater role in proactively overseeing the conduct and reporting of polls.”
This House of Lords Committee review is the most authoritative review of the polling scene in the UK since the 1992 Election. That 1992 report resulted in a number of recommendations to improve the transparency of the reporting of polls, and those recommendations were followed for a number of years (the Sun once reported on a poll where the story was shorter than the paragraph describing the poll), but attention to self-regulation gradually lapsed as memories faded and industry bodies avoided public battles with powerful vested interests. This time, it is to be hoped that the media and polling industry will support the Committee’s recommendations for effective self-regulation and avoid the need for messy and probably ineffective legislation.
It has always been difficult for ESOMAR, as an international organisation, to comment directly on purely national surveys, unless requested to do so by the national association. ESOMAR has developed good guidance on polling, as well as Understanding and Interpreting Polls, an online training course for journalists jointly with WAPOR and AAPOR. However, I think it could do more to help the general public understand polls. An earlier and longer version of polling FAQs, cited by the Committee, is no longer on the ESOMAR website. I think ESOMAR could also work cooperatively with bodies like the British Polling Council and their equivalents in other countries, where they exist, to speak out publicly and improve enforcement against poor quality polls.
Adam Phillips, 21 May 2018