Tuesday 27 April 2010

Measuring public opinion

I think this article is pretty much on the button in raising some concerns about the dominance of opinion polls in the current election and the potential pitfalls involved in trying to use them to second-guess the outcome. This is priceless:

Page, however, chose to inform the Radio 4 listeners, without any sense of irony, that six out of 10 voters tell pollsters that they pay no attention to the polls. But here's the problem: 10 out of 10 journalists do.

Of course there are very good reasons why apparently similar (even more or less identical) survey questions can, as Hasan points out, generate really quite different response sets. Was the survey conducted by telephone, face to face, or by some other method? Were people contacted at home, via mobile phone, or over the internet? At what time of day were the interviews carried out? Even more important are the technical issues: what was the sampling frame? How were the raw data weighted? What was the response rate like?
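Weighting alone can move an estimate appreciably. Here is a minimal sketch of post-stratification weighting, one common adjustment; the age cells, population shares, and counts are made up purely for illustration, not drawn from any actual poll:

```python
# Illustrative post-stratification: reweight an unrepresentative sample
# so each cell matches its known population share.

# Population shares (e.g. from census data) and achieved sample counts;
# here younger respondents are under-represented in the raw sample.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_counts = {"18-34": 150, "35-54": 350, "55+": 500}

n = sum(sample_counts.values())

# Weight for each respondent in a cell: population share / sample share
weights = {
    cell: population_share[cell] / (sample_counts[cell] / n)
    for cell in sample_counts
}

# Raw vs weighted estimate of a hypothetical yes/no question
yes_counts = {"18-34": 60, "35-54": 175, "55+": 300}

raw_estimate = sum(yes_counts.values()) / n
weighted_estimate = sum(
    yes_counts[cell] * weights[cell] for cell in yes_counts
) / sum(sample_counts[cell] * weights[cell] for cell in sample_counts)

print(f"raw: {raw_estimate:.3f}, weighted: {weighted_estimate:.3f}")
# raw: 0.535, weighted: 0.505 - a three-point gap from weighting alone
```

Two polls with identical questions but different weighting schemes can, in other words, legitimately report different headline figures.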

Now, setting aside the issues raised by Hasan - and they probably are quite important, for example if people start to base tactical voting decisions on erroneous polling data - as far as the election is concerned we'll ultimately get the real answer from the vote itself.

But my interest is in the use of opinion survey data, in the form of the British Crime Survey, to judge the performance of the police. Obviously the BCS is not some two-bit survey conducted on a wet Thursday afternoon by a small research company. All the issues raised above, and many more, have been taken into account when estimates such as those linked above are produced. But when Lincolnshire (for example) is given a little down arrow - and an implied black mark - on the basis of a 4.9 per cent fall in positive views year on year (see Table 1), what does this really mean? It surely means something - but is it fair to judge the performance of a police force on the back of such a relatively small shift in public opinion, bearing in mind some of the provisos raised above?
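To put a shift of that size in context, here is a rough sketch of the sampling error on a year-on-year change in a proportion. The sample size and percentages below are assumptions for illustration, not the actual BCS figures for Lincolnshire; real BCS estimates also carry design effects from clustering and weighting that would widen the interval further:

```python
# Rough check, assuming simple random sampling, of whether a year-on-year
# fall in a survey proportion could plausibly be sampling noise.
from math import sqrt

n1, p1 = 800, 0.550  # last year: assumed force-level subsample, % positive
n2, p2 = 800, 0.501  # this year: a 4.9 percentage point fall

diff = p1 - p2
# Standard error of the difference between two independent proportions
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"fall: {diff:.3f}, 95% CI: ({ci_low:.3f}, {ci_high:.3f})")
# fall: 0.049, 95% CI: (0.000, 0.098) - with samples of 800 a fall of
# this size only just clears conventional significance, before design
# effects are taken into account.
```

Under these (hypothetical) conditions, the down arrow rests on a change that is barely distinguishable from noise.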

Don't get me wrong here. Investigating public assessments of police performance is a useful and worthwhile endeavour, both in and of itself and because, over relatively long periods, change in public opinion is likely to provide an extremely useful barometer of how well the police are doing, both in general terms and in relation to specific elements of performance (dealing with issues of racism within the force, for example). What people think about the police is important, and we should be measuring public opinion on this topic. But I'm much less sure that year-on-year changes in point estimates of opinion, as measured by single questions, can provide much useful information.

Even worse, such an emphasis may generate perverse incentives by shifting managers' attention away from what really matters (which I would argue is, above all, the relationship between police and community and how fairly people feel they are treated by the police) toward trying to address apparent local problems in 'dealing with crime and disorder'. 'Problems' thrown up, remember, by perhaps 5 per cent more people holding negative views this year compared with last.
