News stories based on surveys, polls, studies and statistics are everywhere. Wouldn't it be good to have the mental agility to separate the wheat from the chaff? In the first of a weekly series, author Michael Blastland gives some hints at getting to grips with surveys.
It's August. A time not yet of mellow fruitfulness (Keats), but certainly of fruitcake journalism. Among the best (or should that be 'the worst'?) in the media is the steady stream of duff data.
So for some light holiday reading while school's out, here's a short selection of number nonsense, and how to see through it. Each week, I'll pick a theme and a recent example or two, and have a go at pulling them apart.
Lesson one: Surveys.
The story: "Motorists turn to public transport as fuel price bites" - Daily Record
"MORE than three in five drivers are turning to public transport due to high fuel prices, a survey has revealed. The survey by transport firm National Express found 61 per cent of car users are definitely or probably considering using public transport due to the rise in prices at the pumps."
The flaw: Suckered by a press release. This story gives the impression that motorists are leaving their cars at home, en masse, for - guess what? - services run by the company that did the survey. In fact, the surveyed motorists are not necessarily doing anything. They are "definitely or probably" thinking about doing something - which they might eventually do once, or often, or never.
The lesson: Two points. The survey has apparently bundled together the more and the less inclined, the definites and the probables, so that we have no idea if only 1% of the 61% are definites, or if most are. And it gives no indication of how big a change in behaviour they are considering - every trip… or just one.
The second point is to wonder what it means to say you are "probably" thinking about something. Are they thinking that they will probably do it, but haven't made up their minds? Maybe it means they are not sure what they are thinking, but if they had to guess what they are thinking then it would "probably" be something about petrol. Maybe it doesn't mean anything.
And are they actually doing it, as the headline suggests? No idea.
Press-ganging the middle, or blurring qualitative differences, is a common device, even in more serious surveys. Imagine a survey on the EU. Completely hypothetical, but let's say it asks if you are "pro" or "anti". There are three choices: 1) wholly for, 2) wholly against, 3) partly for and partly against. The result is that one third are wholly for, one third wholly against and one third in the middle.
If you want to make it look like a vote against, you report that two thirds ("well over half") were "wholly or partly" against the EU. If you want to make it look like a vote for, you report that 67 per cent were "wholly or partly" in favour.
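The trick is pure arithmetic, and a few lines make it plain (a sketch using the article's hypothetical one-third splits):

```python
# Hypothetical EU survey from the article: equal thirds in each camp.
wholly_for = 1 / 3
wholly_against = 1 / 3
partly_both = 1 / 3

# Same data, two opposite headlines, simply by bundling the middle
# third with whichever side you prefer:
pct_against = round((wholly_against + partly_both) * 100)  # "wholly or partly against"
pct_for = round((wholly_for + partly_both) * 100)          # "wholly or partly for"

print(pct_against)  # 67
print(pct_for)      # 67
```

Both headlines are "true", and both conceal that only a third of respondents actually hold the view being claimed.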
It's common to try to bundle up the undecided or half-hearted opinions with those who feel more strongly in order to make it look as if you have a clear and passionate majority on your side.
Another problem with surveys that often puts a spanner in the works is who they ask. Take a survey earlier this year that was much in the news and said that one third of girls self-harm - a shocking statistic.
Who was surveyed? A cross-sample of the population. Good. How did they do the survey? By internet and text message. Not so good. How many replied? About half. Which half? Maybe those most likely already to have a big interest in self-harm?
The news is chock full of surveys. Many are blatant self-promotion and/or statistical garbage. When a women's magazine reports a survey of the nation's sexual habits, you might wonder whether it's really what the people of Great Britain are up to, or just a good way of selling a magazine. Is it rather a survey of what that select group of readers-of-spicy-magazines-who-like-filling-in-sex-surveys gets up to? Interesting maybe, but - let's put it politely - of limited social relevance.
This survey on self-harm was more serious, but prone to the same problem. It came up with a number - one third of teenage girls - that was roughly 60% to 200% higher than any recognised survey previously done on this issue.
All surveys can suffer from bias - deliberately or not - by picking up or missing too many of a vital group and simply extrapolating - as if everyone in the country is the same as the people who completed the survey. Were they?
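The kind of non-response bias described above is easy to simulate (a sketch; the true rate and response rates here are invented purely for illustration):

```python
import random

random.seed(0)

# Invented population: suppose 10% of girls self-harm (a made-up rate).
population = [1] * 10_000 + [0] * 90_000

# Suppose those affected are far more likely to answer an
# internet/text survey than those who are not (invented rates):
def responds(affected):
    return random.random() < (0.9 if affected else 0.4)

responses = [x for x in population if responds(x)]
rate = sum(responses) / len(responses)
print(round(rate, 2))  # roughly 0.2 - double the true rate of 0.1
```

Nothing dishonest happened in the survey itself; the distortion comes entirely from who chose to reply.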
Next week's lesson: Counting.
Michael Blastland is the author, with Andrew Dilnot, of The Tiger That Isn't.
Thanks for your comments. A selection appears below.
Thank goodness someone is highlighting this serious issue. It amazes me sometimes how many people accept headlines as fact without taking any sort of critical view on mainstream news articles. It's irresponsible and unethical to distort facts in order to make news. That kind of journalism is dangerous as it sways public opinion and therefore affects decisions by politicians. Remember the MMR debacle. We need more ethical journalists and more sceptical readers, I'm afraid.
I always laugh at quoted percentage figure increases or decreases as they are meaningless without a base number to make sense of them. For example: "Company A suffered a 100% increase in bad debt customers this year." This sounds terrible, but what if the company has 100m customers, had one customer with bad debt last year, and only two this year? Still a 100% increase, but 2 in 100 million customers doesn't seem so bad after all!
John Smith, Nottingham
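The point above is worth making concrete: a percentage change says nothing without the base it is measured from (a sketch using hypothetical figures of one bad-debt customer rising to two, out of 100 million):

```python
def pct_increase(old, new):
    """Percentage increase from old to new; undefined when old is zero."""
    if old == 0:
        raise ValueError("percentage increase from a base of zero is undefined")
    return (new - old) / old * 100

# Hypothetical figures: bad-debt customers rise from 1 to 2,
# out of 100 million customers overall.
print(pct_increase(1, 2))         # 100.0 -- "a 100% increase!"
print(2 / 100_000_000 * 100)      # 2e-06 -- i.e. 0.000002% of all customers
```

The headline figure and the base rate describe the same facts; only one of them tells you whether to worry.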
Surveys that question around 1,000 people sound impressive, but just where are these people from? Is it a random selection or just those who got on the proverbial Clapham Omnibus today? My statistics lecturer pointed out that an opinion poll needs to question at least 3,000 people given our 60 million or so population to be anywhere near accurate. Even in that case the opinion poll at best is accurate to 1 or 2 percent error, with a 1 in 20 chance of being completely wrong!
John Airey, Peterborough
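The standard textbook formula behind the comment above is short enough to write out (a sketch; note that for a large population the margin of error depends almost entirely on the sample size, not on the population size, and the "1 in 20 chance" corresponds to the conventional 95% confidence level, z = 1.96):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Expressed in percentage points, for the sample sizes mentioned:
print(round(margin_of_error(1000) * 100, 1))  # 3.1
print(round(margin_of_error(3000) * 100, 1))  # 1.8
```

Either sample size can be perfectly respectable for an honest random sample; the trouble starts when the sample is not random at all.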
Excellent article - spot on. I hate these junk surveys. But the media seem happy to reproduce them without criticism or analysis - is this lack of understanding or the need to fill space? What about a "stupid survey" of the week on this website?
It's always fun to point out, when a news report mentions "a million to one chance", that there are actually 60m people in this country and you would therefore expect to find 60 examples of that particular happening.
Pat Silver, Bristol
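The "million to one" observation above is a simple expected-value calculation (a sketch using the comment's round numbers):

```python
population = 60_000_000  # rough UK population figure used in the comment
chance = 1 / 1_000_000   # "a million to one"

# Expected number of people to whom the event happens:
expected_cases = population * chance
print(expected_cases)  # 60.0
```

So a "one in a million" event is not a freak occurrence at national scale; it is something you should expect to see dozens of times.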
As someone who works with statistics on a daily basis, surveys reported in the media such as these always make me laugh. I can spot the dodgy ones a mile off. The funniest ones tend to be adverts for women's beauty products like shampoo or moisturiser which shout about how "80% of women noticed a difference". All well and good until you look out for the tiny writing at the bottom of the screen which tells you that only 116 women were questioned. Never a truer word was spoken than by Disraeli when he proclaimed "There are three kinds of lies: lies, damned lies and statistics". He was obviously well aware that 78% of statistics are made up on the spot.
One of the most interesting articles I've read.
Joe Sak, East Lansing, MI, USA