When teenage pregnancy rates in one community fell drastically, it looked like a policy of sex education had paid off. But, as Michael Blastland explains in his regular column, the bumps were still there... you just had to know where to look.
Can stories be bad for you? Let me tell you a true one.
Once upon a time, Orkney had one of the highest teenage pregnancy rates in Scotland. Scotland itself is high in the international league.
So health workers in Orkney tried something new. They began talking to young people about sex in terms of relationships, not only mechanics. They also made condoms easily available because in a small community the shopkeeper might just be your auntie.
Then came data showing that Orkney's teenage pregnancy rate had dramatically halved.
All this was widely reported last summer. Convinced of the happy ending?
Let me introduce you to a radical and highly complex, story-wrecking mathematical insight. Ready? Numbers go up and down.
All right, I lied about the complexity. I also lied about the maths. This is not really mathematics, it is everyday life. In life, things do not happen with perfect regularity. Some days, or years, there are more, sometimes fewer. And it's not radical either. Everyone knows it.
Until they tell themselves stories.
The truth in the Orkney case is that the number of teenage pregnancies goes up and down, and ups and downs may have nothing to do with the stories told to explain them.
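Small numbers bounce around far more than large ones, which is why a small community's rate looks so volatile next to Scotland's. A minimal simulation makes the point, using invented figures (a 4% underlying probability and population sizes chosen purely for illustration, not Orkney's real numbers):

```python
import random

random.seed(1)

def yearly_rate(n_teens, true_prob):
    """One simulated year: count 'pregnancies' among n_teens, each with
    the same fixed underlying probability, and return a rate per 1,000."""
    count = sum(1 for _ in range(n_teens) if random.random() < true_prob)
    return 1000 * count / n_teens

# A small community (400 teenagers) and a large one (100,000), both with
# the SAME unchanging 4% underlying probability, observed for ten years.
small = [yearly_rate(400, 0.04) for _ in range(10)]
large = [yearly_rate(100_000, 0.04) for _ in range(10)]

print("small community:", [round(r) for r in small])
print("large community:", [round(r) for r in large])
# The small community's annual rate swings widely even though nothing
# real has changed; the large community's barely moves.
```

Pick any two years from the small community's run and you can tell a story of success or of crisis; the underlying probability never moved.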
When new data came out after last summer, it went unreported. Or at least I can't find any reference to it. Perhaps it didn't fit the story.
Here is the data in a little more detail.
The figure reported last summer, the most recent then available, really was lower than in 1994. But it's clear that the numbers go sharply up and down, much more so in a small community than the larger one of Scotland as a whole.
What happened when more recent data came out? This.
The same as usual. What had gone down, briefly, went up, just as what sometimes jumps up often tends to come down.
You can talk about life, relationships, morality - and so we should. But there's another knowledge of life that statistical thinking sees and other mind-sets often miss. This second kind of thinking relies not particularly on maths, but imagination, imagination for what can go wrong with narratives that describe the way one thing leads to another. This is the best kind of story-telling; story-telling wise to the ways that stories might mislead.
I'm quite sure that no-one lied in telling the story in Orkney. I think they were just wrong. I suspect that all concerned, including journalists, found the story of relationship advice and condoms plausible enough to convince them that the numbers they looked at in those two salient years - a beginning and an end - told them something important. And the data does not prove that their new approach to sex education is wrong. It might have benefits that these statistics don't capture.
Nevertheless, a plausible story for why a change happened persuades us that the change really occurred. The explanation becomes the story.
But "it happened because..." can disguise the fact that it didn't really happen at all, or at least not the way we think it did. The truth is that we still don't really know if there is an underlying change in the pattern of teenage pregnancies in Orkney. There might be. But it's not evident yet.
Another recent example appeared in an editorial in a serious national newspaper - which had better remain nameless - about the revival of marriage, with a reflective account, deduced from a short run of recent data, of why this had happened. Five weeks later the same newspaper found itself reporting that, according to new data, marriage had in fact fallen to its lowest level in 111 years.
Another is the way that both sides in the climate change argument have seized on single-year fluctuations in the extent of Arctic sea ice as "proof" that trends are going further, faster, or in the other direction, or whatever. Anyone might guess that such numbers go up and down.
But even serious people ignore or forget this in the haste to tell a story. The real difficulty in almost all these cases is to work out how long you have to look at the data before being confident that change is sustained.
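One rough way to get at that question is to treat the annual count as a Poisson process and ask how often chance alone would produce the change you have seen. A sketch, with an invented long-run average of 16 pregnancies a year standing in for the real figure:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for a Poisson(lam) count, summed term by term."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

# If the long-run average is 16 a year, how often would pure chance
# produce a single year at half that level or below?
p_one_year = poisson_cdf(8, 16)
print(f"one year at <= 8 by chance: {p_one_year:.3f}")

# A run of such years is far less likely than one year in isolation,
# which is why sustained change persuades and a single point should not.
print(f"three independent such years in a row: {p_one_year**3:.6f}")
```

(Real years are not fully independent, and the Poisson assumption is itself a simplification; the point is only that one striking year proves much less than a sustained run.)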
Observing the excitability at life's natural yo-yo, you might wonder if parts of politics, journalism, even sometimes science, resemble nothing so much as an insane commentary on a game of (horizontal) tennis, in which it is assumed that whatever just happened tells us all we need to know, as if whoever just hit the ball must be winning.
The underlying trends and often slow nature of real change are lost in a frantic effort by all sides to grab at any short-term snippet of data and claim support for their beliefs or policies, to tell stories with an instant moral.
Here's a more general moral: things go up and down. Storytellers take note.
Below is a selection of your comments.
It's not just plain randomness, there could also be Regression to the Mean making it worse. A particularly bad year for teenage pregnancy in Orkney leads to a demand for "something" to be done. So the point where we start looking at the figures has been artificially selected as a historically high value. Not surprisingly, there is a very good chance that the figures return to something more like their long term average in the following years, and it then looks like the "something" has worked, even if it has had no effect. A similar effect happens when people resort to quackery (miracle diets, cold cures, homeopathy, prayer, etc) at the lowest point of an illness. Some people's condition randomly improves, and the witch-doctors point to it as proof that their method works.
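The regression-to-the-mean effect this commenter describes can be sketched in a few lines (all numbers invented): start watching a figure only because it just hit an alarming high, and the following year will usually look like an improvement even with no intervention at all.

```python
import random

random.seed(42)

def noisy_year(mean=40.0, spread=10.0):
    """One year's figure: a fixed underlying mean plus random noise."""
    return random.gauss(mean, spread)

trials = 10_000
improvements = 0
for _ in range(trials):
    worst = max(noisy_year() for _ in range(5))  # the alarming year that triggers action
    next_year = noisy_year()                     # the following year, with NO intervention
    if next_year < worst:
        improvements += 1

print(f"apparent 'improvement' after the worst of five years: "
      f"{100 * improvements / trials:.0f}% of trials")
# Roughly five times in six: a fresh draw falls below the maximum of
# five earlier independent draws with probability 5/6.
```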
Thank goodness - an article that shows, with a "simple" example, how interpretation of short term data and/or data smoothing can generate totally the wrong conclusion. I suspect we need much, much greater knowledge of the process of evidence-based science throughout the population coupled with a vast increase in the number of "scientific" politicians and journalists before we ever smooth out the yo-yo! (and pigs may fly!). Good article and great to see this awareness, at last, coming from the BBC. Hopefully, this will be applied to all aspects of Climate Change from now on!
James Tweedie, Dundee, Angus
What? You write an article on whether data is statistically significant and you don't mention the sample size? Not necessarily disagreeing with your conclusions, and certainly not contesting that data from small samples can fluctuate wildly without being indicative of any sort of trend, but this is a very, very basic omission. Poor science. For the record, 30 seconds' research suggests Orkney has a population of around 20,000 today (may have been more or less in the past, but let's go with that as a ballpark), of which we can probably assume around 10,000 are women. If your graph indeed shows pregnancies per 1000 women and not per 1000 *teenage* women, that means the raw numbers are fluctuating between around 600 and 220 pregnancies per year. I'm not a professional statistician, but my mathematically educated intuition tells me this is statistically very significant indeed, and highly unlikely to be caused by random fluctuations in data.
Statistics are by nature very variable; it all depends on how you collate the information and how long a time period you look at it over, as Michael Blastland's article quite clearly showed. So beware statistics, because you can manipulate any series of figures to prove your own point!
Keith B, Leicester
This reminds me of an experiment I saw on TV (probably horizon or something) where people were asked to look at a pair of pictures and decide which was most attractive. After they had done this with lots of pairs, they were shown the one they had chosen of each pair and asked to justify their decision, which they happily did. Except that the scientist had swapped the rejected photo for the chosen photo, so the subject was happily justifying a decision they had not actually made. In some cases the subject spotted this but most went along with it. It showed that we often make a decision and then seek to rationalise it by looking for evidence that fits that decision.
Nick M, Nottingham, UK
The British government abuse statistics in this way to impose and justify revenue-generating speed cameras. They take a blip in accidents at a given spot as excuse to install the camera, then when the accident rate 'drops' they claim that it is due to the presence of the camera while, in reality, it is just the accident rate returning to normal.
Steve, Bristol, UK