Last Updated: Saturday, 10 September 2005, 00:03 GMT 01:03 UK
What education statistics reveal
By Gary Eason
Education editor, BBC News website

[Image: a school classroom in Finland. Caption: Another batch of global education figures is due from the OECD]
How much do we understand the statistics sloshing around in education?

I was musing on this after a reader accused us of increasingly using "woolly statistics" that were, he said, "meaningless".

We had another crop of statistics this week with the publication of the Key Stage 3 national curriculum test results for England.

And recently of course there was the big annual round of high-stakes public exam results from across the UK.

We go to some lengths to try to report these accurately and in useful detail.

But I can't help wondering if people take from them little more than a headline impression - and, even then, I wonder just how vague the impression is.

Confusion

The handling of even simple statistics can reveal worrying evidence of the nation's fragile adult numeracy levels.

For instance, I was sent a press release this week on behalf of a new city academy, trumpeting the improvement in its test results.

In maths, it said, "45% of students gained Level 5 or above, as against 36% in 2004 - an increase of 9%".

I thought they were rather hiding their light given that the increase, on those figures, was actually 25%.

This inability to grasp the difference between percentages and percentage points seems to be rife.

If you know what I mean, skip the next few sentences.

Example

Let's say I run a widgets factory. Daily production goes up from 30 widgets to 40.

I hope you can see that is an increase of a third. One third is, in percentage terms, 33.3%.

Now let's say I'm running a school and the percentage of students passing an exam rises from 30% to 40%.

I hope it is clear that is also an increase of a third, or 33.3%.

It is clearly not an increase of 10%, but that is precisely what people tend to say when discussing such things as exam pass rates or the proportions getting certain grades. They mean a rise of 10 percentage points.
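
If it helps, the arithmetic can be spelled out in a few lines of code - here in Python, purely by way of illustration, using the figures from the examples above:

    # Relative change (per cent) versus absolute change (percentage points).
    def percent_change(old, new):
        """Relative change, as a percentage of the old value."""
        return (new - old) / old * 100

    def point_change(old, new):
        """Difference between two percentages, in percentage points."""
        return new - old

    # The widget factory: 30 a day rising to 40 a day.
    print(percent_change(30, 40))   # 33.33... - an increase of a third

    # The academy's maths results: 36% rising to 45%.
    print(point_change(36, 45))     # 9 percentage points
    print(percent_change(36, 45))   # 25.0 per cent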

Even statements from government agencies - and, I might add, BBC programmes - sometimes fall into this trap.

Oh well, yes, but it's obvious what we mean, isn't it, no need to be such an anorak... But is it quite so clear?

Move beyond such basics and the potential for confusion is enormous.

Improvement

Recently we had the GCSE results issued at a national level by the exam boards in England, Wales and Northern Ireland.

You will probably also have read something about how schools in your area did this year.

A local newspaper I looked at while on holiday last week was applauding an improved GCSE performance at one school in terms of the usual benchmark, "five or more A* to C grades".

But, it added, the results were still far short of the national average, which it gave as 61.2%.

I don't know what its readers concluded from this, but it was gibberish.

We do not know yet what the national average was this year.

Missing

One thing that is hard to convey when the exam results come out - especially with all those shots of individual students getting their results - is that the national statistics show the performance of exam entries, not students.

That is what the exam boards report: the proportions of entries achieving each grade.

So we know, for example, the proportion of entries awarded grade C or above: the 61.2% quoted by that local paper.

But this is very different from the proportion of students who will have five of those - which last year was 50.2%.

The crucial missing element so far is how many exams each student took.

That information has to be collated from each school and has yet to be reported at national levels - for England, the Department for Education and Skills expects to issue it in October.

And it might be that although more exam entries got better grades this year, the overall performance of students has not improved and might even have got worse.
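
To see how the two measures can diverge, here is a small worked sketch - the students and their grades are invented purely for illustration:

    # Why the entry-level pass rate and the student-level
    # "five or more A*-C" rate measure different things.
    # Students and grades below are invented for illustration.
    students = {
        "Ann": ["A", "B", "C", "C", "C", "D"],  # five grades at C or above
        "Ben": ["C", "C", "D", "E"],            # only two at C or above
        "Cat": ["A*", "A", "B", "C", "C"],      # five at C or above
    }
    GOOD = {"A*", "A", "B", "C"}

    # Exam boards count entries: 12 of the 15 grades here are C or above.
    entries = [g for grades in students.values() for g in grades]
    entry_rate = sum(g in GOOD for g in entries) / len(entries)

    # The benchmark counts students: only 2 of the 3 have five good grades.
    five_plus = sum(sum(g in GOOD for g in grades) >= 5
                    for grades in students.values()) / len(students)

    print(f"Entries at C or above: {entry_rate:.0%}")   # 80%
    print(f"Students with 5+ A*-C: {five_plus:.0%}")    # 67%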

Publication

If this is stretching your credulity, look at what happened two years ago.

In the GCSE results published in August 2003, the proportion of entries getting grades A* to C was 58.1%, up 0.2 percentage points on the previous year.

But we subsequently discovered that the proportion of students getting five such grades had fallen - for the first time in the history of the qualification, from 50.2% in 2002 to 49.7%.

You might be forgiven for having missed this because the statistic was not published officially - we had to make a special request.

The exam boards publish the results of GCSEs and vocational qualifications separately - so when we refer to "GCSE results" in the summer we mean just that.

It might come as a surprise that, in the mass of statistics published each year about exam results, nowhere do government education departments actually report what percentage of teenagers get five or more GCSEs at grade C or above.

They report, for example, the percentage achieving such grades in any individual subject, or in a combination of English, maths, science and a modern language.

But when it comes to the benchmark of five or more, they report GCSEs and equivalent grades in other, vocational qualifications lumped together. So do schools.

Incidentally, ministers could easily have undermined those who said GCSE standards must have fallen because more students were doing better in them, by pointing out that, actually, they weren't.

But nowhere that I know of in 2003 did an education minister say "look, the GCSE results have got worse - where's your dumbing down now?" or, last year, "the results are the same as in 2002" (which is what happened).

So what?

So it might turn out that the GCSE performance of students as a whole this year has worsened.

There are signs of this in the exam results. In modern languages, for example, grades improved but fewer students took the exams - so, pundits reckon, weaker students had opted not to do them.

And overall across all subjects grades improved - but there were fewer entries.

There were fewer students too, because of the falling birth rate 16 years ago, but the drop in entries was bigger than that would account for.
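
That selection effect is easy to demonstrate with made-up numbers: if the weakest candidates simply stop entering, the pass rate among entries rises even though nobody has actually done any better. A sketch:

    # Invented marks, pass mark 50: the weakest three opt out in 2005.
    def pass_rate(marks):
        return sum(m >= 50 for m in marks) / len(marks)

    marks_2004 = [30, 40, 45, 55, 60, 70, 80, 90]  # all eight enter
    marks_2005 = [55, 60, 70, 80, 90]              # the same five passers

    print(f"2004 entries passing: {pass_rate(marks_2004):.0%}")  # 62%
    print(f"2005 entries passing: {pass_rate(marks_2005):.0%}")  # 100%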

It will be interesting to see the figures when they do appear.

But does all this matter?

Not if we know what we're getting, but if that local newspaper report I was reading is any guide - in effect comparing apples and pears as well as misunderstanding the national results - there is considerable confusion around.

Statistics inform policy and the deployment of huge sums of taxpayers' money. And in a democracy, it matters that they should be widely understood.

Even if people do not vote in elections, they vote with their feet - using school performance tables, for example.

The tables have long been recognised as a crude measure, and attempts to make them more meaningful risk introducing levels of complexity which are bewildering to those who are not statistical experts - while those who are tend to be exasperated by the simplistic interpretations often put on the results.

Are you confident about "confidence intervals"?
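
For anyone who is not: a confidence interval puts a margin of error around a figure such as a pass rate - and the smaller the cohort, the wider the margin. A rough sketch, using the standard normal approximation and invented school sizes:

    # 95% confidence interval for a pass rate (normal approximation).
    # Cohort sizes are invented for illustration.
    import math

    def ci_95(passed, cohort):
        p = passed / cohort
        margin = 1.96 * math.sqrt(p * (1 - p) / cohort)
        return p - margin, p + margin

    # The same 60% pass rate, very different certainty:
    print(ci_95(18, 30))    # small cohort: roughly 42% to 78%
    print(ci_95(180, 300))  # large cohort: roughly 54% to 66%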

