BBC News
Last Updated: Thursday, 15 January, 2004, 17:39 GMT
How 'value added' works
There has been much talk about how well schools in England have done on the new "value added" measure introduced last year.

But how is it calculated - and what do the results actually mean?

The gist is simple: it indicates how well a school has brought on pupils from one test level to another.

To say the full calculation is complex is an understatement - but in summary it is fairly straightforward:

Each pupil's performance in a set of tests is compared with the median - the middle performance - of all pupils nationally who had a similar result at the previous test level.

Someone who is clever to start with is compared with other clever children - so the result does not depend on how well they do in outright terms, but on how much they have improved, whatever their ability.
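
As a very rough illustration - the department's actual point scales and prior-attainment groupings are more elaborate, and all the names and numbers below are invented - the per-pupil step can be sketched in Python:

```python
from statistics import median

def pupil_value_added(prior_score, current_score, national_results):
    """A pupil's value added: the gap between their own score and the
    national median score of pupils who started from the same prior
    score. Point scales, names and numbers here are hypothetical."""
    peers = [score for prior, score in national_results if prior == prior_score]
    return current_score - median(peers)

# Invented (prior Key Stage 2 points, Key Stage 3 points) pairs nationally
national = [(27, 33), (27, 34), (27, 35), (33, 39), (33, 40)]
print(pupil_value_added(27, 36, national))  # 36 - median(33, 34, 35) = 2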

This means a special school for children with learning difficulties can be compared with a grammar school that selects children by academic ability.

Indeed special schools, often small and giving students intensive support, score very highly on value added - 27 had better scores than the best mainstream school in the 2003 tables.

In secondary schools there are currently two measures:

  • between the Key Stage 2 tests at the end of primary school (roughly age 11) and Key Stage 3 tests (age 14)
  • from Key Stage 3 to GCSE/GNVQ exams two years later

Eventually there will be an "all-through" measure from Key Stage 2 to GCSE.

But because it involves tracking each year group of pupils, it has taken five years to produce the first results.

A pilot has been run this year with a view to nationwide publication in the 2004 performance tables.

Each pupil ends up with a score of plus or minus a number of points.

Results of all the pupils in a school are averaged and added to 100, which is the middle level nationally - so you end up with scores such as 95 (below average) or 105 (above).
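
A minimal sketch of that final step, with invented pupil scores:

```python
from statistics import mean

def school_value_added(pupil_scores):
    """School score: the mean of its pupils' plus-or-minus point
    scores, re-centred on the national middle value of 100."""
    return 100 + mean(pupil_scores)

print(school_value_added([2.0, -1.0, 0.5, 3.5]))  # 101.25 - a little above average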

Among the other associated data there is a "coverage indicator".

This is the proportion of students in a school included in the calculation - necessary because, for various reasons, both sets of results are not always available for everyone.

If the coverage falls below 50% the result is not published.
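
That suppression rule is easy to state in code - the function name is ours, but the 50% threshold is the one the tables use:

```python
def publishable_score(pupils_included, pupils_on_roll, score):
    """Return the score only if at least half the pupils had both
    sets of results and so could be included in the calculation;
    otherwise suppress it, as the published tables do."""
    coverage = pupils_included / pupils_on_roll
    return score if coverage >= 0.5 else None

print(publishable_score(60, 100, 101.3))  # 101.3 (coverage 60%)
print(publishable_score(40, 100, 101.3))  # None (coverage 40% - not published)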

Differences

On the face of it, a simple range of numbers with which to compare schools.

But this is where it gets really complicated, and why the new measure is proving controversial.

Although the scores are calculated to one decimal place, making them appear very precise, the Department for Education and Skills says differences of even a few points are not statistically significant.

To muddy it even more, the significance that can be attached to the differences varies, depending on how many children were involved in each school - the "cohort", as educationists like to call it.

The smaller the number, the less reliable the outcome.

Confidence

The education department has taken to issuing warnings which tend to get lost in the broad media coverage.

For example, it says that, "as a guide", for schools with 50 or more students, scores of 97.5 to 101.9 are broadly average, while for 100 or more students, scores of 98.1 to 101.3 are broadly average.

Statisticians call these ranges "uncertainty" or "confidence" intervals.
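
The department's exact method is not spelled out in the tables, but the effect can be illustrated with a standard textbook-style 95% interval, whose margin of error shrinks with the square root of the cohort size - the pupil-level spread used below is an assumption:

```python
import math

def confidence_interval(score, cohort_size, pupil_sd=3.0):
    """Illustrative 95% confidence interval around a school's score.
    pupil_sd, the spread of individual pupil scores, is a made-up
    figure; the point is that the interval narrows as cohorts grow."""
    margin = 1.96 * pupil_sd / math.sqrt(cohort_size)
    return round(score - margin, 1), round(score + margin, 1)

print(confidence_interval(100.0, 50))   # (99.2, 100.8)
print(confidence_interval(100.0, 100))  # (99.4, 100.6)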

Harvey Goldstein of London University's Institute of Education says: "Mention of such intervals is confined to a technical note and even this gives the minimal information so that only an experienced statistician would be able to apply the information to schools generally.

"The current system of publishing performance tables is still very flawed," he says, and "value added" in effect does not do what it says on the can.

Other awkward questions seem to arise.

For example, girls' schools dominate the top of the value added league table.

If the value added measure really shows how much a school brings on its pupils, does that mean teachers are more favourably disposed towards helping girls?

Or is it just that - as other evidence suggests - girls are the better learners? In which case does the value added measure really show what it is supposed to?




VIDEO: The BBC's Sue Littlemore: "Of the top 100 schools 76 are non-selective"

Scotland, Wales and Northern Ireland do not publish tables.

