Saturday, 5 October, 2002, 01:23 GMT 02:23 UK
A-levels: Time for a change
By Mike Baker

After the past few weeks we probably all think we know what is meant by the phrase "A-level fiasco".

You know a story has caught the public's attention when it can be reduced to a catchy headline.

But do we really yet know what this "fiasco" is all about?

We have had two major inquiries, one sacking and now a potentially huge re-grading exercise.

Yet most of us struggle to pin-point exactly where things went wrong.

Indeed we don't even know for certain that something did go wrong. The exam boards still insist that the grades given this summer were a proper reward for students' abilities and effort.

It is possible that the re-grading process will simply re-confirm the summer's results. Unlikely perhaps, but possible. Then where would we be?

Uncertainty

The head of the independent inquiry, Mike Tomlinson, had the toughest of tasks: cutting through a Gordian knot of exam-grading mystique against the tightest of deadlines.

Mike Tomlinson has a tough job
His findings on the long-term causes of the exam problems are very clear. But on his own admission he has not yet been able to pin-point exactly what went wrong this summer.

So even he cannot yet say for sure whether the re-grading exercise will lead to changes in students' grades.

So what are these long-term causes which made A-levels so fragile that, in Tomlinson's words, they were "an accident waiting to happen"?

The early days

First, there was the impatience of the government to introduce the new A-levels. The then education secretary, David Blunkett, allowed only a small-scale pilot of the AS-level and there was no trial run at all of the A2, the second part of the new-style A-level.

Both Blunkett and the regulatory Qualifications and Curriculum Authority (QCA) overruled warnings from the exam boards, which had asked for an extra year to implement the new exams.

The changes were brought in under David Blunkett
The second underlying problem was the long-term failure to clarify what exam grading is actually about.

At its root, says Tomlinson, is a misunderstanding of the difference between maintaining standards from year to year and - something quite different - maintaining roughly the same proportion of students getting particular grades.

Tomlinson says this confusion between maintaining standards and maintaining pass rates exists at "all levels of the system" - in other words, even within the upper echelons of the exam boards themselves.

Mysterious vacuum

For these reasons, Tomlinson prefers to talk about "systemic failure" rather than to apportion individual blame.

But this still leaves a mysterious vacuum at the heart of this whole affair: what was the trigger which turned this underlying fragility in the A-level system into a crisis this summer?

This is where it gets difficult as we cannot yet say how wrong this summer's results were.

Moreover, even if the original grade boundaries are restored in every case, we will not see an end to the puzzlement and fury over candidates who expected A grades in certain units but were actually failed.

It is mathematically impossible for the re-grading to turn a U grade (unclassified) into an A or a B. The most it could do, says Mr Tomlinson, is to go from a U to a C. And even that could only happen "in extremis".

Schools' anger

Yet many schools are adamant that top quality work has been failed. They point to many students with As in five units and a U or an E in the sixth. Something does not add up.

Whatever the exam boards did at the last minute to adjust grade boundaries, the movement cannot have been big enough to account for all the disappointment expressed by candidates and schools.

One factor is still likely to be the failure of teachers and students to understand exactly at what level the A2 was pitched.

This does not reflect badly on them. How could anyone be clear about the standard of the A2 when, quite obviously, the exam boards were not either?

Without a trial run, everyone was guessing how to assess the second part of the A-level which, although worth 50% of the marks, was supposed to be at a higher level than the first half.

And it is this stabbing in the dark which lies at the heart of all this.

Previous standards

Every year exam boards try to set grade boundaries at a level which both maintains standards and roughly stays in line with the pass rates of previous years.

This year they had no previous A2 standards to compare with. They could, of course, make comparisons with the old-style A-level. But that was a different animal.

So instead they relied more heavily than usual on statistical comparisons with previous years' pass rates.

They were encouraged to do this by the pressure they perceived to be coming from the QCA. Indeed they were threatened with a public inquiry if the pass rates differed significantly.

Fewer candidates

But at this point the QCA and the boards seem to have lost sight of the fact that the proportion of students going on to the A2 was lower than the proportion taking the full A-level in the past.

The reason for this was quite simple. Students who had not done well in the AS sensibly decided not to go on to the A2.

This filter process at the half-way stage meant a higher pass rate was inevitable.

The higher rate could also have been justified as quite compatible with maintaining standards, if only observers had looked beyond the overall pass rate to the number of candidates actually passing the A-level.

Amid all the fuss over this summer's record pass rate, few observers noticed that considerably fewer A-levels were awarded this year than last. Suppose, purely for illustration, that 100,000 candidates used to sit the full A-level and 90,000 passed; if only 85,000 carried on to the A2 this year and 82,000 of them passed, the pass rate would rise even though fewer A-levels were awarded.

Now, if the re-grading process lifts grades further, those who criticised an exam where almost every entrant passed had better prepare themselves for a bigger shock.

The re-grading process could take us very close to a 100% pass rate.

And that, inevitably, will take us back to the much more fundamental questions about what sort of examination system we want.

Swimming test approach

Do we want exams whose prime purpose is to place students in rank order, so we can distinguish the brilliant from the very bright, and the very bright from the averagely able?

If so, we should go back to pre-1988 O and A-level marking arrangements where the proportion of candidates getting an A or an E was pre-ordained and stayed the same from year to year.

On the other hand, if we simply want to know whether our students have reached a certain standard of achievement then, as with the 50-metre swimming test, there cannot be any quotas.

If one year every candidate can swim then the pass rate is 100%. If in another year no-one can swim then it is 0%.

There can be no annual hand-wringing about how the water has become more buoyant or the pool shorter.

By and large, our current exam system favours the latter approach: no fixed quotas for pass rates. If you can do it, you get the grade. On this basis, previous pass rates are irrelevant.

But at the same time the idea of quotas has never been completely abandoned.

Looking backwards

Every summer, exam boards, ministers and the public continue to look over their shoulders to compare this year's pass rate with last year's.

They did that this year. The question is whether they took too much notice of what they saw in their rear-view mirrors and not enough of the evidence of this year's scripts in front of them.

In fact, it looks as if only one exam board spent too long looking backwards. The last-minute grade boundary changes at Edexcel and AQA were relatively few and relatively small.

Ron McClone's board, OCR, is at the centre of the row
But the Oxford, Cambridge and RSA Examinations board (OCR) focused more heavily than others on previous pass rates.

They thought they were being told to do so. As Tomlinson observed, OCR's examiners believed they were expected to "have very strong regard" to the grade distributions from the previous year.

This is why the vast majority of the A-levels being reviewed come from the OCR board.

And, since the independent schools mostly opt for OCR exams, this too explains why the initial hue and cry came from the independent sector.

This might seem to imply guilt on OCR's part. But in fact the board stayed within the rules of the code of practice which governs all exam boards.

This requires them to look at both current scripts and statistics from the past. Their error was in the weight they gave to the latter.

Where now?

So where do we go from here? Most importantly, we need to decide what A-levels are for. They cannot continue to bear so many different burdens.

At present, A-levels try to do all of the following: measure students' mastery of the syllabuses, filter students into a pecking order for university admissions, provide a guide to the relative performance of schools, and act as some sort of "gold standard" or permanent measuring stick for standards over time.

These functions now need to be disentangled. Perhaps universities will have to assess candidates in different ways: either by a US-style SAT entrance exam or by giving greater weight to interviews.

Or maybe we should just accept that attempting to maintain standards over time is a mystical alchemy that cannot be mastered.

The truth is you cannot have one exam which both tries to measure the same standards year by year and provides a pecking order of candidates. It is one or the other.

Questions about whether to have a more independent body to replace the QCA or to "nationalise" the exam boards are secondary to the more fundamental question about the function of A-levels.

Hindsight is, of course, a wonderful aid to vision. There may be valid excuses for what happened this summer, but for the students' sake lessons must be learnt.


We welcome your comments at educationnews@bbc.co.uk although we cannot always answer individual e-mails.
