The UK as a whole does not feature in the latest international test results achieved by 15-year-olds.
The main reason is that not enough schools in England volunteered to take part, rendering international comparison invalid.
The tests in maths, reading and science were organised by the Organisation for Economic Co-operation and Development.
Results for Scotland, Wales and Northern Ireland separately are contained in annexes to its report.
On Tuesday the OECD published the initial findings of its latest three-yearly Programme for International Student Assessment (PISA) study, carried out in 2003. More than a quarter of a million students took part, representing about 23 million in the participating countries.
According to the UK National Statistics office, which was responsible for the UK data, independent samples in Scotland and in Northern Ireland did meet the PISA technical standards and are comparable with the other countries' results.
Commenting on the Northern Ireland results, it said students there scored significantly higher than average on the mathematical scale, achieving 515 points.
Only Finland and Korea did statistically "significantly" better than Northern Ireland in reading, where it scored 517.
Only Finland and Japan did "significantly" better in science than Northern Ireland, which scored 524.
The OECD averages in the three subjects were 500, 494 and 500 respectively.
National Statistics said results for Scotland would be published by the Scottish Executive alongside other international data next week.
The Scotland results are reported in annexes to the 476-page report from the OECD.
They show that Scotland's students did better than those in Northern Ireland in maths, but less well in reading and especially in science.
Results for Wales are also there - but based on a tiny sample of just 118 students. They did less well than both Scotland and Northern Ireland in all three test areas.
On the OECD's behalf National Statistics invited a random sample of state and independent schools in the UK to take part in the study.
Within those, representative samples of students were also asked to participate.
One of the OECD requirements was that the initial response rate among schools should be at least 85%. In the UK - because of the shortfall in England - it was 64.3%.
The rules allowed for replacement schools then to be asked - with the response rate requirement raised to 95%.
Only 77% was achieved. At the student level the response rate was 78% - 77% in England alone - below the requirement of 80%.
The sample that had been obtained was analysed by statisticians for evidence of any significant bias, which potentially could be allowed for.
"The PISA Consortium concluded that it was not possible reliably to assess the magnitude, or even the direction, of this bias and to correct for it."
So it was not possible to say that the United Kingdom's sample results reflected those of the national population with the required accuracy.
What went wrong
National Statistics said it managed to get results from 176 schools and 3,756 students in England.
The failure to persuade enough schools to participate had occurred despite second and third letters encouraging them to do so, personal contacts by Department for Education and Skills advisers and PISA "advocates", and support from the education unions.
There was also a joint appeal from the department and the OECD, with an offer to reimburse schools for their time, and a longer-than-normal range of dates during which the tests could be sat.
"There was a particular problem with Year 11 pupils who tend not to return to school after their exams," National Statistics said.
"This led to a number of schools which did participate being excluded from the final sample because they could not supply enough Year 11 pupils."
The mean performances that were achieved in England (UK in brackets) were 507 (508) in maths, 506 (507) in reading and 519 (518) in science.
The OECD takes the view that these cannot reliably be compared with other countries or with the UK's performance in the first PISA round in 2000.
It has however calculated likely ranges for the correct UK results:
- Maths: 492 to 524
- Reading: 491 to 523
- Science: 502 to 534
The UK's failure to provide enough data for the study was criticised by teachers' representatives and characterised by opposition parties as shocking and even suspicious.
In the previous study, in 2000, the UK had above-average scores and was fourth for science.