Can International Rankings Answer Questions About Education?

Ever since the OECD began publishing the results of the Programme for International Student Assessment (PISA), which compares the quality of schooling around the world, political leaders, education experts and others have stressed over how students in their own countries stack up against their peers elsewhere. Education policy is often based not only on the needs of the nation’s students in isolation, but also on how well they are ranked on international measures.

Critics have long warned that treating PISA and similar tests as a major benchmark is counterproductive since they don’t answer the most important question raised in the wake of their publication: Why are the results as they are? Now at least one researcher is attempting to do just that by digging into the results of these exams to figure out where the differences lie.

One such effort, “The Learning Curve,” is published by Pearson (our part owner) and compiled by the Economist Intelligence Unit (our sister company). It uses the existing measures but adds criteria such as graduation rates, adult literacy and the effect of years in school on productivity.

Looking at the results in a new light doesn’t change much at the top. School systems in Finland, South Korea and Hong Kong comfortably retain their perch atop the list even when the figures are examined below the surface. Yet outside the top five, there is a lot of shuffling from sixth place downward.

When other factors were taken into account, Britain ended up in sixth place – a respectable showing given that it ranked only 25th on the original exam.

Both sets of results come with caveats. Citing sampling and other flaws, statisticians have contested the validity of England’s poor PISA score. And the new study must deal with difficulties in comparing graduation rates, and in assessing places such as Singapore and Hong Kong, where the best students may study abroad.

The more developed a country considers itself, the more likely it is to take its PISA showing to heart. This was obvious in France, where a 25th-place result in 2000 led to hand-wringing and a push for education reform. A similar showing in Germany that year prompted changes that have lifted its performance, especially in Saxony and parts of the old East Germany. But even with those improvements, Germany still ranked below Poland in academic quality on the latest round of exams.

Politicians and campaigners relish using international data in support of arguments for particular reforms. But the components of success are mostly too diverse to bear that out. Sir Michael Barber, a former adviser to Tony Blair’s government and now Pearson’s head of education, says the correlations between school structures and outcomes are weak, with no sign they have got any stronger since the mid-1990s.
