Casting an analytical eye over the comparative popularity of PISA, TIMSS and PIRLS as large-scale international assessments over the past 15 years, John Jerrim from the UK’s UCL Social Research Institute found that PISA has attracted significantly more attention online – but without any methodological reason for the gap.

“It is basically marketing,” he tells EducationHQ.

“The OECD has a great media machine. And Andreas Schleicher is a great speaker/salesman. But that is basically it.

“In terms of quality, PISA is no better than other international studies like TIMSS or PIRLS. A case could even be made that other studies are superior.”  

The study provides the first quantitative evidence of PISA’s comparative popularity, a pattern that held true across almost every country.

“Anecdotally, we knew PISA was getting more attention. But there had been no attempt to ever measure it, and by how much. (And how it varied across countries). So I wanted to be one of the first studies to try and do this,” Jerrim says. 

The paper calls out PISA’s disproportionate influence on public education debate.

Drawing on Google Trends data, Jerrim analysed online interest in the release of the TIMSS 2015 results versus the PISA 2015 results, which were published just a week later, finding a “clear, striking difference between the two”.

“In terms of magnitude, globally, the release of the PISA results received around 10 times more attention than those from TIMSS.

“This is despite TIMSS covering both primary and secondary education (PISA is secondary only) and the focus of PISA in 2015 being science (one of the subjects covered within TIMSS),” the research states. 
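The comparison itself rests on Google Trends’ relative search-interest index. Purely as an illustration, the short Python sketch below uses the unofficial pytrends library to compare worldwide interest in the two acronyms over the late-2016 window in which both sets of 2015 results were released; the library, keywords and timeframe are assumptions of this example, not details taken from the paper.

```python
# Illustrative sketch only (not the paper's exact method): compare worldwide
# Google search interest in "PISA" vs "TIMSS" around the late-2016 releases
# of the 2015 results. Requires the unofficial pytrends package.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)

# Approximate window covering both release dates (the two sets of 2015
# results were published about a week apart in late 2016, per the article).
pytrends.build_payload(kw_list=["PISA", "TIMSS"],
                       timeframe="2016-11-01 2017-01-31")

# Google Trends returns relative interest (0-100), scaled across both terms.
interest = pytrends.interest_over_time()

if not interest.empty:
    # Guard against a zero denominator if one term registers no interest.
    peak_ratio = interest["PISA"].max() / max(interest["TIMSS"].max(), 1)
    print(f"Peak search interest, PISA vs TIMSS: {peak_ratio:.1f}x")
```

One practical caveat: the raw keyword “PISA” also captures searches for the Italian city, so a real analysis would need more careful term selection than this toy example.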

Jerrim offers some speculation on what’s behind PISA’s widespread appeal.

He proposes that one reason is PISA’s focus on the ‘real world’ application of skills, compared with TIMSS and PIRLS’ focus on achievement in an international curriculum.

Successful branding may well be behind it too, the researcher suggests.

“Related is PISA’s attachment to the OECD’s global brand, and whose creator (Andreas Schleicher) has played a very prominent role in disseminating findings to policymakers and the media.

“Indeed, one could argue that the reason why PISA has received so much more attention than other ILSAs is because the OECD purposefully set out to do so, branding and marketing the study in such a way to maximise media, public and policy attention,” the study elaborates.

The skewed attention given to PISA on the world stage could be unhelpful, if it leads to a situation where we have a “single study driving global education debates, rather than evidence from across multiple ILSAs being used holistically”, the paper warns.

Jerrim suggests that the IEA should “try to do even more” to raise the profile of TIMSS and PIRLS so that they are seen to be on an equal footing with PISA.

“It would also be beneficial for the OECD to be part of such endeavours as well, highlighting the complementary evidence from across all ILSAs – not just PISA – when they engage with the media and attempt to influence education policy debate,” he adds.

The research also identified where interest in PISA has been comparatively low, with Australia among those countries that paid the least attention to its results.

Other Anglophone countries in this group included the Republic of Ireland, New Zealand, United Kingdom, Canada and the US.

“Despite participating since 2000, and with some experiencing sharp declines in performance (eg. Australia) or otherwise disappointing scores (eg. United States), PISA has received less attention in these countries than elsewhere,” the study says. 

This could be because these countries have several other competing sources of educational data to draw on, including surveys, administrative records and standardised assessments, Jerrim offers.

Yet, even for PISA, global interest seems to have reached its peak.

“When PISA first came out, it to some extent was new and exciting,” Jerrim says.

“But, largely, it’s now telling us the same thing each time (or often very similar things). The OECD keeps adding bells and whistles to try and keep it interesting, including measuring different domains.

“But it’s like when a TV series adds a new character or rule to a quiz-show game – it’s often not really that game changing in the grand scheme of things.”

Interestingly, while the relative attention PISA receives continues to be greatest within the OECD, the main ‘market’ for TIMSS now seems to be in lower and middle-income contexts, such as Oman, Palestine and Bahrain, the research flags.

Globally, interest in ILSAs seems to have peaked in 2012 and has been on the decline since.

There is substantial cross-country variation, Jerrim clarifies, with increasing interest in some countries over the last decade (such as Sweden and Turkey) offsetting some of the fall in others (such as Japan and Germany).

The influence and high profile of international assessments over the last 25 years cannot be overstated, Jerrim argues.

Their data has translated into significant political and policy impact in several countries, he notes, including motivating or justifying major curriculum and assessment reforms in Australia, Wales, Ireland, Hungary and Mexico, amongst others.

Off the back of the study, the researcher would now like to see a “better appreciation for the other studies that are out there”.

“...smart people will always look across multiple pieces of evidence when they are forming their judgements,” he says.