As a right-of-centre media giant, they view all public data as fair game. They are not in the business of being told how to write their stories.

Unless someone has the time and resources to sue them in court for defamation, it is all about eyeballs and profit margins.

The teaching unions and their affiliates are the most trenchant opponents of standardised data. They penned a joint open letter on the same day demanding News Corp “act responsibly and end the publication of these crude and harmful tables”.

Describing themselves as ‘education leaders,’ the teaching unions and school groups have been NAPLAN opponents ever since the Howard Government developed the testing process two decades ago, which the Rudd administration commendably retained. 

What is remarkable is their latter-day conversion to NAPLAN, so long as growth rather than raw scores is published. It appears they do support this intrusive testing after all, provided it is used to highlight how schools achieve great progress.

So if publishing NAPLAN ‘growth’ data is finally everyone’s happy place, why has it never been done before?

To put it bluntly, someone has to do the work. News Corp clearly have no intention of flirting with metrics that risk upsetting their biggest advertising clients: the elite schools perennially topping their league tables.

The other side of the debate, those behind the open letter, have been so witheringly opposed to NAPLAN they were never likely to engage in gain reporting at the school level.

The occasional Australian academic analysing gain has been politely ignored, or told no ‘seminar series would facilitate the presentation of (their) results’.

There is no gain in being a school gain academic in Australia.

Reporting progress must therefore fall to the state governments that collect NAPLAN results and generate ATARs, or to Canberra if those governments hand the data over by ministerial agreement so the Australian Curriculum, Assessment and Reporting Authority (ACARA) can do it.

ACARA should report overall school growth between test sittings and compare it to co-located and similar schools, all the way through to ATAR and vocational outcomes.

Forcing us to click in and out of each year group’s NAPLAN domain gain graph is both frustrating and pointless.  

Only ACARA can present growth data adjusted for socio-economic background and school academic mix. NAPLAN mean scores are not essential if error bands are provided, and any standard deviation (SD) from prediction is identified by colour.
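Flagging schools by how far they sit from an adjusted prediction could be sketched as follows. This is a minimal illustration only, using hypothetical school data: the ICSEA-based linear prediction, the ±1 SD thresholds and the colour names are assumptions for demonstration, not ACARA’s actual method.

```python
import numpy as np

# Hypothetical school-level data: ICSEA values and mean NAPLAN growth scores.
icsea = np.array([950, 1000, 1050, 1100, 1150, 1200], dtype=float)
growth = np.array([38.0, 42.0, 41.0, 47.0, 52.0, 50.0])

# Fit a simple least-squares prediction of growth from ICSEA.
slope, intercept = np.polyfit(icsea, growth, 1)
predicted = slope * icsea + intercept

# Express each school's deviation from prediction in standard deviations.
residuals = growth - predicted
z = residuals / residuals.std(ddof=1)

def band(z_score):
    """Colour-band the deviation: green above +1 SD, red below -1 SD, grey otherwise."""
    if z_score > 1.0:
        return "green"
    if z_score < -1.0:
        return "red"
    return "grey"

colours = [band(v) for v in z]
```

A reader can then scan the colours rather than the raw means, which is precisely why error bands plus colour can stand in for mean scores.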

One example of colour-coded analysis capturing relative score shift over time is this year’s NAPLAN comparison with 2023 data for the same year groups in Brisbane’s lower-fee TAS and single-sex GPS schools: not a great result for schools without female students.


Figure 1: The two-year mean NAPLAN shift for 2025 Years 5, 7 and 9 by school sector (SDs since 2023).

Student mix matters more than schools

Another option comes from recent work in Queensland supervised by Professors John Hattie and Mark Wilson. We examined student percentile shift between NAPLAN testing and ATAR across 22 independent secondary school cohorts.

The substantial student-level variation (below) is lost when My School only reports school means.
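As a rough sketch of what student-level percentile shift means here, and of how a school mean can mask it, consider the following. The scores are hypothetical and the simple within-cohort ranking is an assumption for illustration, not the study’s exact model.

```python
import numpy as np

# Hypothetical cohort: each student's best NAPLAN reading/numeracy mean and final ATAR.
naplan = np.array([480.0, 520.0, 560.0, 600.0, 640.0])
atar = np.array([62.0, 75.0, 70.0, 90.0, 96.0])

def percentile_ranks(scores):
    """Rank each score as a percentile within its own cohort (0-100)."""
    order = scores.argsort().argsort()  # 0-based rank of each student
    return 100.0 * order / (len(scores) - 1)

# Student-level shift: positive means the student climbed the cohort
# between NAPLAN and ATAR; negative means they slipped.
shift = percentile_ranks(atar) - percentile_ranks(naplan)
```

In this toy cohort the mean shift is exactly zero even though individual students move up or down by 25 percentile points, which is the spread a school-mean report erases.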


Figure 2: Student percentile shift between mean best NAPLAN reading/numeracy and ATAR, relative to their baseline NAPLAN percentile.

We identified a host of NAPLAN factors associated with ATAR outcomes that differ according to student NAPLAN baseline.

When similar-NAPLAN students were compared across different-ICSEA independent schools, we found the higher gains for top students at elite schools to be entirely due to smaller ATAR slips in their underperforming students.


Figure 3: Student growth varies by school ICSEA. Students below the 7th NAPLAN decile doing an ATAR fare better at lower-ICSEA schools.

The reverse was the case for less-academic students in ATAR streams, whose gains were higher in lower-SES than elite schools (black fitted line above).

This supported earlier work on the ‘big fish in a small pond’ effect, suggesting that student mix and self-efficacy may matter more than getting into the best schools.

Academic selection is rife

There is vigorous competition for top academic students in cities. This ‘cherry-picking’ intensifies as the commuting time to a more elite school falls.

School ecosystems appear to operate like football league relegation zones, with both top and bottom-scoring NAPLAN students more likely to change schools.

Whether due to family choice or school management, this appears to drive the differences in school scores.

Preference for numeracy

The independent schools studied appear more likely to enrol, retain, and ATAR-stream students with strong NAPLAN numeracy than students with similar-strength reading comprehension.

Stronger numeracy students gained more between NAPLAN tests, were more likely to do an ATAR, and achieved a higher ranking than similarly strong readers. 

This raises geographic equity questions about whether devaluing literacy in our ATAR system is skewing teaching effort.

Predicting a higher ATAR

We identified two ATAR-predictive features in NAPLAN reports for parents to check. First, the trend between a student’s NAPLAN results in Years 7 and 9 was likely to continue through to ATAR.

Second, having increasingly different NAPLAN numeracy and reading scores was worth up to four ATAR points more than having similar scores.

Reporting NAPLAN and ATAR as growth indicators offers exciting possibilities. We can’t blame the media for reporting what ACARA currently posts.

The final years of secondary school should not be a ‘black box’ where the tertiary ranking delivers shock, embarrassment or disappointment.

Australia has a world-standard school system. Funding is not patchy, and we have to call time on excuses like money or staff shortages for below-par results, when schools in comparable circumstances find a way to deliver.

A third of Australian families pay for independent schooling, and growth data means we can calculate the cost-benefit of paying higher tuition fees.

Trapped within catchments and lacking school choice, government school parents are equally deserving of appropriately adjusted growth data.

They need to be confident that their family members are making social and academic progress towards being prosperous and productive adults.