The headlines are everything we would expect them to be – full of panic. Most reporting is focused on the number of Australian students not meeting the new proficiency standards, with talk of “failure” and “debacles”.
The numbers certainly don’t look great, but should we be worried?
Changes to NAPLAN
NAPLAN was introduced in 2008 and is an annual test of all Australian students in years 3, 5, 7 and 9. It aims to see whether students are developing basic skills in literacy and numeracy.
The overall results were released on Wednesday. Individual student reports will go home during term 3, via schools.
Earlier this year, NAPLAN underwent significant changes. These changes included a shift to online testing, moving the testing dates forward and new proficiency standards.
At the time of the announcement, many education experts warned that 2023 results might be lower than usual.
Many pointed to the shift from ten proficiency bands to four achievement levels (“needs additional support,” “developing,” “strong” and “exceeding”). This likely explains a lot of what we’re seeing today. It also means we cannot compare this year’s results with previous results.
The shift to online testing
The shift to online testing may also have had a significant impact on results.
Disparities in access to technology can affect how students perform on the test. Students who regularly use computers and the internet at home are likely to feel more confident taking an online test. Students without that access may struggle with basic computer skills. This can lead to mistakes that have nothing to do with numeracy or literacy.
The change to the testing window from May to March also meant schools had less time to prepare students for NAPLAN in 2023. In the long run, this could have a positive impact on education: less time devoted to “test prep” or “teaching to the test” frees up time for more authentic learning activities. But this year, the change caught schools off guard, which may have affected student performance.
We also shouldn’t forget about the impact of COVID. It is hard to estimate all the ways students have been affected by the pandemic. We can assume these effects will be felt for years to come, and we should continue to interpret NAPLAN results with this in mind.
Disparities and funding
What we should be worried about is the clear disparity between Australia’s most vulnerable students and their peers.
Like every other year, NAPLAN results show significant gaps between Indigenous students and their peers. About one-third of Indigenous students “need additional support”, compared to one-tenth of students overall. Some 50% of students in the most remote regions of Australia also “need additional support”.
This is not a new concern, and one that experts have raised for many years. While politicians often blame schools and teachers, the real problem is inequitable funding. Public schools teach most of the students who require additional support, yet they are not adequately funded to do so.
Proceed with caution
We must interpret this year’s NAPLAN results with caution. Our instinct might be to panic, but the reality is that significant changes to the test have driven these results. It may take a few years before we can make meaningful sense of overall progress and change.
We can also look to some experts’ optimism about the changes. They say the new achievement levels and earlier testing dates will eventually lead to simpler and more useful results. They hope this means better communication between schools and families, as well as more time for schools to act.
Importantly, we should not interpret this year’s results as an indictment of schools. Rather, we should press governments to fully fund schools to the level they themselves have said is necessary. This year’s results leave no question about the urgency of equitable funding.
This article is republished from The Conversation, the world’s leading publisher of research-based news and analysis, a unique collaboration between academics and journalists. It was written by Jessica Holloway, Australian Catholic University.
Jessica Holloway receives funding from the Australian Research Council.