In September, a study by the Australian Education Union (AEU) found that three quarters of teachers say that NAPLAN is ineffective as a method of assessing students.
The ‘State of Our Schools’ survey found that 75% of teachers and 73% of principals do not believe NAPLAN is effective for school comparison, and nearly as many (74%) do not believe NAPLAN is effective for measuring school performance.
The finding follows the release of an independent review of NAPLAN by Victoria, NSW, Queensland and the ACT which proposed sweeping changes to the writing assessment and a greater focus on critical thinking and science.
However, the debate over what should be changed about NAPLAN, and whether anything should be changed at all, has divided Australia’s education sector.
The Educator recently invited three prominent principals to share their views about this in an online panel discussion.
Greg Miller, principal of St Luke’s Catholic College in Sydney, said NAPLAN’s original purpose, providing baseline data on student progress in literacy and numeracy, has over time been “hijacked by a league table approach of schools”.
However, Derek Scott, principal and CEO of Haileybury, said NAPLAN provides very helpful data, now spanning 12 years, for tracking the progression of student performance longitudinally.
“I don’t disagree with Greg that the component on the MySchool website has been taken in a different direction. The problem is not with NAPLAN, it’s with how we look at it on the MySchool website,” Scott said.
“The government has already made some good inroads when it comes to pulling that back. If you go to the MySchool website now, you can’t go on to it without signing your life away if you were going to use that data to produce league tables”.
Henry Grossek, the panel’s third principal, was more critical of the assessment’s track record.

“As a tool that’s been used to improve student performance, it’s been monumentally unsuccessful,” Grossek said.
“From that angle, is it fit for purpose?”
However, Scott said NAPLAN shouldn’t be blamed for failing to improve school performance.
“NAPLAN is an assessment that gives us a measure. The measure was always meant to be used by policymakers to set the direction and the policies that can then improve the outcomes,” he said.
“What we’ve got is the measure. What we haven’t had is the flow on from that to look at what’s working and therefore what the policy outcomes might be that can improve our assessment, so I wouldn’t want to throw out the assessment that gives us that longitudinal measure. We should be looking deeper at what works”.
Scott said NAPLAN data from across Australia shows that 5% of schools are outperforming relative to their socio-economic background and level of disadvantage.
“That five per cent of schools are the ones we should be looking at to see what we can take from them and use in a broader policy setting to improve literacy and numeracy outcomes for everyone”.
Grossek said people have overestimated NAPLAN’s value as a tool to be used in schools to improve school performance.
“Yes, five per cent of schools outperformed relative to their SES, but ninety-five per cent didn’t, so it depends which stats you want to look at from that point of view,” he said.
“NAPLAN has become the holy grail of all school assessment tools, but it’s just one of a suite of measures, and I’ve always felt the best value of NAPLAN is as a diagnostic tool at a school level, but it has been hijacked”.
Grossek said the bigger challenge is how to “un-hijack” the assessment.
Miller said he believes the greatest value of NAPLAN is in the feedback specific to each individual student.
“This feedback provides deep insights into very granular aspects of a child’s ‘one moment in time’ in the areas of literacy and numeracy,” he said.
“So, from that perspective, when used in school to complement other formative and summative assessment processes, NAPLAN can be very useful. However, it has been misused over a long period of time to suit other people’s agendas rather than being focused on the learning growth of each individual student”.
*This is part one of a two-part series exploring NAPLAN as a system of student assessment.
The previous panel discussion, Maintaining student engagement during COVID-19, can be viewed here.