Summary results for this year’s NAPLAN assessment were due to be released on Wednesday but have been delayed amid concerns that the online test results cannot be compared with the results from the paper tests.
On Wednesday, the NSW Secondary Principals Council (NSWSPC) and Australian Secondary Principals’ Association (ASPA) said they shared these concerns.
“We’ve made it clear that we feel the 2018 comparable data will be meaningless to schools simply due to the fact that so many variables are involved,” NSWSPC president Chris Presland told The Educator.
ASPA president, Andrew Pierpoint, said NAPLAN data is currently used as an “inaccurate measure” to compare schools and has “lost its original focus as a diagnostic learning tool”.
Now academics from across Australia have weighed into the debate, with some warning that the confusion has the potential to render the tests invalid.
“ACARA has not yet released any of the technical information needed to assess the performance of the online version,” associate professor James Ladwig from the University of Newcastle’s School of Education said.
“As a consequence, it is impossible to know the degree to which the test is valid and equivalent to older versions.”
Associate professor Jihyun Lee from the UNSW School of Education specialises in large-scale standardised testing, including NAPLAN and PISA. She said that if ACARA was to release the NAPLAN results in two separate forms, “this would suggest a failure to ensure comparability”.
“An independent body of measurement experts should review the most recent NAPLAN data,” associate professor Lee said.
Transparency needed to build trust
Dr Steven Lewis, the Alfred Deakin Postdoctoral Research Fellow at Deakin University and an expert on education policy, acknowledged that NAPLAN can help educators and policymakers understand schooling at the system level.
However, he noted that “any lack of statistical comparability, be it perceived or actual, between the online and pen-and-paper tests jeopardises its utility as a trusted means of comparison.”
“Such a lack of comparability could mean that comparisons cannot be made between schools using different modes of testing in 2018, or between a single school’s year-to-year performance if the school has piloted the online delivery format,” he said.
“My research has shown the profound impact of NAPLAN data, and comparisons of these data, on how schooling is understood and practised by teachers, schools and systems.”
Dr Lewis said that unless there is transparency around the statistical procedures and experts used by ACARA to make the data ‘valid and comparable’, there cannot be complete trust in the evidence and comparisons these data provide.
“To this end, I would call on ACARA to release this information to help forestall what is arguably a growing mistrust, amongst the public and education professionals alike, in relying on NAPLAN and standardised test data to inform teacher practice and student learning.”