The pitfalls of digital NAPLAN

On Wednesday 13 March, more than 1.3 million students in Years 3, 5, 7 and 9 began sitting the National Assessment Program – Literacy and Numeracy (NAPLAN), which assesses young people on their reading, writing and language skills.

The test, previously held in May, is now being held in March to give teachers early results to inform learning and teaching for the rest of the year. This is particularly important as data from 2023 shows nearly 10% of school students require additional assistance to meet the minimum benchmarks in literacy and numeracy.

More recently, a report from the Grattan Institute revealed one in three Australian students cannot read proficiently, calling into question the effectiveness of the ‘whole-language’ approach which has been widely used in schools since the 1970s.

Declining student outcomes have led some to blame NAPLAN for being too narrow a measure of student success and unnecessarily stressful for teachers and students alike.

A 2023 survey found a majority of the general public continue to view NAPLAN negatively, with most (61%) saying there is “excessive emphasis” on the test.

Some experts, like Professor Karen Murcia, a specialist in STEM education at Curtin University, point out that strong scepticism and criticism of the NAPLAN testing system remain in 2024.

“Many voices are emphasising the inefficacy of preparing students for the test, the time and resource allocation involved in schools, and the lack of meaningful outcomes it provides to children,” Professor Murcia told MCERA recently.

“Testing in March and going digital may not be the promised panacea.”

Professor Murcia says it became apparent last year that testing in March and using digital marking still did not get results to schools and teachers in a timely manner, with some arriving as late as October.

“School leaders and teachers questioned what the test results showed that they didn’t already know about their classrooms and children’s learning needs,” she said.

“Arguably, for the data to be useful and to inform teaching, moving the testing to the end of the year could provide an insight into what children achieved in that school year. Informed by the data, teachers would then be able to address children’s learning needs and knowledge gaps when planning at the start of the following school year.”

Professor Murcia notes there are many challenges faced by schools in planning and conducting digital NAPLAN testing, especially those with limited resources and children with low digital capabilities.

“Issues such as access to technology, digital literacy skills, and the time-consuming nature of online testing are challenging schools and taking time away from quality learning for the children who need it most,” she said.

“There are still schools that are not adequately equipped with suitable computers, related hardware, and reliable access to the internet. With an insufficient number of devices, it is a challenge for schools to ensure children have access to the digital NAPLAN tests.”

Further exacerbating the inequality, says Professor Murcia, children have not necessarily progressed far enough along the continuum of ICT capabilities to manage and operate test question functions.

“The numeracy testing requires fluency with a range of mouse and keyboard functions and literacy testing requires adequate typing skills for success. Is NAPLAN now also indirectly testing children’s digital capabilities and is this clouding test results?” she said.

“Arguably, the ability to show working-outs and mathematical reasoning has also been lost through the efficiency of the digital click, drag and drop and selecting from a dropdown option box.”