NAPLAN does little to prepare students – expert

A study of 211 NSW English teachers has revealed heavy criticism of NAPLAN’s usefulness and relevance to students’ achievement, lives and future prospects.

The survey, jointly conducted by Dr Don Carter, a senior lecturer in Teacher Education at UTS, Associate Professor Jacqueline Manuel from the University of Sydney and Dr Janet Dutton from Macquarie University, asked teachers to rate their agreement or disagreement with statements about NAPLAN, and gave them a chance to write their own responses.

Common themes emerged from the researchers’ analysis of these responses: that NAPLAN tests added little to teachers’ understanding of students’ literacy levels, that the assessment was a poor measure of student achievement, and that the test detracted from other learning opportunities.

Dr Carter cautioned that the study, which recruited its participants through the NSW English Teachers’ Association’s social media platform, should not be treated as a statistical analysis, but emphasised that it gives voice to professionals well placed to judge the test, bolstering existing evidence on NAPLAN and on standardised testing overseas.

Teachers cast doubt on NAPLAN’s impact

Many of the teachers surveyed in the research disagreed with the statement that NAPLAN provided them with “important information on the literacy skills” of their students.

"I don’t need NAPLAN data to tell me which ones [students] can’t read or spell,” said one teacher.”

Few teachers considered that the tests were “an appropriate way to measure student achievement”.

“The tests capture only a narrow aspect of a student’s capabilities,” said one participant.

Most teachers disagreed that NAPLAN tests “are an important component in preparing students for future employment.”

One teacher commented that the tests “couldn’t be less related to students’ lives if they tried”; another said the tests “have absolutely no relevance.”

Teachers were divided as to whether NAPLAN influenced their teaching practices, though many commented that they felt strong pressure to teach to the test.

“I feel forced to teach to the test,” said one teacher.

Another observed that “Every year we are told not to teach to the test, yet each year we are forced to do blatantly obvious teaching tasks that are ascribed to NAPLAN.”

A large portion of written comments on this topic indicated that “the individual school requires the adoption of teaching/learning strategies based on NAPLAN.”

Several participants suggested they spent class time introducing students to the structure of the test for the sake of student welfare – in the words of one teacher, “because the kids are often anxious about it.”

Validity of NAPLAN ‘highly questionable’

The researchers also found that the external, institutional and other pressures on teachers to ‘teach to the test’ have an adverse impact on many teachers, serving to erode their professional agency and autonomy and engender frustration, anger and resentment.

“They also consider that the validity of the NAPLAN tests and the credibility of the data on student performance is highly questionable,” the researchers wrote.

“Further, despite the extensive government rhetoric surrounding the purpose of NAPLAN and the enormous financial resources directed to and derived from the tests, students’ achievement in literacy has not been lifted.”

The researchers suggest that it is not always clear to parents that the tests are not compulsory.

“Information about how a parent can withdraw a child is buried on the ACARA website.

“Similarly, on the NESA website, it is not made clear to parents what the withdrawal arrangements are.”