New report highlights complexity of building reading skills

Evidence for Learning recently published its evaluation report on MiniLit, a small-group, phonics-based program for struggling Year 1 readers.

The trial found that MiniLit did not have an additional impact on passage reading, but there was evidence of significant improvement in foundational skills, particularly Letter Sound Knowledge, which was sustained even six months after the program ended.

It also suggested greater gains for students who attended 80% or more of the sessions. Overall, the evaluation has a low security rating due to concerns about the test measure used and the level of change the trial was set up to detect, meaning the findings need to be treated with caution.

MiniLit is delivered in school, outside of regular class time, to small groups of up to four students by teachers or paraprofessionals trained as MiniLit tutors. In this trial it was tested with Year 1 students in the bottom 25% for reading in nine NSW public schools.

Half the students were randomly assigned to receive 80 one-hour MiniLit lessons over 20 weeks; the other half received their school’s usual support for struggling readers. All students’ reading levels were tested after the MiniLit program concluded, and the results were compared.

This randomised controlled trial (RCT) was conducted by the Centre for Community Child Health and the Centre for Program Evaluation and commissioned by Evidence for Learning as part of its Learning Impact Fund.

Evidence for Learning Director Matthew Deeble said the trial highlights the complexity of building reading skills and the challenge of measuring their development in the critical early years of reading.

“The primary measure of reading, selected at the outset of this trial, was too ambitious for these students, meaning the findings need to be treated with caution,” Deeble said.

“But in the process, valuable knowledge has been generated about MiniLit’s positive impact on the development of skills that lead to confident reading, and about the likely greater benefit of receiving the full program.”

Deeble said it shows the value of independently conducted trials that publicly report on the differences in achievement between trial and comparison groups, evaluate the steps along the way, and calculate the costs of getting there.

“The evaluation report, its methods and results are available for everyone to learn from. But most importantly, accompanying evaluation resources have been developed for educators,” Deeble said.

“Creating this kind of knowledge is not easy.”

Deeble said this process requires expert evaluators with a range of skills, working in partnership with an education system and its schools, to produce high-quality research.

“However, none of this can happen without the commitment of the program developer who opens their program up to a new level of scrutiny,” he said.