
Exclusive

Doubts over Reading Recovery program data

A landmark study into a popular literacy program has been criticised for omitting data in its final evaluation report.

Teacher Jessica Purdy reads with Preston Tanevski, a Year 1 student at St John's Catholic Primary School in Dapto, NSW, where the Reading Recovery program is used. Picture: John Feder

A landmark study into the long-term effectiveness of the popular Reading Recovery literacy program has come under fire, after it emerged that data from a cohort of students was omitted from a final evaluation report.

Global consultancy KPMG recently released a report on the impact of Reading Recovery in Britain, claiming children who took part in the program received a significant boost in their senior school exams 10 years later.

The result of the study was seized upon by supporters of the program — which is widely used in Australia, particularly in NSW where it has been used in about 60 per cent of schools — as justification for continuing investment.

However, a Sydney think tank has questioned the report’s veracity, highlighting the omission of data from a comparison group of students who had not received the Reading Recovery intervention.

Centre for Independent Studies senior research fellow Jennifer Buckingham said the selective release of data artificially inflated the achievement gap between students who received support and those who did not.

“The publication of misleading data is not an esoteric academic issue,” Dr Buckingham said.

“Governments and schools have spent … many millions on Reading Recovery, bolstered by research findings that purport to show a high level of effectiveness.”

Launched in 2005, the KPMG Foundation-backed “Every Child a Reader” project aimed to boost literacy through the rollout of Reading Recovery in schools.

Early evaluation of the project assessed 282 students divided into three separate cohorts: those who received Reading Recovery, those at participating schools who did not receive Reading Recovery, and students from non-participating schools who did not receive the intervention.

The Weekend Australian has seen a draft report of the 10-year follow-up, dated July 2018 and written by UCL Institute of Education professor Jane Hurry, that compares all three cohorts.

According to the report, the Reading Recovery students achieved significantly higher scores in the General Certificate of Secondary Education than students from schools that did not offer Reading Recovery.

However, the Reading Recovery students scored only marginally better than the comparison group of 49 students at their own schools, with the result not deemed “statistically significant”.

The published version of the evaluation, also written by Professor Hurry, made no mention of the third cohort. It concluded that “the positive effect of Reading Recovery on qualifications at age 16 is marked … and suggests a sustained intervention effect”.

Dr Buckingham disagreed with the claim, arguing that students at Reading Recovery schools, irrespective of whether they received the intervention, outperformed those at non-Reading Recovery schools, meaning that other factors, such as socio-demographic circumstances, were at play.

A KPMG spokesman distanced the foundation from the report and said ongoing involvement in the Reading Recovery project was “under review”.

Professor Hurry defended the preparation of two separate evaluation reports, saying the second report, which omitted the comparison students, was prepared to meet the guidelines for a journal article on the project.

She said KPMG made the ultimate decision to remove the comparison cohort from its published report.

“With hindsight we might have been better advised to stick to the original report, including the [comparison] group — all happened rather quickly at the end,” she said.

“My perception was that there was no intent to conceal or deceive.”

St John’s Catholic Primary School in Dapto, NSW, relies on Reading Recovery to help up to 20 per cent of students and has found it has substantially improved their reading and writing. Principal Andrew Heffernan said he was aware of a multitude of research but was guided by data collected from students at his school.

He said that many schools tried alternative interventions after the NSW Government ceased its $55 million-a-year preferential funding for the program in 2017.

“My understanding is that schools are now starting to return to Reading Recovery as some of those other interventions they’ve tried haven’t been as successful,” he said.

“I will say though that Reading Recovery doesn’t work for all children; in some cases other intervention is required.”

Evidence for Learning executive director Matthew Deeble said research comparing outcomes for similar students who did and didn’t receive an educational program was a welcome development.

“We should be doing more of it,” he said.

“But for the results to be trusted and the findings relied upon, transparency is essential.

“We need transparency in the design of the research — identifying the groups to be compared prior to trial and that will feature in the reporting. And we need transparency in the reporting to ensure free and open access to the findings and data, regardless of the outcomes.”

Evidence for Learning, which is backed by Social Ventures Australia, is pushing for the establishment of an independent evidence broker for schools that “rigorously and transparently commissions, runs and reports on trials of programs that matter to teachers and leaders and parents”.


Original URL: https://www.theaustralian.com.au/national-affairs/education/doubts-over-reading-recovery-program-data/news-story/96a8c52eff27852b3d7fdf65dd039609