New AI tool boosts student performance by nearly 50%

Research shows that timely, high-quality feedback is a vital factor for student growth, but in the hustle and bustle of Australia’s modern classrooms, many teachers lack the time to give it.

Recognising this, Australia’s leading edtech provider has found a way for teachers to give students real-time, personalised feedback – quickly, and in a way that is leading to meaningful improvements in their learning.

Education Perfect’s new AI-powered feedback tool was recently trialled in more than 100 schools with a total of 15,000 students.

The data, drawn from more than 200,000 student responses, proved promising.

Students recorded an average 47% improvement in final response quality, while 69% of those with low-scoring responses demonstrated deeper understanding by their final attempt.

A staggering 87% of students re-engaged with the AI tool to improve low-scoring responses, guided by the tool’s “learning loop”, which encourages students to refine their answers.

Teachers involved in the trial said they were impressed with the quality of the AI tool’s feedback, reporting that it is “accurate”, “insightful”, and “amplifies their own impact.”

Below, The Educator speaks to EP co-founder Shane Smith about how the AI tool is boosting learning, supporting teachers, tackling equity gaps, and addressing ethical concerns.

TE: The trial results of EP’s AI-powered feedback tool are certainly encouraging! How will this tool evolve to support more complex tasks, such as fostering critical thinking or creativity in students, beyond improving response quality?

“Response quality” means a lot more than just regurgitating a singular “perfect” response. A well-designed question can absolutely engage and reward students’ critical thinking skills, and AI is flexible enough to provide useful feedback and guidance for these kinds of questions. For tasks where teachers want to focus on student creativity, such as creative writing, AI feedback can still help to ground these in the fundamentals of the art form (e.g. spelling, grammar, sentence structure), and teachers can layer their own reflections and guidance on top of this strong baseline. We’ve heard from many teachers that in this context, they’d far rather be providing feedback about how their students are developing ideas, building suspense, using figurative language, etc. than underlining the 10,000th spelling or grammar mistake.

TE: What measures are in place to ensure that this tool not only complements teachers' roles but also avoids over-reliance on AI, especially in areas like emotional and social learning?

I think it’s important to view this tool in its broader context. It is meant to be one type of learning experience that teachers can deploy as part of a “balanced diet” of other activities and experiences, both digital and face-to-face. Teachers already have many years of experience successfully integrating EP into their classroom teaching and balancing it with other styles of learning activities. While AI expands the capabilities of the platform, this process of curation and balancing by teachers remains much the same. One of the most exciting possibilities of AI tools like ours is the potential to reduce teachers’ administrative workload, allowing them to spend more time with their students. Our upcoming teacher insight reporting will provide AI-generated insights into common challenges their students are facing, helping to spark constructive conversations or deliver personalised extension and revision material.

TE: In a climate of declining literacy and numeracy rates, how might this tool help address gaps in achievement between students from different socio-economic backgrounds?

One of the great promises of AI systems in education is the possibility to adapt to student learning needs. As part of our commitment to equitable access to this technology, starting in 2025, we will be progressively rolling out a selection of AI features to EP subscribers in government schools across Australia and New Zealand as part of their subscription package. This initiative will begin with schools in the lowest ICSEA brackets, allowing us to build and validate our capacity effectively. With EP currently used in 50% of high schools in Australia and 65% in New Zealand, we are excited about the positive impact this release will have on students from diverse backgrounds.

TE: Looking ahead, as the use of AI becomes more common in classrooms, how can school leaders balance the potential benefits of tools like this with ethical concerns about data privacy and equitable access?

The first step for school leaders is to develop a high-level understanding of what these concerns are and why they’re particularly topical with generative AI tools. The nightmare scenario here is that personal information entered into one of these systems is used by the company providing the AI service to train its AI models, and is then inadvertently regurgitated by the model when it is conversing with another user. Because these AI models thrive on large volumes of high-quality training data, there is a strong incentive for companies to push the limits and use whatever data they can get their hands on - not everyone is as scrupulous as we are. School leaders should demand both transparency and choice around where their data is being stored and processed, and what it is being used for. Protecting student data is EP's top priority, and our tool complies with the latest government guidelines for AI use, with ongoing updates to meet future standards.

Because it can be difficult for all school leaders to stay abreast of the latest risks, Safer Technologies 4 Schools (ST4S) in Australia is rolling out a specific AI certification for companies that screens for these potential harms. EP is one of the companies currently going through this certification, and in the coming months this badge will be a good indicator that a company is taking AI safety and user data privacy seriously.

Beyond student privacy, school leaders should also be starting to ask for proof of educational efficacy from the vendors of these AI tools - particularly for student-facing applications. These apps and plugins need to graduate from being novel experiences to become proven educational tools with validated effects in the classroom. Because many of these products are so new, there’s probably a grace period of around 12 months required for vendors to run the trials needed to demonstrate this efficacy. Certainly at EP we’re off to a running start, with fantastic initial results.