AI and student wellbeing: New opportunities for assessment and learning

It is often noted that AI will revolutionise our everyday lives. Behind the scenes, AI is already shaping the information we access, supporting medical decisions, providing restaurant recommendations, and filtering email. These AI-enabled activities occur in both our private and professional lives. Within education, however, the integration of AI has substantially lagged behind other sectors. Only a few examples of practical AI adoption exist, and even fewer are evidence-based and scalable across diverse educational contexts (e.g., from K-12 to lifelong learning).

Recent developments in large language models, and particularly the emergence of ChatGPT, have forced a change in the narrative around the use of AI in schools. Educators, school leaders, parents and other stakeholders are now confronted with the realities of AI in practice and its impact on writing, coding, teaching, and learning. The rapid adoption of ChatGPT has been fuelled by its easy accessibility, enabling students, teachers and parents alike to begin to experiment with AI and learn how such tools can be applied in teaching and learning practice.

Generative AI will substantially impact the practice of assessment. The ability of ChatGPT to create content, from poems to essays, emphasises the importance of developing critical thinking and problem-solving skills. In short, this shifts the assessment model and narrative from the evaluation of an outcome (the product, e.g. an essay) towards an understanding of the learning and thinking processes a student undertakes in the assessment task. Although this has been a focal point of learning analytics research for the last decade (Dawson et al., 2019; Joksimovic et al., 2019), there have been few easily accessible tools for teachers and students to enable this transition.

At present, much of the discussion around AI in education has centred on how we work with a “thinking partner” for humans (Siemens et al., 2022). Yet generative AI (and ChatGPT) also has the potential to better support student wellbeing at scale. For example, the language model can hold discussions with individuals, allowing them to explore a wide range of mental health and wellbeing questions in a safe and private context. While the conciseness and accuracy of its answers are still being assessed, a conversational agent may be more helpful for students than traditional web-based search in finding information or resources on these topics. Additionally, ChatGPT can generate tools and resources that students can use to improve their mental health, such as a meditation script or even a workout plan. While the technology has limitations, such as its inability to reference specific sources, ChatGPT has the potential to be a valuable resource for students seeking information and support for mental health and wellbeing.

With any new technology, there are always growing pains. There are risks and ethical concerns that need to be addressed around data privacy and the potential for these systems to be used for nefarious purposes. When wellbeing and learning are the focus of ChatGPT use, the stakes are even higher and an active research agenda needs to be pursued. The potential impact of these systems on human communication and interaction, and whether they could lead to further isolation and disconnection, needs to be empirically understood.

The measure of any technology of significance is its longer-term positive contribution to the human condition. Rarely is the final assessment a singular narrative of “good” or “bad”. Social media, for example, has contributed tremendously to connecting individuals and communities globally, heightening awareness of notable events occurring around the world while helping people stay connected to family and friends. Yet the increased political polarisation and mental health impacts of extensive social media use are only now beginning to emerge. Generative AI presents a similar challenge: tremendous positive opportunities alongside some worrying challenges. Identifying these challenges early through active research programs may help to mitigate AI’s worst effects.

This article was provided exclusively to The Educator by:

  • Dr Srecko Joksimovic, Senior Lecturer in Data Science at UniSA’s Education Futures, Centre for Change and Complexity in Learning
  • Professor George Siemens, director of the Centre for Change and Complexity in Learning at UniSA’s Education Futures
  • Professor Shane Dawson, Executive Dean of UniSA’s Education Futures