How schools can make technology safer for kids to use

For today’s younger generation of teachers, educational apps are about as commonplace and familiar as textbooks. However, for many teachers who began their careers before the advent of the Internet, the mass proliferation of digital technologies has been a surreal, confronting experience.

Fast forward to 2022, and there seem to be as many edtech apps as there are students. Fortunately, most educators blend technology into their work seamlessly, using it to manage their lessons and gain valuable insights into how their students are learning.

However, recent studies suggest the mass proliferation of edtech tools has some alarming implications for the digital privacy and safety of children.

Big tech in our schools: some worrying trends

A recent investigation, which analysed more than 160 educational apps and websites used in 49 countries, including Australia, during the COVID-19 pandemic, revealed that the privacy of four million Australian children may have been breached.

The investigation, by Human Rights Watch, indicated that 89% of the educational technology (EdTech) products used may have put children’s privacy at risk by tracking their online activity and sharing their data with advertisers.

Professor Ganna Pogrebna, executive director of the Cybersecurity and Data Science Institute at Charles Sturt University, is a pioneer in behavioural data science. Her extensive experience includes leading the Behavioural Data Science strand at The Alan Turing Institute, the UK’s national centre for artificial intelligence (AI) and data science in London, where she is also a Fellow working on hybrid modelling approaches that bridge behavioural science and data science.

“While we are still waiting for Human Rights Watch to release the technical details of their study, Adobe Connect, Minecraft Education Edition and Education Perfect were named among those involved in the ongoing EdTech controversy,” Professor Pogrebna told The Educator.

“Each of these tools has a different use and contains different potential threats, which in some cases may be alleviated to a considerable degree. For example, it is clear that in the modern world coding and programming skills are important, and Minecraft is often used in classrooms to help students gain these valuable skills. Yet the question is: how is this tool used?”

Professor Pogrebna says Minecraft, for example, can be used offline, in which case no information about the student will be passed on to any third parties.

“It would appear that the balance has not tipped too far in Big Tech’s favour in the classroom yet. Schools in Australia actively encourage diverse classroom tools instead of adopting one computerised platform,” she said.

“At the same time, schools need to deal with the fact that computer-related skills are necessary in the modern Industrial Revolution 4.0 world.”

Professor Pogrebna said it must be acknowledged that schools are currently preparing people who will be retiring in 60 years’ time, and that collaboration between humans and algorithms will be necessary in the future.

“So, every school’s goal is not to prohibit or limit technology altogether, but to implement effective risk assessment mechanisms, which would adequately measure potential dangers of using different educational technology in classrooms,” she said.

“In a nutshell, the task of each school is to turn the EdTech Frankenstein into the EdTech Einstein, which would help our children get the necessary skills without giving up their human and digital rights.”

‘We must unlearn the habit of blindly trusting technology’

Professor Pogrebna says regulation should not concentrate only on privacy but should cover all child data, not just the data collected through EdTech.

“There is much talk about regulation and governance in the technological domain, yet we should not rely on regulation alone,” she said.

“Each of us must train ourselves and our children to reflect on the various inputs that digital technology and algorithms offer into our decision-making process. Only through regaining this ability to stop and reflect will we ever be able to regain our independence as human decision-makers.”

Professor Pogrebna says that while there is a growing trend towards people learning more about how their personal data is used, they often fail to understand how much power algorithms have and how algorithms are influencing their decisions.

“For example, few people realise that social media platforms are shaped by algorithms, and that the content we and our children consume every day on those platforms is selected and promoted based on algorithmic logic,” she said.

“It is a matter of experience. As people get used to algorithms and gain more experience in dealing with algorithms, they will come to realise that what machines are offering to them is not necessarily the best outcome for them.”

Professor Pogrebna pointed out that machines need data to provide recommendations, and that algorithms are only as good as the data used to train them.

“If people understand that, they will be able to make more informed decisions about whether algorithmic advice is good for them. Our task is to educate ourselves and our children about the value of our personal data and the contexts in which our decisions may be affected by algorithms,” she said.

“As long as we are educated, we can influence our habits of trusting technology and prevent serious issues, making sure that our rights and freedoms are preserved.”

How schools and parents can make a difference

Professor Pogrebna says there are some important opportunities for communities to bring about meaningful change when it comes to children’s attitudes to technology.

“I think that we should not leave it to Big Tech companies to decide our fate. We will see growing competition in the EdTech market. Yet, users of technology will also gain experience in interacting with technology,” she said.

“We should teach our children better digital hygiene – specifically, making sure that they do not give away their valuable data to unknown applications. According to my research, the overwhelming majority of people in Australia, as well as around the globe, download applications without reading the Terms and Conditions.”

Professor Pogrebna said that while this may be difficult to change at the individual level, greater education about the importance of digital hygiene can be a powerful preventative measure.

“If from a very young age we explain to children what their personal data are, how to protect their data, and, most importantly, why their data should be protected, in a few years’ time we will be in a much better position as a society,” she said.

“This could be achieved not only through regulation, but also via better educational programmes at school that draw attention to these issues, as well as via parent-child conversations about personal digital hygiene.”

The Human Rights Watch report cited in this article was published on May 25 and titled, “‘How Dare They Peep into My Private Life?’: Children’s Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic”.