Virtual fight for kids’ safety: Standards Australia releases landmark draft standard to protect children in the Metaverse

December 3, 2024

Statements

Guidelines to engage industry and government and to support families, as new research shows low awareness of the Metaverse and how children are engaging with it.

Standards Australia has released a proposed standard aiming to address the safety of children in the Metaverse, an innovative yet complex virtual environment that’s quickly emerging as the next online frontier but can be fraught with risks to children, including online bullying, grooming, and identity theft.

The draft standard, DR AS 5402:2024 Children's safety in the Metaverse, aims to provide a practical framework for advancing children’s safety in the Metaverse, with a focus on privacy and accessibility.

The launch comes at a critical time, as new research by Standards Australia of 1,000 parents and 300 teachers shows alarming gaps in public awareness: while two-thirds of Metaverse users are under the age of sixteen, 83% of parents and 70% of teachers either don’t know or are unsure what the Metaverse actually is; 30% of parents admit they have no idea who their kids interact with in virtual worlds; and only 44% of parents are aware of the risks associated with Metaverse use for children.

Despite promising benefits in education and entertainment, with 75% of Australian teachers saying their pupils use the Metaverse for educational purposes, the Metaverse still presents significant risks, and parents are calling for change.

Research from Standards Australia echoes this desire for action, with 94% of Australian parents wanting standards and guidelines in place to make the Metaverse safer for children.

To drive change, the draft standard’s release provides guidance to business and industry to make safety the top priority and build appropriate layers of protection and reporting. It also highlights the need for collaboration between Metaverse platform developers, policymakers, and families to best create safer online environments for children.

The draft standard is open for public comment until 24 January 2025, and Standards Australia is encouraging those interested to provide feedback so the standard can be published in 2025.

Kareen Riley-Takos, Chief Operations Officer at Standards Australia, said, “Whilst a new concept to many, the Metaverse is already an established entity, with many positives and the incredible potential to shape how we engage with each other.”

“Yet the Metaverse is not just a playground that offers educational and entertainment potential; it is also sadly fraught with real-world risks. From bullying to identity theft, we need to ensure children’s safety is embedded into these digital platforms from the ground up,” said Riley-Takos.

“What’s crucial now is ensuring our future virtual spaces aim to protect and support the most vulnerable, our kids, through thoughtful design, guardrails, education, and consistent safety standards.”

Kirra Pendergast, a leading international expert in online safety and Founder and Director of Safe on Social Group, has seen the potential dangers of the Metaverse first-hand and actively campaigns for platform reform, as well as educating children, parents, teachers and businesses.

“Standards Australia’s draft standard for protecting children in the Metaverse couldn’t be timelier. As the government has legislated to ban certain social media platforms for children under 16, it’s crucial to recognise that many of these same children are spending their time on Metaverse platforms,” said Pendergast.

“At Safe on Social, we’ve seen firsthand how platforms can expose children to online predators, financial scams, and content that is far from age-appropriate, under the often misleading ‘All Ages’ label.”

“With Standards Australia stepping in to set a benchmark for child safety in these environments, the hope is that developers will prioritise safety by design, not as an afterthought. As parents, educators, and policymakers, we need to work together to ensure the metaverse is safer for young people, especially as they increasingly move from traditional social media into the immersive world of the Metaverse,” said Pendergast.

Jocelyn Brewer, a leading voice in cyberpsychology and digital wellbeing parent education, says there are ways to use technologies like the Metaverse safely, but that they must be safe by design and that educating children on safety is paramount.

“Children naturally learn through social interaction and play, and digital spaces like the Metaverse can be valuable environments for this kind of development. While we can and should teach children about online risks - just as we teach them to be street-smart in the physical world - there's only so much parents and educators can do if the platforms themselves aren't built with safety as a fundamental priority,” said Brewer.

“Teaching digital safety is like teaching water safety. We supervise kids in the pool, guide them through different conditions, and never let them swim in unknown waters alone. But just as pools need proper fencing and depth markers, online spaces need built-in safety features. Right now, it's like we're asking children to swim in pools without basic safety measures in place.”

Standards Australia is inviting public comment on the newly released draft, which is open for input via the Standards Australia website.

The organisation understands that the complex terrain of virtual worlds can be daunting for families and educators alike. Alongside the draft standard, Standards Australia has also provided guidance for parents and caregivers, in Appendix A of the standard, on how to effectively safeguard children online.

Contact
Communications Department
communications@standards.org.au