The rights of the child, XR technology and schools

In March 2021, as the Covid-19 pandemic raged and school students in many countries were adapting to online learning, the United Nations (UN) released “General comment No. 25 on children’s rights in relation to the digital environment”. Drawing on an extensive international consultation process with children and a raft of expert submissions, General comment 25 provides guidance on how children’s rights should be fostered and protected in digital environments. This post outlines some key areas in General comment 25 in order to pose some thoughts on how they relate to the use of XR (eXtended Reality, including augmented and virtual reality) technology in schools.

Before outlining these key areas, it is worth historically situating General comment 25. It is part of a children’s rights lineage that runs from the UN’s adoption of the Declaration of the Rights of the Child (1959) to the Convention on the Rights of the Child (1989), which recognised the social, economic, cultural and civil roles of children and set minimum standards for protecting their rights. Below is a poster version which provides a snapshot of the principles that underpin the Convention on the Rights of the Child. Nation state signatories to the Convention can be found here.

[Poster image: UN Rights of the Child, teen edition]

To return to General comment 25, the document begins by using the four principles from the Convention to provide guidance on children’s digital rights. The principles and some of my thoughts on their implications for XR in schools are outlined below:

  1. NON-DISCRIMINATION – “The right to non-discrimination requires States parties ensure that all children have equal and effective access to the digital environment in ways that are meaningful to them. States parties should take all measures necessary to overcome digital exclusion.” (p. 2).

Implications: All schools, not just wealthy ones, should be able to provide their students with continuous, equitable and meaningful access to XR learning technologies, including the infrastructure (connectivity, bandwidth etc.) that powers the tech. Teachers should be provided with independent, evidence-based professional learning opportunities and ongoing pedagogical support to help them integrate XR in ways that are most effective for learning across subjects and in integrated units of work. Digital divides are born of policy (and funding) failures, nowhere more so than in the field of school education.

  2. BEST INTERESTS OF THE CHILD – “States parties should ensure that, in all actions regarding the provision, regulation, design, management and use of the digital environment, the best interests of every child is a primary consideration” (pp. 2-3).

Implications: Most countries are at an early stage of regulation governing XR technology and the development of ethical standards informing its design is also nascent. In the meantime, there are some existing frameworks such as safety by design, privacy by design and guidelines on automated decision making that schools should utilise to guide procurement and implementation. I realise this feels like yet another thing to learn and do beyond the core business of schooling; however, until there is strong regulation and industry-wide accepted ethical standards in place, it is perhaps the only way most teachers in most countries will be able to uphold the digital rights of the child.

  3. RIGHT TO LIFE, SURVIVAL AND DEVELOPMENT – “Opportunities provided by the digital environment play an increasingly crucial role in children’s development… States parties should identify and address the emerging risks that children face in diverse contexts, including by listening to their views on the nature of the particular risks that they face…. States parties should pay specific attention to the effects of technology in the earliest years of life, when brain plasticity is maximal and the social environment…. Training and advice on the appropriate use of digital devices should be given to parents, caregivers, educators and other relevant actors, taking into account the research on the effects of digital technologies on children’s development … ” (p. 3).

Implications: Teachers use their knowledge of child development every day in the classroom. This knowledge needs to be extended to include the potential effects of XR technologies on children and adolescents. There is no other technology like XR – it can make the user’s brain and body feel as though they are in a totally different place, imaginary or actual, with real and computer-generated actors interacting in real time, for better and for worse. There is evidence that children have developed false memories after a VR experience. There are also child protection issues related to the use of VR equipment in classrooms and open social VR platforms.

The current evidence base on the immediate and longer-term effects of immersive technology on children is inadequate: very few studies have been conducted, and more work is required to ensure that research with children using XR technology is ethical. Most manufacturers of VR headsets provide health and safety information and suggested age limits; however, like Terms of Service and company privacy policies, these are often skimmed or not read at all. There is a great deal of work to be done by both government and industry in developing plain-English and child-friendly policy on technology risks, including but not limited to privacy issues. In education policy and in industry, accountability mechanisms for querying or contesting data extraction and use, and third-party data interests, or for seeking redress if something goes wrong, are either opaque or non-existent. There is significant work to do if children and their parents/caregivers are to be given a voice and ways to effectively exercise their rights in the digital learning space generally and with XR specifically.

  4. RESPECT FOR THE VIEWS OF THE CHILD – “When developing legislation, policies, programmes, services and training on children’s rights in relation to the digital environment, States parties should involve all children, listen to their needs and give due weight to their views. They should ensure that digital service providers actively engage with children, applying appropriate safeguards, and give their views due consideration when developing products and services.” (pp. 3-4).

What are the views of children on the digital environment, including XR technologies for leisure and learning? How do schooling systems and teachers amplify these voices to develop good, transparent policy and to inform classroom practice? How can schools engage in critical conversations with technology companies, ask the right ethical and educational questions about EdTech, seek evidence of effectiveness for learning, and advocate on behalf of children, especially when so much of schooling has become platform dominated (often one-platform dominated)? Why is there a dearth of independent professional learning on digital technologies available to teachers? It is fair to say that these are generally unanswered yet vital questions that deserve more than lip service from state education authorities and those in charge of schooling systems. The proliferation of digital literacy curricula is a good place to start classroom conversations. In case you are interested, here is a child-friendly version of General comment 25 that can be used in class.

It is worth ending this whirlwind tour through some sections of General Comment 25 by highlighting section 42 of the document, which specifically relates to XR technologies:

“States parties should prohibit by law the profiling or targeting of children of any age for commercial purposes on the basis of a digital record of their actual or inferred characteristics, including group or collective data, targeting by association or affinity profiling. Practices that rely on neuromarketing, emotional analytics, immersive advertising and advertising in virtual and augmented reality environments to promote products, applications and services should also be prohibited from engagement directly or indirectly with children.” (pp. 7-8).

There is a lot to unpack in this paragraph. Here are some key points to consider. The intersection of XR and artificial intelligence (AI) has hastened the harvesting of highly identifiable data from people’s bodies, known as biometric data. It is harvested using the tracking and sensors built into XR hardware and software products and represents a significant privacy risk to users of the technology, including children. Data can be, and is being, collected through the tracking of limb, head and finger movements; gaze patterns and pupil dilation as proxy measures for attention; facial expressions; speech and written communication; geolocation sensors; and information about the surrounding environment captured via pass-through camera technology in headsets. As boring as it seems, it is well worth reviewing the privacy policies of XR software and hardware companies. For example, check out Meta’s supplementary privacy policy, which also has a separate eye tracking policy embedded in it, to get a sense of the degree of biometric data harvesting and the potential sharing of this data with third parties.

The thing about biometric data is that it is so personal it can be used to identify individuals and settings. While the privacy implications of this for adults are serious, the implications for children and schools are even more concerning. In many countries and jurisdictions there is weak regulation around biometric data collection, storage, use and commercial currency for third-party transactions (selling on bodily information), despite its sensitivity. In addition, the use of that data, linked to other information collected via multiple platforms and online interactions, for surveilling, unfairly profiling, and manipulating or ‘nudging’ people’s emotional states and behaviour, covertly and overtly, raises serious ethical issues, especially for vulnerable populations such as children. Hence, General Comment 25 specifically identifies virtual and augmented reality technology as representing a special class of risk to children. If you want to learn more about the ethics and implications of AI-powered biometric and affective computing applications for schools, check out the ethical framework for education contained in this report.

Now is the time for teachers, educational policy makers, researchers and industry to have serious conversations WITH children and their parents and caregivers about the digital rights of the child broadly, and especially in relation to the unique challenges that emerging technologies such as XR and AI bring. But conversations will not be enough. Consultation and engagement need to be accompanied by practical educational, accountability and regulatory initiatives if the digital rights of the child are to be endorsed and celebrated in schools.

This post brought to you by A/Prof Erica Southgate.

Cover image by https://oscaw.com/art-camp-week-2-lets-make-eyes 
