March 4, 2024

How can we make the Internet safe for children in practice?

Does regulating the Internet to protect children mean checking the age of all users? Sonia Livingstone discusses the importance of taking a child rights-centred approach and highlights the political challenge of identifying who is a child online when seeking to protect children.

One in three children uses the Internet, and one in three Internet users is a child. Yet technology companies say they cannot determine who is a child online. This is a practical challenge but also a political one: does society want to task companies with assessing the age of all their users? Or is that too great a risk to the privacy and freedom of expression of adults and children alike?

Why children need protection online

Evidence of the problems abounds, with a plethora of studies fuelling media reports and being cited in government inquiries and third-sector advocacy. While some sources are contested and their implications often exaggerated, the case for a robust, evidence-based approach to protecting children online is widely accepted; hence the many new laws and policies being introduced in the UK, the US and internationally.

The EU Kids Online research network carried out a pan-European survey of children’s online access, skills, opportunities, risks and safety mediation. It found, for example, that only a quarter of children aged 9 to 16 always feel safe online, and 10 percent never feel safe. Depending on the country, up to half of children said something had bothered them online in the past year, twice as many as in the previous survey. What bothers them? There is evidence that children face the 4 Cs of online risk: content, contact, conduct and contract. The most common risk is hate; the greatest increase is in exposure to self-harm content. Crucially, the risks of online engagement are unequally distributed: the European ySKILLS project is finding that the most vulnerable adolescents face the greatest risk of harm online, including those who are discriminated against or in poorer health.

A child rights-based approach

The authoritative statement here is the United Nations Committee on the Rights of the Child’s General Comment 25, which sets out how the United Nations Convention on the Rights of the Child applies in relation to the digital environment. A child rights framework requires a holistic approach that balances rights to protection, provision and participation, and attends to children’s age and maturity (their “evolving capacity”) and best interests (a complex judgment that puts children’s rights above profit, and which requires consulting children and making decisions transparently).

While General Comment 25 is directed at states, the technology sector also has clear responsibilities for protecting children online. That is why, alongside new legislation, “by design” approaches, including safety by design, are increasingly required of companies whose digital products and services affect children in one way or another. And there is a lot they could do: research from EU Kids Online shows, for example, that children don’t trust platforms or don’t know how to get help: after a negative experience, only 14 percent changed their privacy settings and only 12 percent reported the problem online. Meanwhile, fewer than a third of parents use parental controls, because they don’t know how they work or whether they are effective, and because they fear adverse effects on children’s privacy, autonomy and online opportunities.

But society cannot hope to protect children by restricting their rights and civil liberties, nor through policies that, even inadvertently, incentivize companies to exclude children from beneficial digital services. Getting the policy framework right means adopting a child rights-based approach, as the UK and Europe have long committed to doing but have not sufficiently enacted. While we wait, children, once the fearless explorers of the digital age, are becoming overly cautious, worrying about risks and, as the evidence shows, missing out on many online opportunities as a result.


Video: Sonia Livingstone analyses children’s rights in a digital world, available on the LSE Player.

Identifying who is a child

But if companies don’t know which users are children, how can they be tasked with age-appropriate provision and protection? In the euCONSENT project, funded by the European Commission, my role was to explore the implications of age assurance and verification for children’s rights. In the UK, the Information Commissioner’s Office is actively exploring these issues.

One option is to expect tech companies to design their services as widely accessible, child-friendly and broadly civil spaces (like a public park), with exceptions that require users to prove they are adults (as when buying alcohol in a shop). Another is to expect tech companies to treat all users in an age-appropriate way (that is, to establish everyone’s age and tailor provision accordingly, although there is much to debate about what age-appropriate means and how to implement it, given that children vary greatly not only by age but in many other ways). While both approaches present challenges, policy innovation is vital if we are to move beyond the status quo of treating children online as if they were adults.

To realize children’s rights in the digital age, should society require companies to redesign their services as necessary, or to redesign the process of accessing those services? Under both the UK’s Online Safety Bill and Europe’s Digital Services Act, success will depend on carrying out effective and accountable risk assessments.

In the euCONSENT project, we argue that a child rights-based approach to age assurance must uphold not only children’s right to be protected from digital content and services that could harm them, but also their rights to privacy and freedom of expression (including exploring their identity or seeking confidential help without parental consent), their right to a prompt and effective child-friendly remedy, and their right to non-discrimination. This means they should be able to access digital services alongside everyone else, even if they lack government identification, live in alternative care or have a disability, and whatever the colour of their skin. Currently, many age assurance systems do not respect all the rights of the child.

Let’s be practical

Despite significant arguments to date about the potential costs to privacy, expression and inclusion associated with age assurance technologies, in practice users are already being age-checked. Google says it has estimated the age of every user who has logged into its services, based on the wealth of data it collects over time, including what their friends look like, the sites they visit and everything else it knows about them. But Google’s approach here is not very transparent. Meanwhile, Instagram is one of a growing number of platforms adopting age estimation technology for all its users, as is Roblox.

At the Digital Futures Commission, with the 5Rights Foundation, we have proposed a Child Rights by Design model. It provides a toolkit for designers and developers of digital products and was developed jointly with them and with children. It is based on the Convention on the Rights of the Child and General Comment 25, and centres on 11 principles, of which age-appropriate service is one, privacy another and, of course, safety a third. The other eight are equally important to a holistic approach: equity and diversity; best interests; consultation with children; corporate responsibility; child participation; wellbeing; full development; and agency in a commercial world.

If big tech embedded Child Rights by Design, the task of policymakers, educators and parents would be much easier.


This post is based on a talk given by the author at the European Commission’s stakeholder event on the Digital Services Act and at the public hearing of the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) on the online safety of minors.

All articles published on this blog give the opinions of the authors, and not the position of LSE British Politics and Policy, nor the London School of Economics and Political Science.

Image credit: Shutterstock and Michael Jeffery via Unsplash
