
The problems with the latest effort to protect children online

The Information Commissioner’s Office (ICO) has introduced an age-appropriate design code to protect children online. What’s not to like about that?

Nothing, you might think. The aims of the code are admirable. But legislation has a habit of leading to unintended consequences – and there are reasons to be concerned this could be the case here.

Where are we now?

Online services currently collect data from children in all sorts of ways, via connected toys, their video choices and the platforms where they play with their friends. The companies behind these services know everything about their young users, from what time they get up in the morning to their moods.

The ICO’s new design code aims to make it much harder for companies to use children as data machines. All organisations that provide services to children – or whose services may be used by children – have a year to comply with its 15 provisions.

These provisions are based on principles enshrined in the UN Convention on the Rights of the Child. According to the ICO, the code aims to:

  • Protect and support children’s health and wellbeing.
  • Protect and support children’s right to free association and play. 
  • Recognise the role of parents.
  • Recognise the evolving capacity of the child to form their own view.

So far, so good. The problems arise when it comes to implementation. 

What will it mean?

When children are online, the code demands that privacy settings be high by default and geolocation services switched off. Services shouldn’t use nudge techniques to prolong engagement – for example by suggesting children will be penalised if they leave.

Platforms and services aimed exclusively at children should find these, and the rest of the conditions, relatively easy to meet. They can collect only as much information as needed to provide their service, for example; they can make their terms and conditions understandable for children.

The difficulty comes with services that are designed for adults but are also used by children. (It is worth noting that the ICO defines children as under 18, so the usual get-out clause of treating over-13s as adults online doesn’t apply here.)

These services could apply the code to all of their users (which is presumably what the ICO and its backers would really like). But the business model of the internet is advertising – and targeted, personalised advertising at that. We live, as Shoshana Zuboff points out, in the age of surveillance capitalism.

For many platforms and services, implementing the code across the board would be financially disastrous. 

So what’s the alternative? The ICO suggests companies might ‘put additional measures in place to increase the level of age confidence’ – in other words, find out who your child-users are, then apply different standards to them. This could be done, it says, by using AI to track how users interact with the service, or by adopting third-party age verification services.

‘We don’t want to see an age-gated internet,’ the ICO claims. ‘What we want to see is a fundamental shift, where the internet and online services take a child-centred approach.’ 

How realistic is this?

Not very. There is a real danger that the code leads directly to an age-gated internet.

Why does that matter? If children (including older teens, let’s not forget) feel they’re getting a substandard service, they will resort to workarounds. They may use Virtual Private Networks (VPNs), which hide their identity and location – putting them beyond the very protections the code is meant to provide, and at greater risk. It happens all the time where censorship is rife: in Saudi Arabia, nearly one-third of internet users have used a VPN in the past month. Our work on skin gambling showed how easy it is for young people to circumvent laws designed to protect them.

They may move to the dark web. Not everything beyond the open web is dangerous: the deep web, for instance, is simply where material such as your medical records sits behind logins. But the dark web itself hosts a great deal of illegal activity: drugs, weapons, sex tourism and self-harm chatrooms, for example. It’s not a place you want your child to spend a lot of time. On the dark web, everything happens out of public view and is very hard to trace.

The ICO’s age-appropriate code has very little to say about content, other than standards of behaviour. But if you age-gate the internet, you could easily end up with a bowdlerised version of it for young people (some of whom are nevertheless old enough to vote, fight for their country and marry).

Do the tools even work?

The code supposes that services could continue to collect data on adults while not collecting that of under-18s. This requires age-verification tools to work. They haven’t in the past. A long-awaited attempt to age-gate pornography in the UK was dropped after technical difficulties (it took two minutes to get round the barriers) as well as objections by privacy campaigners. 

Even assuming the tools worked, not all the provisions of the code can be managed as neatly as data collection. Enforcing behaviour suitable for children on a site mainly used by adults may be impossible (think of Twitter, for example). The easier solution is to shut children out of a service they were previously using.

So the devil is in the detail. It would be great if the age-appropriate design code worked. The principle is fine: we make separate arrangements for children all the time. In theory, the code could lead to a less invasive, less manipulative internet for everyone. 

Unfortunately, this seems unlikely. Much more probable is an explosion in age-gating tools and the shutting out of children from much of the online world – quite the opposite of the ‘evolving capacity of the child to form their own view’ that the ICO says it wants to recognise.

The code’s intentions are admirable. Whether it will work in practice is debatable. An awful lot remains to be resolved. Until it’s clear that the code won’t make matters worse, we won't be putting out the bunting.
