There are many wonderful learning and educational opportunities available online, and in an ideal world we would create an online environment where children can thrive.


However, much of the content available online was designed with adults in mind, and children can easily access content that isn’t age-appropriate. As such, there is a growing consensus amongst the public and governments worldwide that more needs to be done to ensure age-appropriate access to goods and services.


It is estimated that one in three internet users is under 18. As such, vast numbers of children and young people are being exposed to online environments designed for adults. A significant part of the issue is that many children know they can lie about their age online to gain access to platforms and content designed for older users. Conversely, it is easy for adults with malicious intent to target platforms designed for younger users.

There is no one silver bullet when it comes to child safety, but a range of options exist to create age-appropriate experiences and protect children online


To tackle some of these issues, a raft of legislation is being introduced across the globe to improve online safety for children, such as the Online Safety Bill, the Digital Services Act and the California Age-Appropriate Design Code Act. Some of the largest players in the space, such as SuperAwesome, are already mobilising and exploring different age assurance options to obtain parental consent and protect children and young people.


But many more organisations have yet to fully understand the scale of the issue and start exploring how they’re going to tackle the challenges of the ‘four Cs’:

  • content
  • contact
  • contract
  • conduct between underage and overage users


There is no one silver bullet when it comes to child safety, but a range of options exist to create age-appropriate experiences and protect children online, so there’s really no excuse for platforms not to have started making inroads.


Regulations to have on your radar


The Age Appropriate Design Code


Introduced by the UK Information Commissioner’s Office (ICO) in 2021, the Age Appropriate Design Code is a statutory code of practice under the Data Protection Act 2018, which requires online services to “put the best interests of the child first.” Whilst not itself law, the code gives the ICO powers to fine non-compliant businesses up to 4% of their global annual turnover, or to suspend their operations in the UK.


Much of the Code focuses on how children’s data is processed. It recommends high privacy settings by default and minimising the data collected. It goes further than other data protection laws, such as GDPR and COPPA, in that it also considers how products and features can be designed in ways that cause harm to children.


For instance, private chat functionality can result in children unwittingly talking to adults with malicious intentions, who may groom or exploit them. Livestreaming can also expose children to explicit behaviour that is not age-appropriate.


California Age-Appropriate Design Code Act


Across the pond, the California Age-Appropriate Design Code Act (the “Act”) is modelled on the UK’s Age-Appropriate Design Code. As reported by GamesIndustry.biz, it was signed into law on 15 September 2022 and takes effect on 1 July 2024. Jurisdictions around the world are also looking at child safety, including Australia, Singapore, India and Canada.

If you can’t tell who’s a child, you can’t protect them or act in their best interests


The Act places legal obligations on companies offering online services likely to be accessed by children under 18: they must undertake a Data Protection Impact Assessment (DPIA) for any online service, product or feature likely to be accessed by a child. DPIAs must address questions such as:

  • could the design of a feature harm children?
  • could incentive or engagement features harm children?
  • could the feature expose children to exploitation by harmful contacts?


Much like the UK’s Code, there are serious penalties for non-compliance. The California Attorney General will have the power to seek an injunction or civil penalty against any business that violates the provisions, with fines of between $2,500 and $7,500 for each affected child. With 93% of children in the UK and 70% of children in the US playing video games, game developers are highly impacted by the different Children’s Codes.


Online Safety Bill


One of the most impactful pieces of legislation coming down the tracks is the UK’s Online Safety Bill. It will impose extensive duties on regulated companies to protect their users, especially children, from content that is illegal, harmful to children, or ‘legal but harmful’ to adults. The Bill will usher in a new era of online safety, regulation and accountability, and the potential fines for non-compliance will be steep: up to £18 million or 10% of annual global revenue, whichever is greater.


Whilst the Bill is still making its way through Parliament (currently in its final stages), game developers should be thinking about whether their game or platform falls within scope. If your platform enables users to generate and share content, contains chat functionality between players, and has players in the UK, you will be affected and you’ll need to take action.


The Digital Services Act


In Europe, age assurance and age appropriateness are also referred to in a number of pieces of legislation, the Digital Services Act among them.


The Digital Services Act specifically contains an addendum which includes a strategy for a better internet for kids and a code of conduct. The strategy outlines plans and envisaged actions to make digital services more age-appropriate and aligned with children’s best interests. It’s well worth a read.


What does all of this mean for gaming companies?


In short, you need to get your house in order. You need to know the age or age range of the people accessing your platform, website or service. If you can’t tell who’s a child, you can’t protect them or act in their best interests. Once you know the age of users, you can adapt the content and work to deliver age-appropriate experiences.


With around 30% of gamers globally aged two to 18, game developers are highly impacted by the different pieces of legislation, and need to review how current games are accessed – as well as bear in mind the regulations for new developments.


This could involve the following measures (a short code sketch of these defaults follows the list):

  • having age-appropriate prompts to encourage players to take breaks
  • having the highest privacy settings as default for children
  • giving players age-appropriate explanations
  • having behavioural profiling turned off by default
  • offering age-appropriate terms and conditions
  • turning voice chat off by default
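
As a rough illustration, those defaults might be encoded along the following lines. This is a minimal sketch assuming hypothetical `AgeBand` and `PlayerSettings` types, not any particular engine’s or platform’s API:

```typescript
// Hypothetical age bands a platform might work with after an age check.
type AgeBand = "under13" | "13to17" | "adult";

// Hypothetical settings covering the measures listed above.
interface PlayerSettings {
  privacyLevel: "highest" | "standard"; // profile visibility, data sharing
  behaviouralProfiling: boolean;        // personalised ads and recommendations
  voiceChat: boolean;                   // open voice chat with other players
  breakPrompts: boolean;                // periodic 'take a break' reminders
  termsVariant: "child-friendly" | "standard";
}

// Safe by default: children start on the most protective settings,
// which can only be relaxed where appropriate (e.g. with parental consent).
function defaultSettingsFor(band: AgeBand): PlayerSettings {
  const isChild = band !== "adult";
  return {
    privacyLevel: isChild ? "highest" : "standard",
    behaviouralProfiling: !isChild, // off by default for children
    voiceChat: !isChild,            // off by default for children
    breakPrompts: isChild,
    termsVariant: isChild ? "child-friendly" : "standard",
  };
}
```

The design choice worth noting is that the protective value is the default for anyone not positively identified as an adult.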


On that note, particular attention must be paid to voice chat in games, as it exposes children to significant risks – something the Washington Post recently brought to the world’s attention with its exposé on how Roblox’s voice chat allowed slurs and sex sounds to slip through, exposing children to inappropriate behaviour.


In the US, the FBI is so concerned about sexual predators targeting children through online gaming platforms that it has launched a campaign called ‘It’s Not A Game’, which encourages parents to talk to their children about what they are doing online and who they’re talking to.


So how can you carry out age checks?


Many people believe that the only way to establish the age of a user is to ask them for an ID document. However, over one billion people around the world do not own one. It is also impractical to ask young people to use an ID document: many are not old enough to hold a driving licence, and some do not own a passport.


Thankfully there are now a number of age assurance techniques that establish the likelihood that someone falls into a certain age or age range, without needing to verify their full identity.



One such method is facial age estimation, which accurately estimates age from a selfie. It’s powered by an algorithm that has learnt to estimate age in the same way humans do – by looking at faces. It detects a live human face, analyses the pixels in the image and gives an age estimate. Crucially, it does not recognise anyone: there is no unique recognition or authentication, and as soon as someone’s age is estimated their image is deleted, protecting privacy at all times.
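
As a sketch of how that flow can stay privacy-preserving in practice, the client might send the selfie, keep only the derived age band and discard the image. The endpoint URL, field names and response shape below are hypothetical placeholders, not Yoti’s actual API:

```typescript
// Hypothetical response shape from an age-estimation service.
interface AgeEstimateResponse {
  estimatedAge: number;
}

// Send a selfie, retain only the resulting age band, never store the image.
async function estimateAgeBand(selfie: Blob): Promise<"under13" | "13to17" | "adult"> {
  const body = new FormData();
  body.append("image", selfie);

  // Placeholder URL: a real integration would call the provider's documented endpoint.
  const res = await fetch("https://age-service.example.com/v1/estimate", { method: "POST", body });
  if (!res.ok) throw new Error(`Age estimation failed: ${res.status}`);
  const { estimatedAge }: AgeEstimateResponse = await res.json();

  // Only the derived band leaves this function; the selfie itself is discarded.
  if (estimatedAge < 13) return "under13";
  if (estimatedAge < 18) return "13to17";
  return "adult";
}
```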


This technology is being used by the likes of SuperAwesome, which has integrated the technology into its Kids Web Services (KWS) tool. This tool enables developers to verify the identity of parents or guardians when granting their children permission to use features that collect personal information. This process, known as Verifiable Parental Consent (VPC), is needed to confirm that the parent giving permission is, in fact, an adult and ensures compliance with privacy laws.


Another option is a Digital ID app. This allows individuals to verify their identity against a government-issued ID document, and then share only specific details, such as a date of birth or an ‘over 13’ age attribute. Game developers can be confident they know the age of their players without having to collect vast amounts of personal information. Individuals only need to verify their identity once, and then have an app they can use repeatedly across different gaming platforms whenever they need to confirm their age.
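
Conceptually, the platform never sees a date of birth with this approach, only a signed claim. A minimal sketch of consuming such a claim might look like this, with the attribute shape and stubbed signature check standing in for whatever the Digital ID provider actually issues:

```typescript
// Hypothetical shape of an attribute shared from a Digital ID app:
// the platform receives a yes/no claim, not the underlying document data.
interface SharedAttribute {
  claim: "over13" | "over18"; // the only fact disclosed
  issuer: string;             // the Digital ID provider
  signature: string;          // proves the attribute is genuine
}

// Stub: a real integration would verify the signature against the issuer's published key.
function isSignatureValid(attr: SharedAttribute): boolean {
  return attr.signature.length > 0; // placeholder check only
}

// Admit a player if, and only if, a verified minimum-age claim is present.
function meetsMinimumAge(attr: SharedAttribute, required: "over13" | "over18"): boolean {
  if (!isSignatureValid(attr)) return false;
  // 'over18' implies 'over13'; no name or date of birth was ever shared.
  return attr.claim === required || attr.claim === "over18";
}
```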

Online age verification is no longer optional. It’s a necessity


Other age verification options include a credit card check to verify that someone is 18 or over, or a one-time ID document check, where individuals scan their ID document and take a selfie, which is matched to the photo on the document. Just a data-minimised ‘over 18’ or ‘over 13’ attribute is then shared with the organisation. In some countries, checks against mobile phone operators or eIDs are also possible.


It is important that people are offered a choice in how they prove their age, to ensure age assurance is inclusive and accessible to all. People can then also choose the method which feels the most comfortable to them. Gaming platforms can see which options are most popular with their players, and switch to alternative methods if there are regulatory changes in a given country.
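
One way to make that flexibility concrete is to normalise every method to the same minimal result, so adding or dropping a method for a given country touches only one integration. A sketch, with hypothetical names throughout:

```typescript
// Whatever route the player chooses, only a minimal attribute is retained.
type AgeCheckMethod = "facial-estimation" | "digital-id" | "credit-card" | "id-document" | "mobile-operator";

interface AgeCheckResult {
  method: AgeCheckMethod;
  over13: boolean;
  over18: boolean;
  checkedAt: Date;
  // Deliberately absent: name, date of birth, document scans, card numbers.
}

// Each provider integration implements the same interface, so methods can be
// offered, withdrawn or swapped per country without touching the rest of the platform.
interface AgeCheckProvider {
  method: AgeCheckMethod;
  check(): Promise<AgeCheckResult>;
}

async function runChosenCheck(available: AgeCheckProvider[], choice: AgeCheckMethod): Promise<AgeCheckResult> {
  const provider = available.find(p => p.method === choice);
  if (!provider) throw new Error(`Age check method '${choice}' is not offered in this region`);
  return provider.check();
}
```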


Online age verification is no longer optional. It’s a necessity in gaming to make sure players are given age-appropriate experiences. Regulations will hold platforms to account to ensure they are playing their part in keeping children safe online. With innovative and robust age solutions, platforms can keep players safe whilst giving them an incredible but age-appropriate experience.


Julie Dawson is the chief regulatory and policy officer for Yoti, leading regulatory and government relations for the digital identity platform, developing policy approaches for fraud prevention and safeguarding, and liaising with national and sectoral trust frameworks, in conjunction with Yoti’s internal and external ethics boards.