Caught by the Age Appropriate Design Code?

It's a bigger net than you might think...

Published on June 28, 2022 by Taceo Limited

What is the Age Appropriate Design Code (“Code”)?

The Code is a statutory code of practice developed by the Information Commissioner's Office (“ICO”). It applies to businesses that offer online products or services which involve the processing of personal data and are either targeted at, or ‘likely to be accessed’ by, children. Children are defined in the Code as individuals under the age of 18.

The Code comprises a set of 15 standards (“Standards”) that businesses caught by the Code should adhere to in order to ensure that the personal data of children using their services is subject to appropriate safeguards.

The ICO is keen to point out that the Code is not ‘new law’: it merely ‘sets standards’ and positions the GDPR ‘in the context of children using digital services’. Whilst that may be technically correct, the 15 Standards that organisations are required to observe do feel like additional legal obligations to be complied with, not least because ignoring them would, in the event of an investigation or complaint, likely be detrimental to your case or defence: application of the Code will be taken into account when considering whether an organisation has complied with its obligations under the GDPR and PECR.

The Standards are set out in summary form in the table below; the full detail can be accessed here. The highlighted Standards indicate those that clearly map to existing GDPR principles.

Each Standard is underpinned by a consideration of what is in the best interests of the child.

Why should my business take an interest in the Code?

“Likely to be accessed by children” is the reason you might want to think about the Code in the context of an online service that your organisation provides. The Code isn’t confined to services targeted at children: the phrase ‘likely to be accessed’ means that its net reaches beyond ‘targeting’, to ensure that services children are using in reality aren’t excluded by a very narrow definition. The trouble for many businesses, however, is that they may not know whether children (and let’s not forget that children can be anything from 0 to 17 years old) are accessing or likely to access their service if, as a business, they do not intentionally target them. Standard 3 of the Code states that you must take a risk-based approach to recognising the age of your users and that, if you can’t ascertain users’ ages with a level of certainty appropriate to the risk, you should apply the Standards to all users (a minimal sketch of that fallback follows the list below). When thinking about this Code and its potential impact, I was struck by the many apps (the number and use of which have grown over the course of the pandemic) that might fall into the ‘likely to be accessed by’ category:

  1. fitness apps
  2. health and wellbeing apps
  3. food delivery apps
  4. ride hailing apps
  5. gaming apps

...were a few that immediately sprang to mind.
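To make the Standard 3 fallback concrete, here is a minimal sketch in TypeScript of the kind of logic a service might use when deciding whether to apply the Standards to a particular user. The risk levels, confidence thresholds and names are hypothetical illustrations of a risk-based approach, not requirements drawn from the Code or the ICO guidance.

```typescript
// Illustrative sketch only: names, thresholds and types are hypothetical,
// not taken from the Code or the ICO guidance.

type ServiceRisk = "low" | "medium" | "high";

interface AgeSignal {
  estimatedAge?: number; // e.g. from self-declaration or an age-estimation provider
  confidence: number;    // 0..1, how reliable the signal is
}

// Minimum confidence required before relying on an age signal, scaled to the
// risk the service poses to children (hypothetical values).
const REQUIRED_CONFIDENCE: Record<ServiceRisk, number> = {
  low: 0.6,
  medium: 0.8,
  high: 0.95,
};

// Standard 3 (age appropriate application): if age cannot be established with a
// level of certainty appropriate to the risk, fall back to applying the
// Standards to the user.
function applyChildStandards(risk: ServiceRisk, signal?: AgeSignal): boolean {
  if (!signal || signal.estimatedAge === undefined) return true;            // no age signal: assume child
  if (signal.confidence < REQUIRED_CONFIDENCE[risk]) return true;           // not certain enough: assume child
  return signal.estimatedAge < 18;                                          // reliable signal: protect only under-18s
}

// Example: a weak signal on a high-risk service still triggers the Standards.
console.log(applyChildStandards("high", { estimatedAge: 25, confidence: 0.7 })); // true
```

The point of the sketch is the default: in the absence of a sufficiently reliable age signal, the user is treated as a child and the Standards apply.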

What practical steps can you take to assess whether the Code might apply to your business and comply with the Code?

Step 1

Assess whether the Code applies to your service. The guidance produced by the ICO includes a flow chart that you can work through if you aren’t sure whether you are covered by the Code. The flow chart is useful up to a point; however, it requires you to answer the following question: ‘Is your online service likely to be accessed by children? This will depend upon whether the content and design of your service is likely to appeal to children and any measures you may have in place to restrict or discourage access to your service’ – which many businesses may not have clarity on. In conducting your assessment, we would recommend that you consider:

  1. whether you make it clear that your service is for over-18s only and how this is achieved (e.g. can users simply self-declare, or do you have a more complex method in place?)
  2. does your service allow a primary (adult) user to add others, which would indicate that under-18s may be using it?
  3. does the interaction of your users with your service indicate that some of them might be under 18?
  4. do you target under-18s?

Step 2

If your analysis determines that children are using your service, or (to paraphrase the ICO guidance) that it is more probable than not that children will use it, then (subject to the below) you will need to take steps to comply with the Code.

It is important to point out that if you establish that your service is likely to be accessed by children, but that your service isn’t designed or intended for children, then you should focus on ensuring that it cannot be, and is not, accessed by children: the aim is not to make services that are unsuitable for children child-friendly.
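For services in that position, the first line of defence is often an age gate. Purely as an illustration (the function name and example values below are hypothetical), a basic self-declaration check might look like the TypeScript sketch below; note that a bare self-declaration is unlikely to be enough on its own for higher-risk services, where the Code expects age assurance proportionate to the risk.

```typescript
// Illustrative sketch only: a self-declared date of birth is a weak signal and
// may need to be supplemented for higher-risk services.

function isAtLeast18(dateOfBirth: Date, today: Date = new Date()): boolean {
  // A user is 18 or over if their date of birth is on or before today minus 18 years.
  const cutoff = new Date(today.getFullYear() - 18, today.getMonth(), today.getDate());
  return dateOfBirth <= cutoff;
}

// Example: a declared date of birth in 2010 is refused rather than the service
// being adapted for children.
if (!isAtLeast18(new Date(2010, 5, 1))) {
  console.log("This service is for over-18s only.");
}
```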

Step 3

If you have established that the Code applies to you and you are comfortable with children accessing your service, you will need to assess your service and your current privacy policies and procedures in the context of the Code. Clearly communicated and easily understood privacy policies and procedures that meet the key requirements of the GDPR are a necessary base from which to make any adjustments to comply with the Code.

In conducting your assessment, we would recommend that you complete a written review of your compliance against the Standards (in order to meet your accountability requirements under the GDPR) that results in a prioritised list of actions assigned to individual owners within your business. Any resultant amendments that you make to your service, or to the policies and procedures underpinning that service, will need to be age appropriate. The Code includes (at Annex B) some factors for consideration, grouped by the age bracket that a child falls into: for example, it defines 10-12 as ‘transition years’ and highlights that within this bracket children’s use of online services is likely to ramp up and they are likely to be accessing services independently from their own devices, whereas those in the 16-17 bracket are described as approaching adulthood, with technical abilities and skills that are more developed than their emotional intelligence.

Whilst starting your assessment may feel daunting, it is worth pointing out that most businesses won’t be starting from a blank sheet of paper. You may already have conducted a DPIA for your service, which can be revisited to consider use by children. You should have a processing register that sets out the scope of personal data you are collecting and the third parties you are sharing this data with; that information will provide a base from which to consider the Data Minimisation and Data Sharing Standards. Potentially more challenging is a consideration of the Detrimental Use of Data Standard, which is broadly drawn and requires a level of familiarity with government codes, regulation or advice that may not exist within your business. In recognition of this, the Code suggests relevant reference materials (e.g. the OFT-published principles for online and app-based games) and urges caution when using ‘sticky’ features, emphasising that you should (see the sketch after this list):

  1. avoid using personal data in a way that incentivises children to stay engaged;
  2. present options to continue playing or otherwise engaging with your service neutrally without suggesting that children will lose out if they don’t;
  3. avoid features which use personal data to automatically extend use instead of requiring children to make an active choice about whether they want to spend their time in this way (data-driven autoplay features); and
  4. introduce mechanisms such as pause buttons which allow children to take a break at any time without losing their progress in a game, or provide age appropriate content to support conscious choices about taking breaks.
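As a purely illustrative sketch of the last two points (the names and structure below are hypothetical, not taken from the Code), a child-aware session might switch off data-driven autoplay and offer a pause that preserves progress:

```typescript
// Illustrative sketch only: hypothetical names and structure.

interface SessionState {
  isChild: boolean;  // outcome of the age assessment discussed above
  progress: unknown; // whatever the game or app needs in order to resume later
}

// Data-driven autoplay of the next item is off for children: they must make an
// active choice to continue rather than having their time extended for them.
function shouldAutoplayNext(session: SessionState): boolean {
  return !session.isChild;
}

// A pause that saves progress, so a child can take a break at any time without
// losing anything; the prompt shown afterwards should be neutral rather than
// suggesting the child will lose out by stopping.
async function pauseAndSave(
  session: SessionState,
  save: (progress: unknown) => Promise<void>
): Promise<void> {
  await save(session.progress);
}
```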

Finally, if your service incorporates AI, it might be worth familiarising yourself with the draft EU AI Regulation. Once in force, the AI Regulation is intended to provide a legal framework that ensures AI can be safely developed, brought to market and used within the EU.

The draft Regulation classifies AI systems into high-risk and non-high-risk systems, with detailed obligations and governance requirements attaching to the former and a lighter, more self-regulatory approach attaching to the latter. Both the Code and the AI Regulation recognise the fundamental rights of children enshrined in the United Nations Convention on the Rights of the Child, and the AI Regulation incorporates explicit obligations where high-risk AI systems are likely to be accessed by, or have an impact on, children. Whilst it is beyond the scope of this article to provide a deep dive into the draft AI Regulation, it is worth highlighting that the list of high-risk AI systems includes those ‘intended to be used for recruitment or selection of natural persons, notably for advertising vacancies, screening or filtering applications, evaluating candidates in the course of interviews or tests’. Therefore, where you have young people using a recruitment service that uses AI, that service will need to comply with the more rigorous provisions of the draft AI Regulation in addition to the Code.

Summary

The data processing that goes hand in hand with our use of online services and apps is firmly embedded in our lives and those of our children. Governments and businesses are working hard to ensure that our legal frameworks keep pace with the technology we are using; however, it is fair to say that this remains a work in progress, especially when it comes to children. Research conducted by the Mozilla Foundation1 found that, of the 32 mental health and prayer apps they looked at, 28 raised strong concerns over user data management. In the Mozilla summary page highlighting their key findings, they point out that “Teens are especially vulnerable”, noting “When teens share information on these apps, it could be leaked, hacked, or used to target them with personalized ads and marketing for years to come”. The Code is one tool that has been developed to ensure that children are protected when online. It isn’t perfect. There is limited practical guidance on conducting an assessment of your user base to analyse the ages of your users, with the default being to apply the Standards to all users if you can’t be sure of their ages, or to take steps to deter under-18s, which (if you have no fixed idea of the percentage of your users that under-18s make up) may not be a commercially attractive approach. It is, however, principles based and does provide commentary on each Standard, though some additional, practical worked examples covering a range of sectors and levels of complexity would have added helpful colour and depth to the application of the Standards.

If you would like further information on the above or wish to discuss other data privacy matters, you can contact us at [email protected].


1 Mozilla Foundation - Top Mental Health and Prayer Apps Fail Spectacularly at Privacy, Security