
Response to the ICO Code of Practice for Age Appropriate Design

defenddigitalme welcomes the second stage of consultation on the ICO Age Appropriate Code of Practice, which opens today, following the initial views and evidence gathered last September, which included our own submission.

The work behind the launch also included workshops and research with parents and children, which exposed some of the inherent human contradictions in the features and trade-offs that parents and children may find desirable or reasonable in online interactions. The Code makes a very rounded attempt to take these into account.

At a glance, the Code intends to offer practical guidance about how to ensure online services appropriately safeguard children’s personal data and process children’s data fairly. It aims to influence the design of services to comply, and to be able to demonstrate they comply, with the GDPR and PECR [2003]. The standards in the Code are backed by specific data protection law, which is legally enforceable and already regulated by the ICO.

The Code will provide guidance on the design standards that the Commissioner will expect providers of online ‘Information Society Services’ (ISS) to meet, where those services process personal data and are likely to be accessed by children. “Information society services” has the same meaning as in the GDPR, so does not include preventive or counselling services.

Some of the detail, though not made explicit, builds on topics already expanded on previously, by the ICO or the Article 29 Working Party, about what the GDPR should mean for everyone. For example, it specifically sets out that consent must not be bundled: “You should give children a choice as to whether they wish their personal data to be used for each additional purpose or service enhancement.” It also re-emphasises what already applies to all, such as the requirement under Article 30(1) of the GDPR to keep records of processing activities.

The Code has sixteen key areas of focus. Annex A provides guidance on key considerations for what ‘age appropriate’ should mean at different ages. Annex B sets out, in very simplified terms, the basics of the lawful bases for processing children’s data, including an interesting separation of core and non-core activities. Section 15, on the Data Protection Impact Assessment (DPIA), together with the DPIA template, is the part that stretches the ICO’s data protection remit furthest, but it sits within the context of GDPR Recital 2.

While its focus is on explaining the expectations of existing law, the Code does stray into new and more ambiguous territory in its guidance on broader risk assessment of how use of the service, and therefore by default its data collection, may create risk for children. But even here, it draws on GDPR Recital 2: “This Regulation is intended to contribute to the accomplishment of an area of freedom, security and justice and of an economic union, to economic and social progress, to the strengthening and the convergence of the economies within the internal market, and to the well-being of natural persons.”

On age, unsurprisingly, the Code relies (as set out in Annex A) on parental permission for children under thirteen, and reiterates that thirteen is the age at which children in the UK are able to provide their own consent to processing in the context of an online service offered directly to a child (by virtue of Article 8(1) of the GDPR and s9 of the DPA 2018). Age verification that relies on parental consent brings its own risks of further data collection, but the Code seeks to mitigate these in a number of ways, including that data collected for this purpose may not be used for anything else.

While there is room for some ambiguity in the description of what is and what is not an ISS, the Code errs on the side of expecting good practice from those handling children’s data online, aligned with the whole of the GDPR, which applies to all. “This code also covers not-for-profit apps, games and educational sites, as long as those services can be considered as ‘economic activity’ in a more general sense. For example, they are types of services which are typically provided on a commercial basis.”

For clarity, consistency and confidence in the education sector, we still believe a specific Code for Education is required that goes beyond ISS data handling. The Code does not cover public authorities providing other types of direct public services online (as these are not normally provided for remuneration). Perhaps, however, this Code will give an idea of what the ICO expects more broadly of anyone processing children’s data, and therefore of the expected standards, even outside this specific Code’s remit.

The direction of travel is towards compliance with the GDPR and with privacy by design and default. Age Appropriate Design will be the canary in the coal mine for how this should be put into practice, not only for new developments but also in finding and fixing weaknesses in the “stack” of software that has evolved to support internet access over the past 40 years. Solutions which have relied upon such weaknesses and on user exploitation will have to adapt.

Consultation on this Age Appropriate Code of Practice is now open for responses until 31 May, 2019.


More Detail on Age Verification

As regards age verification (AV), the Code emphasises the importance of checking age and parental authority in a privacy-friendly way. It does not require all AV to be the same, but for it to be appropriate to the nature of the processing and the risk; for some services, asking users to self-select their age range is deemed sufficient. It suggests processing must, “Collect the minimum amount of ‘hard identifiers’ (such as passport scans or credit card details),” and “Remember that you need to comply with the GDPR in your processing of any personal data you collect for verification purposes, including the purpose limitation, data minimisation, storage limitation and security principles.”

The vagueness here is a trade-off for flexibility: it leaves data processors free to verify age in fair and lawful ways that may reasonably be expected to comply, provided they document their reasons for doing so.

Positively, it does not require every site to age verify every user in case some may be children. Rather, the approach set out in the section 15 DPIA, step 6, is risk-based. “If you identify a particular risk to children (or to children at a particular age range or developmental stage), you can consider using age verification to mitigate this risk, either to prevent access to the service or to better tailor the service to the needs of different age ranges. However, if there is a high risk and you cannot use robust age verification to distinguish children from adults (or distinguish between particular age-ranges), you need to apply any additional safeguards to all users to make sure you mitigate the risk to those children.”

It therefore does not infantilise every user nor expect providers of ISS to do so. Where AV is necessary, the Code recognises different methods for achieving it.

“If you are using a third party verification service, you should use ‘attribute’ systems which offer a yes/no response when asked if an individual is over a given age, or if a person holds parental responsibility over the child. If you can show that your processing is particularly low-impact and does not carry any significant risk to children, you may be able to show that self-verification mechanisms are reasonable (eg analytics cookies).”
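To make the data minimisation point concrete, below is a minimal sketch of what querying such an ‘attribute’ service could look like, assuming a hypothetical third-party verifier that answers only a yes/no question. The endpoint, token and field names are our own illustration, not any real provider’s API.

    // Minimal sketch of an "attribute" style age check: the verifier answers a
    // yes/no question ("is this user over the given age?") rather than returning
    // a date of birth or any hard identifier. Endpoint, token and field names
    // are hypothetical.
    interface AttributeCheckResponse {
      overAge: boolean; // yes/no answer only; no DOB, no document data
    }

    async function isUserOverAge(
      verifierUrl: string, // hypothetical third-party attribute service
      userToken: string,   // opaque reference to the user, held by the verifier
      minimumAge: number,
    ): Promise<boolean> {
      const response = await fetch(`${verifierUrl}/attributes/age-check`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ token: userToken, minimumAge }),
      });
      if (!response.ok) {
        throw new Error(`Age attribute check failed: ${response.status}`);
      }
      const result = (await response.json()) as AttributeCheckResponse;
      // The only datum the service retains is the boolean answer, in line with
      // the data minimisation principle the Code points to.
      return result.overAge;
    }

The design point is that the answer the ISS stores is a single attribute, not the evidence used to derive it.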

The Code also specifies that ‘you must not use data collected for age-verification purposes for any other purpose.’ How this would work in practice remains to be seen.


Risk Assessment and Online Harms

The Code also builds on the founding principles of data protection law: necessity, proportionality and risk mitigation, for all ages.

Where it goes beyond privacy and data protection law is a step in the direction of the Online Harms White Paper. Section 15 of this Code looks at the assessment of risks in the DPIA. “You should also specifically look at whether the processing could cause, permit or contribute to the risk of:

  • physical harm;
  • online grooming or other sexual exploitation;
  • social anxiety, self-esteem issues, bullying or peer pressure;
  • access to harmful or inappropriate content;
  • misinformation or undue restriction on information;
  • encouraging excessive risk-taking or unhealthy behaviour;
  • undermining parental authority or responsibility;
  • loss of autonomy or rights (including control over data);
  • compulsive use or attention deficit disorders;
  • excessive screen time;
  • interrupted or inadequate sleep patterns;
  • economic exploitation or unfair commercial pressure;
  • or any other significant economic, social or developmental disadvantage.”

Furthermore, the detail in the leading principle of the Code, ‘the best interests of the child’, goes into issues of a child’s wellbeing and development. But, it says, this “does not mean that you cannot pursue your own commercial or other interests. Your commercial interests may not be incompatible with the best interests of the child, but you need to account for the best interests of the child as a primary consideration where any conflict arises.”


What’s missing?

While the DPIA template includes the question, “Does your service use any nudge techniques?”, there is less on this in the Code itself than there could be. Its treatment of profiling focuses on data provided by the user, and rather less on data generated about a user and their interactions. This could and should be stronger in support of Recital 71, on automated and semi-automated decision-making, including profiling. There is nothing on machine learning, AI or product training where it applies to ISS. But what is there is a good start.

One as yet unknown is the suggested one-off timing in section six on default settings: “Reset defaults to high privacy for existing users.” It suggests that existing user settings should be reset [to the default high privacy settings] as soon as is practicable, “and in any case within [x] months of this code coming into force.” Beyond that, users’ active choices should remain the settings they return to, rather than defaulting back to maximum privacy each time.
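As a minimal sketch of how ‘high privacy by default, while retaining the user’s active choices’ could work, assuming hypothetical setting names and a store that holds only the choices a user has actively made:

    // Minimal sketch of "high privacy by default, retaining the user's active
    // choices". The setting names and the UserSettings shape are hypothetical.
    type PrivacySetting = "profiling" | "dataSharing" | "personalisation";

    // High-privacy defaults: everything off unless the user turns it on.
    const HIGH_PRIVACY_DEFAULTS: Record<PrivacySetting, boolean> = {
      profiling: false,
      dataSharing: false,
      personalisation: false,
    };

    interface UserSettings {
      // Only the settings the user has actively changed are stored.
      explicitChoices: Partial<Record<PrivacySetting, boolean>>;
    }

    // Effective settings are the high-privacy defaults, overridden only by the
    // user's own choices, so a returning user gets their chosen settings back
    // rather than being reset to maximum privacy on every visit.
    function effectiveSettings(user: UserSettings): Record<PrivacySetting, boolean> {
      const settings = { ...HIGH_PRIVACY_DEFAULTS };
      for (const [key, value] of Object.entries(user.explicitChoices)) {
        if (value !== undefined) {
          settings[key as PrivacySetting] = value;
        }
      }
      return settings;
    }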

It only covers the living. Recital 27 allows Member States to provide for rules regarding the processing of personal data of deceased persons, and we would like some provision to be made here for children, given the sensitive nature of such cases and the unknown future uses of personal data, which may have implications for other family members.


More Detail on the Sixteen Areas of Focus

  1. Best Interests of the Child. This broadly encompasses children’s rights under the UN Convention on the Rights of the Child (UNCRC). The UNCRC provides a framework which balances a number of different interests and concerns, with the intention of providing whatever is best for each individual child. In summary, placing the best interests of the child as a ‘primary consideration’ recognises that those interests have to be balanced against other interests. The Code recognises that although a provider of an online service may not be directly subject to the UNCRC, they are subject to Article 5(1)(a) of the GDPR, and must take proper account of Recital 38.
  2. Age Appropriate Application.
  3. Transparency.
  4. Detrimental Use of data: industry codes of practice, advice, and other government provisions. This will be an interesting area of the Code, open to interpretation, and one which could positively enhance children’s experience online. One suggestion is that, “You should also introduce mechanisms such as pause buttons which allow children to take a break at any time without losing their progress in a game,” but such mechanisms, if done badly, have the potential to become annoying interruptions and simply lead to workarounds and avoidance, sometimes via riskier routes.
  5. Policies and Community standards: Uphold your own published terms, policies and community standards. Again, while this is a broadly open area, the expected standard is being fair. “Again, if your systems aren’t adequate or you don’t keep to your promises then your original collection and continued use of the child’s personal data may be unfair and in breach of the GDPR.” “Don’t be an asshat, and if you are, we’ll hold you in contempt of the Code and of fairness under the GDPR” seems to sum it up.
  6. Default Settings: must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child). Again, the scope here is for processing to be done sensibly, with expectations of privacy by design and default. ‘You should retain any privacy settings that the user has applied’ may, however, be a significant change for some, even for big players whose systems today require users to re-apply their choices over and over again. We would hope that the other features in the Code do not mean that the suggestion here, “Profiles could be accessed via …. voice recognition technology for voice activated online services,” would become the unavoidable and only norm.
  7. Data minimisation: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  8. Data sharing: Do not disclose without a compelling reason to do so.
  9. Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation, taking account of the best interests of the child), and provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to off at the end of each session. The exception will be a grey area: where you can demonstrate “a compelling reason” for a geolocation option to be switched on by default, taking into account the best interests of the child.
  10. Parental controls: Broadly, this ties into a similar stance to geolocation. If you provide parental controls, give the child age appropriate information so that they know parental controls are in place. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  11. Profiling off by default: “Switch options which use profiling off by default (unless you can demonstrate a compelling reason for profiling, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).” It’s not saying you can’t have profiling, but rather that any profiling needs to demonstrate it’s for the benefit of the child, not the data processor, and that it must happen in the context of the other principles, such as data minimisation, transparency, and accountability. The wording of some of this appears to apply more to services supplying automatically recommended content such as YouTube; “Your general approach should be that if the content you promote or the behaviours your features encourage are obviously detrimental, or are recognised as harmful to the child, in one context (eg marketing rules, film classification, advice from official Government sources such as Chief Medical Officers’ advice, PEGI ratings) then you should assume that the same type of content or behaviour will be harmful in other contexts as well. Where evidence is inconclusive you should apply the same precautionary principle.”
  12. Nudge techniques: not to be used to weaken privacy or extend use. There is an important recognition set out here: “Using techniques based upon the exploitation of human psychological bias in this way goes against the ‘fairness’ and ‘transparency’ provisions of the GDPR as well as the child specific considerations set out in Recital 38.” All those using such techniques today, take note.
  13. Connected toys and devices: effective tools to enable compliance. You need to comply with this Code if you provide a toy or device which collects personal data and transmits it via a network connection.
  14. Online tools: To exercise and report concerns. This offers much needed clarity on the mechanisms that children should have available to exercise their rights.
  15. Data Protection Impact Assessment (DPIA): specific for children, age, development and capacity, and compliance.
  16. Governance and Accountability: This explains the duty on those processing children’s data to demonstrate compliance, to train staff, and to document policies and procedures. It also emphasises the importance of this for senior management and at board level, requiring organisations to report against the standards in the Code in any internal or external accountability reports.