More recently, press reports have suggested that children are being "datafied from birth" and tracked by thousands of apps. In February 2019, the UK Children's Commissioner issued her proposal for a statutory duty of care between "Online Service Providers" and their young users, which was subsequently picked up by the Home Office in its White Paper on Online Harms. The White Paper recommends that a statutory duty of care be introduced, to be policed by an independent regulator who would be funded by industry. Companies will be required to demonstrate their compliance with this duty of care, including by designing products and services to make them safe for children.
The Council of Europe has issued guidelines recommending that Member States respect, protect and fulfil the rights of the child in the digital environment. The first fundamental principle of these guidelines is that "in all actions concerning children in the digital environment, the best interests of the child shall be a primary consideration". This principle has been echoed by the UK's Information Commissioner in her proposed Age Appropriate Design Code.
Whilst it did not directly address privacy concerns, the House of Commons Science and Technology Select Committee held an inquiry into the impact of social media and screen use on young people's health. During her evidence, the Children's Commissioner stated that not only are simplified terms and conditions needed, but "also the ability to report and know what to expect" from companies' use of children's data. There is therefore a clear and growing consensus that protecting children's digital footprint and experience is of significant importance.
The UK's Information Commissioner's Office (the "ICO") is working on its proposed "Age Appropriate Design Code", currently in draft form (the "Draft Code"), and is consulting with parents, carers and children to finalise it. The Draft Code will provide practical guidance on the design standards the ICO will expect providers of online "Information Society Services" that process personal data and are likely to be accessed by children to meet. An "Information Society Service" is defined as "any service normally provided for remuneration at a distance, by means of electronic equipment for the processing (including digital compression) and storage of data, and at the individual request of a recipient of the service" (Electronic Commerce (EC Directive) Regulations 2002). Examples include online shops, apps, social media platforms and streaming and content services. The ICO considers that this definition covers most online services, even where the "remuneration" or funding of the service does not come directly from the end user. The Draft Code applies where children are likely to access a particular service, even if they represent only a small proportion of the overall user base.
The Draft Code contains 16 cumulative and interdependent standards of age appropriate design, including that settings should be "high privacy" by default and any parental monitoring controls should be made clear to the child. All standards must be implemented to demonstrate compliance with the Draft Code.
The first of the 16 standards is that the best interests of the child should be a primary consideration and that "it is unlikely…that the commercial interests of an organisation will outweigh a child's right to privacy". This is a bold and radical statement of which data controllers should be aware, and it is perhaps indicative of how seriously the ICO intends to take children's privacy. Failing to meet data controller obligations towards children for fear of jeopardising commercial interests, or because it is too difficult to open up the black box of processing activities, is unlikely to be an acceptable justification.
Data controllers can meet this standard by taking into account the age of users, protecting and supporting their physical, psychological and emotional development, and recognising the evolving capacity of the child to form their own view. The Draft Code also recommends that data controllers use evidence and advice from third-party experts to better understand children's needs and awareness. This evidence can also be helpful to data controllers when preparing their privacy policies, which may be read by children.
The Draft Code, which the ICO is required to produce under the Data Protection Act 2018, is expected to be published in final form by the end of 2019. Once the Code is in force, the Information Commissioner must, when exercising her regulatory functions, take account of any of its provisions which she considers relevant. The Code may also be submitted as evidence in court proceedings, and the courts must take it into account wherever relevant. Data controllers that ignore their obligations towards young data subjects may ultimately invite regulatory action by the ICO.
Baroness Beeban Kidron, a children's rights campaigner and founder of the 5Rights Foundation (which seeks to articulate the rights of children in the digital environment), and the Council of Europe have each noted that, as well as making clear to children how their personal data will be collected and used, data controllers must take into account that young people's maturity and attitudes towards risk will change as they grow older. Some will embrace risk, some will avoid it and others will simply not yet appreciate it. In essence, what a data subject consents to at 13 may be different to what they would consent to at 15 or 18, with young people perhaps no longer wanting to share the data they readily provided during their younger teenage years. Policies must be adaptable to respond to children's changing needs and views in relation to the digital environment.
The UK's Children's Commissioner's "Who knows what about me?" report found that children between 11 and 16 years old post on social media, on average, 26 times a day – if they continue at the same rate, that is a total of nearly 70,000 posts by age 18. This is a huge amount of data that children are potentially unwittingly giving up on social media. Data controllers therefore must have systems and processes in place that allow them to update their young data subjects regularly on the data processing activities taking place and to allow them to change, or even erase, their digital footprint.
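The scale of the Commissioner's headline figure can be reproduced with a simple back-of-the-envelope calculation. A short sketch follows; the seven-year span (age 11 to 18) over which the daily rate is assumed to continue is our reading of the report's extrapolation, not a figure stated in the text above:

```python
# Back-of-the-envelope check of the "nearly 70,000 posts by age 18" figure
# from the "Who knows what about me?" report.
# Assumption (ours): the reported rate of 26 posts a day continues
# uninterrupted from age 11 through to age 18 (seven years).

POSTS_PER_DAY = 26
YEARS = 18 - 11  # assumed extrapolation span

total_posts = POSTS_PER_DAY * 365 * YEARS
print(total_posts)  # 66430 -- consistent with "nearly 70,000"
```

At 26 posts a day over seven years the total comes to 66,430, which matches the report's rounded "nearly 70,000" figure.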
These are: best interests of the child; age-appropriate application; transparency; detrimental use of data; policies and community standards; default settings; data minimisation; data sharing; geolocation; parental controls; profiling; nudge techniques; connected toys and devices; online tools; data protection impact assessments; and governance and accountability.
 Are children more than "clickbait" in the 21st century? Baroness Beeban Kidron, Comms. L. 2018, 23(1), 25-30
Guidelines to respect, protect and fulfil the rights of the child in the digital environment, https://rm.coe.int/guidelines-to-respect-protect-and-fulfil-the-rights-of-the-child-in-th/16808d881a