Children's data protection rights: a data protection casualty?

Posted on 28 January 2020

The UK's Information Commissioner has just published the final version of its "Age appropriate design: a code of practice for online services", having identified children's data as "a regulatory priority". This code is the first of its kind in the world and represents a significant step in protecting children's privacy, further to the Home Office's proposal for a statutory duty of care between social media companies and children and the protections afforded to children under the General Data Protection Regulation (EU) 2016/679 ("GDPR").

GDPR mandates that data controllers provide certain information to data subjects at the point at which personal data is collected from them. For those websites that target children (or those which do not, but are still likely to attract them as visitors), the bar is set higher: GDPR requires that the provision of privacy information take into account that children have more limited awareness than adult visitors, and that they will therefore need a clearer and more accessible presentation of that information. This sentiment is echoed in the newly published age appropriate code ("the Code") from the Information Commissioner's Office (the "ICO"), which "seeks to protect children within the digital world, not protect them from it".

Despite GDPR, the Code and growing scrutiny surrounding the use of children's personal data, it appears that in very many cases websites and apps popular with children are ignoring this requirement. This puts those who operate such sites, or who use them to sell or promote their services, at considerable legal and regulatory risk.

The GDPR regulatory framework

GDPR sets out a general approach to the treatment of children's data: "children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data" (Recital 38).

GDPR therefore identifies children as a particular group of data subjects requiring additional protection. Of course, concern about the online activities of children and their exposure to online risks is not new. In 2015, the Global Privacy Enforcement Network (an organisation of data protection authorities from around the world) carried out a sweep of websites and apps targeting, or popular with, children. It identified concerns with 41% of the websites and apps reviewed, including the levels of personal information collected and how it was then shared with third parties.

Alongside enhanced obligations towards children in relation to privacy policies, data controllers should also be aware that children enjoy additional protection in other areas under GDPR, for example, where businesses intend to use children's personal data for marketing or profiling purposes or where an individual asks for their personal data (provided when they were a child) to be erased. This article has a particular focus on the information requirements which apply to children, and so does not consider these additional protections further.

Children's privacy – a fast-moving area

More recently, press reports have suggested that children are being "datafied from birth" and tracked by thousands of apps. In February 2019, the UK Children's Commissioner issued her proposal for a statutory duty of care between "Online Service Providers" and their young users, which was subsequently picked up by the Home Office in its White Paper on Online Harms. The White Paper recommends that a statutory duty of care be introduced, policed by an independent regulator funded by industry. Companies would be required to demonstrate their compliance with this duty of care, including by designing products and services to make them safe for children.

The Council of Europe has issued recommended guidelines to Member States to respect, protect and fulfil the rights of the child in the digital environment. The first fundamental principle of these guidelines is that "in all actions concerning children in the digital environment, the best interests of the child shall be a primary consideration". This is reiterated by the ICO in the Code.

Whilst not directly addressing privacy concerns, the House of Commons Science and Technology Select Committee held an inquiry in 2018 on the impact of social media and screen-use on young people's health. During her evidence, the Children's Commissioner stated that not only are simplified terms and conditions needed, but "also the ability to report and know what to expect" from companies' use of children's data. There is therefore clear and growing opinion that protecting children's digital footprint and experience is of significant importance.

The ICO's "Age appropriate code"

The newly published Code provides practical guidance on the design standards the ICO will expect providers of online "Information Society Services", which process personal data and are likely to be accessed by children (i.e. under 18s), to meet. An "Information Society Service" is defined as "any service normally provided for remuneration at a distance, by means of electronic equipment for the processing (including digital compression) and storage of data, and at the individual request of a recipient of the service" (Electronic Commerce (EC Directive) Regulations 2002). Examples include online shops, apps, social media platforms and streaming and content services. The ICO considers this definition covers most online services, even where the "remuneration" or funding of the service does not come directly from the end user.

The Code applies where children are likely to access a particular service, but not to all services that children could possibly access. Given this broad remit, concerns have already been raised about how organisations can identify whether they fall within the scope of the Code, although it has been reported that the initial focus of the Code is likely to be social media companies.

The Code contains 15 cumulative and interlinked standards[1] of age appropriate design, including that settings should be "high privacy" by default (unless there is a compelling reason otherwise, taking into consideration the best interests of the child) and that any parental monitoring controls should be made clear to the child. All standards must be implemented to demonstrate compliance with the Code, and companies should take a proportionate and risk-based approach when interpreting and complying with the 15 requirements. The ICO makes clear that the Code will still apply following Brexit, regardless of whether the UK leaves at the end of the transition period without a deal.

The first of the 15 standards is that the best interests of the child should be a primary consideration and that "it is unlikely…that the commercial interests of an organisation will outweigh a child's right to privacy". This is a bold and radical statement, of which data controllers should be aware, and is perhaps indicative of how seriously the ICO takes children's privacy. Failing to meet the data controller obligations towards children for fear of jeopardising commercial interests, or because it is too difficult to open up the black box of processing activities, is unlikely to be an acceptable justification.

Data controllers can meet this standard by taking into account the age of users, protecting and supporting their physical, psychological and emotional development, and recognising the evolving capacity of the child to form their own view. The Code also recommends that data controllers use evidence and advice from third party experts to better understand children's needs and awareness. This evidence can also be helpful to data controllers when preparing their privacy policies, which may be read by children.

The Code is due to be put before Parliament and, once it is in force, affected companies will have 12 months to implement the necessary changes. The Code may be submitted as evidence in court proceedings, and the courts must consider it wherever relevant. Data controllers that ignore their obligations towards young data subjects may ultimately face regulatory action by the ICO.

Whilst smaller organisations may feel that the Code is too onerous, the ICO makes clear in the Code that organisations can take into account their size and resources when considering the risks to children and how to comply. The ICO will also consider the efforts made towards compliance during the 12 month transition period, and plans to provide further guidance and support during that period.

Baroness Beeban Kidron[2], a children's rights campaigner and founder of the 5Rights Foundation (which seeks to articulate the rights of children in the digital environment), and the Council of Europe[3], have each noted that, as well as making clear to children how their personal data will be collected and used, data controllers must take into account that young people's maturity and attitudes towards risk will change as they grow older. Some will embrace risk, some will avoid it and others will simply not yet appreciate it. In essence, what a data subject consents to at 13 may be different from what they would consent to at 15 or 18, with young people perhaps no longer wanting to share the data they readily provided during their younger teenage years. Policies must be adaptable to respond to children's changing needs in relation to the digital environment.

The UK's Children's Commissioner's "Who knows what about me?" report found that children between 11 and 16 years old post on social media, on average, 26 times a day – if they continue at the same rate, that is a total of nearly 70,000 posts by age 18. This is a huge amount of data that children are potentially unwittingly giving up on social media. Data controllers therefore must have systems and processes in place that allow them to update their young data subjects regularly on the data processing activities taking place and to allow them to change, or even erase, their digital footprint.
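
As a rough check of that extrapolation (assuming, purely for illustration, a constant posting rate between a child's 11th and 18th birthdays):

```python
# Illustrative arithmetic only: assumes a constant 26 posts per day
# from age 11 to age 18, as per the Children's Commissioner's figures.
posts_per_day = 26
years = 18 - 11
total_posts = posts_per_day * 365 * years
print(total_posts)  # 66430 – i.e. "nearly 70,000 posts by age 18"
```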

[1] These are: best interests of the child; data protection impact assessments; age-appropriate application; transparency; detrimental use of data; policies and community standards; default settings; data minimisation; data sharing; geolocation; parental controls; profiling; nudge techniques; connected toys and devices; and online tools.
[2] Are children more than "clickbait" in the 21st century? Baroness Beeban Kidron, Comms. L. 2018, 23(1), 25-30

[3] Guidelines to respect, protect and fulfil the rights of the child in the digital environment, https://rm.coe.int/guidelines-to-respect-protect-and-fulfil-the-rights-of-the-child-in-th/16808d881a

If we have children's consent – are we ok?

The UK's Data Protection Act 2018[4] provides that, if consent is being relied upon in relation to offering Information Society Services directly to a child, the child must be 13 or older for their consent to be valid. Where a child provides consent, the competence of the child (i.e. whether the child has the capacity to understand the scope of the data processing and the implications of the data collection and processing) must be considered. Arguably, consent is unlikely to be effective where complex information is presented to children, or where they cannot withhold consent and still receive the requested services.

Companies must implement appropriate steps to verify the age of the child consenting. The exact age does not need to be confirmed, just that the child is old enough to provide his/her own consent. The Code provides a selection of ways in which companies can identify children's ages, and companies are required to identify children's ages accurately, with a level of certainty that is appropriate to the risks arising from the processing. Where companies are unable to identify children's ages with such certainty, they should apply the Code to all of their users (and not just children).

The ICO recommends that companies consider the degree of risk that the collection or use of the personal data poses to the child or others. Where the risk is low, and minimal information is being collected from the child (e.g. an email address to register for a newsletter), then asking the child to tick a box to confirm parental consent or his/her age may be sufficient. However, if more intrusive data processing activities are taking place (e.g. the child is posting personal data in an unmonitored chatroom), the ICO states that it will be necessary to verify that the child is old enough to provide his/her own consent or to check the identity of the person claiming parental responsibility and confirm the relationship between this person and the child.
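
By way of illustration only, that risk-based approach might be sketched as follows; the risk tiers, names and return values below are our own assumptions, not mechanisms prescribed by the Code:

```python
from enum import Enum

class ProcessingRisk(Enum):
    LOW = 1   # e.g. an email address collected to send a newsletter
    HIGH = 2  # e.g. a child posting personal data in an unmonitored chatroom

def verification_required(risk: ProcessingRisk) -> str:
    """Return the level of age/consent verification suggested by the risk."""
    if risk is ProcessingRisk.LOW:
        # For low-risk, minimal data collection, a simple self-declaration
        # (ticking a box to confirm age or parental consent) may suffice.
        return "self-declaration tick box"
    # For more intrusive processing, verify that the child is old enough to
    # consent, or verify the identity of the person claiming parental
    # responsibility and their relationship to the child.
    return "verify age, or verify parental responsibility and relationship"
```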

Additionally, controllers must be able to show that consent to the specific processing operation taking place was "freely given, specific, informed and unambiguous"[5]. The challenge for data controllers offering services to children is, therefore, to communicate clearly and simply the extent of their processing activities in a way that children can understand. Children can then, where appropriate, consent to their data being collected, understand how it will be used, and appreciate the consequences of providing their consent (including any associated risks of the processing).

Even where controllers do not rely on data subjects' consent for the data processing activities, but rely on another lawful basis for processing, data controllers must still provide certain prescribed information to data subjects. This information must be in line with Article 12 GDPR, which requires controllers to provide information to users in a "concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child." Of course, this is not a wholly new requirement – the predecessor data protection regime also mandated it, although in less explicit terms. Recital 58 provides further guidance: "any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand". The information provided should assist children or their parents to make properly informed decisions about whether to provide the information required to access the service and to continue to use it.

In all circumstances, regardless of whether controllers are relying on children's consent for the data processing activities, controllers must communicate their intended use of the children's data in a clear and easily understandable manner and must not exploit any imbalance of power between them and the children. Depending on the size of an organisation, its number of users and its assessment of the processing risks, the organisation may carry out user testing to ensure that information is provided sufficiently clearly for the age range in question. If user testing is not warranted, the reasons why should be documented in a data protection impact assessment. The Code also recommends specific "bite-sized" explanations about how children's personal data is used, at the point that use is activated.

In addition, the Home Office's White Paper on Online Harms recommends that the proposed independent regulator introduces a code of practice that sets out, amongst other things, guidance about how to ensure that terms of use are adequate and understood by users when signing up to the service. It is therefore clear that the transparency requirement will remain a key obligation with which data controllers must comply.

[4] Section 9(a) Data Protection Act 2018
[5] Article 4(11) GDPR

Privacy policy sample analysis

Unfortunately, many websites and apps aimed at, and popular with, children do not appear to be meeting this clarity and simplicity standard. Our broad-brush review of five privacy policies for websites and apps popular with children[6] found that the length of those policies ranged from 3,889 to 6,016 words. All were presented in full form, with only one providing complementary short summaries throughout the (long) policy. Of course, children are not alone in being faced with impenetrable privacy policies; the majority of privacy policies aimed at adults are also incredibly dense and hard to understand.

In particular, our sample of privacy policies revealed that the websites and apps collect (or said in their policy that they collect) the following information from users, including children: substantial information about a user's device (including signal strength and battery level); information about third party applications installed on a user's device; information about a user's online and offline actions and purchases from third party providers; communications with other individuals; facial recognition data; the estimated geolocation of users; photographs of users (as provided to third party sites); and data in the user's device phonebook and camera rolls.

Whilst the processing of these types of personal data is not expressly prohibited under data protection law, children may not expect this level of personal data collection, nor have a full appreciation of how organisations will use such data. Such practices may not even be possible in future, given the Code's focus on encouraging default settings which ensure that children have the best possible access to online services while minimising data collection and use.

With some of the policies reviewed, this extensive data collection was only explained late in what was already a lengthy policy. It seems unlikely that young data subjects would read that far. The Code specifically requires data controllers to present all necessary information in a way that is likely to appeal to the age of the child who is accessing the service; this can include using diagrams, cartoons, graphics, video or interactive content that will attract and interest children. All of the policies reviewed provide their information in text form only.

Similarly, particularly intrusive data processing activities with potentially wide-reaching consequences – for example, that "shares", "likes" and replies are publicly available and indexed by search engines, that private "shares" or chat communications may be posted publicly by another user, and that users can see when other users are active on the website/app – were generally not given particular prominence within the policies.

In relation to collecting information about children's communications with other individuals, the Council of Europe has specifically highlighted in its Guidelines that children have a right to private life in the digital environment, which includes "respect for the confidentiality of their correspondence and private communications". In our view, this particular form of data collection should be made more prominent to children.

Whilst not directly a transparency issue, the Code notes geolocation data as being of particular concern. It states that geolocation options should be set off by default, with controllers considering the level of granularity of location data that is needed to provide their services, and providing an obvious sign for children when location tracking is enabled. Options which make a child's location visible to others must default back to "off" at the end of each session[7].
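
A minimal sketch of how those defaults might look in practice; the class and field names are illustrative assumptions on our part, not drawn from the Code:

```python
from dataclasses import dataclass

@dataclass
class GeolocationSettings:
    # Geolocation tracking is off by default; a compelling reason (taking the
    # best interests of the child into account) is needed to change this.
    tracking_enabled: bool = False
    # An obvious on-screen sign should be shown whenever tracking is active.
    show_tracking_indicator: bool = True
    # Whether other users can see the child's location.
    visible_to_others: bool = False

    def end_session(self) -> None:
        # Options making a child's location visible to others must default
        # back to "off" at the end of each session.
        self.visible_to_others = False
```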

The privacy policies we reviewed also contained overly legalistic language, drawn of course from GDPR itself, such as "withdrawal of consent", "legitimate interests", "Model Contractual Clauses" and "the EU-US Privacy Shield". They also included statements that the company relies on "legitimate interest" where it believes that its use of personal data "doesn't significantly impact your privacy" or there is a "compelling reason" for it to do so, and that the company may rely on consent "for a purpose that would otherwise be incompatible with this policy." Use of these technical legal terms and long, vague statements is unlikely to meet the requirements of the GDPR or the Code.

This has not gone unnoticed by the UK's Children's Commissioner, who urged companies in her report to "put their terms and conditions in language that children understand". The Code provides guidelines for how controllers can present and tailor information according to the age of a child, warning that simplification of information should never be used with the aim of hiding how children's personal data is being used.

It is therefore clear from our general review of the privacy policy samples that some companies are not explaining their data processing activities in a way which is likely to meet the transparency requirements. Data processing has become increasingly complex over the past few years, with data collected from various sources, including from data subjects themselves, websites and apps, so data controllers could be forgiven for struggling to explain their activities clearly and transparently to children. Nevertheless, given the heightened scrutiny surrounding children's data, data controllers would be well advised to simplify their privacy notices in line with the GDPR and Code requirements.

[6] As this was meant only to be an overview, and not an academic study, we do not intend to name the organisations involved, nor do we claim that the study was conducted with academic rigour.
[7] These requirements however do not apply where the core service offered cannot be provided without the processing of geolocation data.

Remember the virtual assistants

Children are not just interacting with websites and apps, but often also with "virtual assistants" being used in their homes. These virtual assistants may well be collecting children's data, forming profiles of children and combining children's data with adults' data in increasingly complex ways. The Code highlights connected toys and devices as requiring further consideration, as their scope for collecting and processing personal data is considerable. Children are also more likely, innocently, to give virtual assistants lots of information about themselves. They need a clear and simple explanation of what it means to provide virtual assistants, and similar technology, with their information. In 2017, Mattel cancelled its own virtual assistant for children amidst privacy concerns. There has recently also been growing debate over Amazon's use of children's data with the Amazon Echo Dot Kids Edition. Child advocacy and privacy groups have filed a complaint with the Federal Trade Commission, claiming that the Echo Dot Kids Edition is unlawfully recording children's conversations without their consent, and two private claims have been brought against Amazon on similar grounds.

In the UK, the Code suggests that, where a toy collects any video or audio generated by a child, this should be made explicit on a prominent part of the packaging or accompanying information. Data controllers should provide clear information about the use of personal data at the point of purchase and on set up of the device. This would likely require a privacy notice to be provided on an online checkout page, as well as on the physical packaging and instruction leaflets. The Code recommends that any online information be provided without consumers having to purchase and set up the device, allowing them to make an informed decision about whether to purchase. Data controllers should also anticipate that their connected devices may be used by multiple users of different ages, and in particular that children may use the devices unsupervised; they are advised to avoid any passive collection of personal data. These high-profile concerns highlight the importance to data controllers of making the scope of their data processing activities clear to data subjects, particularly children and young people, and of disclosing such activities as early as possible.

Again, given the complex way in which data controllers collect and use personal data, they could be forgiven for failing to provide this information clearly. However, this is unlikely to be a sufficient excuse for failing to comply with GDPR and the Code.

Is a new regulatory regime inevitable?

Whilst there has been no formal investigation in the UK into an organisation that has failed to communicate its processing activities clearly to children, the Children's Commissioner, in her "Who knows what about me?" report, has called on the UK government to monitor the situation and, if required, redefine data protection laws to protect children. The Children's Commissioner stresses that government, industry, regulators and other players will need to be able to respond quickly as we understand more about the impact of children's data collection. This mirrors the Council of Europe's suggestion that "each State should apply such measures as may be necessary to require that business enterprises meet their responsibility to respect the rights of the child in all their operations within the State's jurisdiction". Could these be the first signs paving the way to legislative solutions?

The ICO, in its Code, also alludes to a more active regulatory role in respect of children's data protection. The ICO seeks to monitor compliance with the Code through "a series of proactive audits", as well as considering any complaints. Failure to comply with the Code will render it difficult for controllers to demonstrate that their data processing is fair and in line with GDPR.

The Children's Commissioner suggested, during her evidence to the House of Commons Science and Technology Select Committee's inquiry, that a "digital ombudsman for children" be introduced to protect children online and "to give some balance and back-up to children". We therefore wait to see if this will be pursued by the UK government.

Parents take note – it's not just companies that should be aware of children's rights!

Alongside the concern around companies' use of children's personal data is a growing awareness of the dangers of "sharenting": the recent trend of parents habitually using social media to share news, images and other personal information about their children. This is an increasingly complicated area, particularly where children are old enough to consent to their information being uploaded by their parents and arguably should have a say over whether particular information is made public.

A survey by VotesforSchools found that children were concerned about being embarrassed, and about the longevity of content that would remain online indefinitely. This is in addition to the increased risk of identity theft, which is made easier through the sharing of children's details online; indeed, Barclays predicts that "sharenting" will account for two-thirds of identity fraud facing young people by the end of the next decade and will cost £667m per year. Mishcon de Reya's conversations with children, exploring their views on sharenting, similarly identified concerns from children around parents' sharenting habits. Even celebrities are not immune to "sharenting" concerns, with Apple Martin publicly criticising her mother, Gwyneth Paltrow, for posting a photo of her online without her consent.

Parents should therefore weigh the privacy risks to their children before making their children's personal information public.

Practical suggestions

As a general approach, data controllers should consider the perceptions, expectations and experiences of their young users, and acknowledge and respect that the balance of power lies with the data controller. Specific tips for organisations targeting young people include:

  • considering young people from the outset, when preparing privacy policies and undertaking data processing activities, and having the best interests of the child as a primary consideration when designing and developing online services; for example, where a child user elects to share some information, including a child-friendly explanation of the consequences and risks of sharing;
  • using a data protection impact assessment to identify, assess and mitigate any risks to children (whether physical, emotional, developmental or material) and consider children's views in relation to proposed data activities;
  • including additional, specific "bite-sized" explanations about how you use personal data at the point at which that use is activated;
  • avoiding age verification methods that can be easily bypassed by children, e.g. asking users to select their birth year or to tick a box (the ICO has recognised that the ability to verify age may depend on the availability of suitable technologies and that verification will become easier as technology develops; companies should regularly review available age verification mechanisms to ensure that they are using appropriate current technologies);
  • appreciating that, just because young people can access the services, does not mean that they can understand the implications of the services;
  • not preventing young users from using their services if young people do not agree to particular uses of their data;
  • having different versions of a privacy policy, where the intended audience covers a wide range. Where there is only one version of the privacy policy, it should be understood by the youngest age range;
  • having privacy settings set at the highest level by default, for example having the geolocation options off by default (unless a compelling reason can be shown for a different default setting, taking into account the interests of the child) – see the sketch after this list;
  • offering young people an ability to agree or withdraw their consent to specific uses of their data over time (for example, through a data processing control dashboard);
  • designing processes so that, to the extent possible, it is easy for a child to request the deletion of their personal data and for companies to comply with these requests where possible (a recent report has found that children wanted their social media profiles to be wiped clean at 18); and
  • providing information to young people in different and appealing ways, from short video clips to animations to icons (GDPR specifically suggests using icons).
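
On the default settings point, a minimal sketch of what "high privacy by default" might look like for a child's account; the field names are our own illustrative assumptions, not taken from the Code:

```python
from dataclasses import dataclass

@dataclass
class ChildAccountDefaults:
    """Illustrative 'high privacy by default' settings for a child's account."""
    profile_public: bool = False            # profiles private by default
    geolocation_enabled: bool = False       # location tracking off by default
    personalised_advertising: bool = False  # no profiling for marketing by default
    third_party_data_sharing: bool = False  # no sharing without a compelling reason
```

Each default could then only be relaxed for a documented, compelling reason, accompanied by a child-friendly explanation of the consequences and, where consent is relied upon, a control dashboard through which the young user can later withdraw that consent.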

Data controllers should appreciate that it will not be an easy task to communicate their (most likely) complicated data processing activities to children and young people. There are also bigger questions to consider: when is a child no longer a child? How do we present risks to young people when they have different capabilities across their age bands? How should data controllers assess and prioritise threats to young data subjects? Data controllers should expect the analysis to develop as these societal questions are debated. In the meantime, given the increasing awareness around data collection, sharing and usage, data controllers providing services aimed at, or used by, young people should consider the clear message of the 5Rights campaign that "a child is a child until they reach maturity, not only until they reach for their smartphone."
