
Protection of Children's Health Data in a Digital World


Anne Sophie Dil

Co-founder of NAALA

Published on 30 September 2021

The Dutch Consumentenbond and the Take Back Your Privacy foundation filed a €1.5 million claim against TikTok and took the company to court in late August 2021. Two weeks later, on 14 September, the Irish privacy watchdog, the Data Protection Commission (DPC), announced that it had launched two investigations into alleged privacy violations by TikTok. At the heart of both the Consumentenbond’s claims and the DPC’s investigations: the violation of children’s rights under the General Data Protection Regulation (GDPR).

Generation Z’s social media platform: TikTok. The Dutch National Social Media Survey showed that in 2020 the platform was used by 520,000 users in the 9-to-19-year-old category. The remaining (slightly more than) half of the 1,125,000 Dutch users came from all other age categories.

That’s not just a lot of young TikTok users; it also reflects the digital skills of the youngest generation. If we extend that to digital healthcare: how is it possible that most digital health solutions are explicitly not aimed at children? Well, that has everything to do with the first paragraph of this blog.

Under the GDPR, children deserve specific protection regarding their personal data. In the online world, their personal data, such as their likes and habits, reflect the experiences they have and the development they go through. Combine this with the sensitivity of health data, which is exactly what digital health solutions typically process. Practice shows that developers of digital health solutions won’t touch children’s solutions with a ten-foot pole, because complying with privacy legislation seems impossible.

The GDPR generally states that children deserve extra protection with respect to their personal data, but does not mention a specific age for this. In general, it can therefore be assumed that ‘children’ means anyone under 18 years old. The GDPR does provide conditions for consent by children with respect to online services. The child must be at least 16 years old to be able to give lawful consent to the processing of his or her data in the context of these online services. 

However, this age may vary from country to country: member states may adjust the age downwards (to a minimum of 13 years old). The Netherlands has not used this possibility to adjust the minimum age for providing lawful consent. 
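To illustrate how this works out in practice, a consent gate in a digital health solution could encode the age rule roughly as in the sketch below. This is a minimal illustration, not legal advice: the member-state thresholds listed and the function and type names are our own assumptions and must be verified against current national law.

```typescript
// Minimal sketch of a GDPR Art. 8 consent gate for an online service aimed at children.
// The thresholds below are illustrative assumptions; verify the current national rules.

const DIGITAL_CONSENT_AGE: Record<string, number> = {
  NL: 16, // the Netherlands has not lowered the default age of 16
  FR: 15, // example of a member state that has lowered the age
  // ...other member states, anywhere between the GDPR minimum of 13 and the default of 16
};

interface ConsentDecision {
  childMayConsent: boolean;
  parentalConsentRequired: boolean;
}

function consentGate(age: number, memberState: string): ConsentDecision {
  const threshold = DIGITAL_CONSENT_AGE[memberState] ?? 16; // fall back to the GDPR default
  const childMayConsent = age >= threshold;
  return {
    childMayConsent,
    // Below the threshold, consent must be given or authorised by a parent or guardian.
    parentalConsentRequired: !childMayConsent,
  };
}

// Example: a 14-year-old Dutch user still needs parental consent.
console.log(consentGate(14, "NL")); // { childMayConsent: false, parentalConsentRequired: true }
```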

In short: yes, the GDPR allows online processing of children’s data. However, this does require a legal basis.  

The GDPR distinguishes six legal bases for processing personal data: 

  1. consent | In this regard, it is important to be sure that the child is able to understand what he or she is consenting to (read more about this in our earlier (Dutch) article). If the child is not able to do this, then the consent is not ‘informed’ and therefore not valid. This is the case anyway if the child is under 16 years old (see above). In these cases, the parent or guardian must give permission. 
  2. performance of a contract | Minors are, in principle, not allowed to conclude a contract without parental or guardian consent. For 16- or 17-year-olds, exceptions have been made in Dutch law, for example: concluding a medical treatment agreement with a medical specialist. 
  3. legal obligation | A key question is whether the processing of personal data is necessary. Alternatives to achieve the purpose must be considered. The interests of the child must be weighed against the interest of processing his/her personal data. 
  4. vital interests | This only applies if the processing of personal data is necessary to protect someone’s life, i.e. to matters of life and death. Again, consideration must be given to whether there are alternative solutions that are less intrusive to the best interests of the child. 
  5. public task | Virtually only public entities can rely on this basis. Consider, for example, Child Protective Services. 
  6. legitimate interest | Reliance on this basis entails an obligation to identify the risks to the child and to protect the child from them. The child himself may not be able to fully appreciate the risks and foresee consequences. 

Most online services for children require the consent of a parent or guardian for the processing of the child’s personal data. This specifically refers to consent for data processing by online services that are offered directly to the child. Online services that are not offered directly to the child are those offered through an intermediary, such as a school or a health care institution. 

However, when the child’s personal data relate to their health, an additional condition applies. The processing of health-related data is, in principle, prohibited. Only if an exception applies may the data still be used:  

  • explicit consent | This goes slightly beyond the “consent” basis: the explicit consent must be confirmed in a clear statement. 
  • employment, social security, and social protection law | For example, to establish right to social support related to the child’s health. 
  • vital interests | This corresponds to the “vital interests” basis. For example, emergency medical care, where explicit consent is not possible because the patient is unconscious. 
  • non-profit institutions | For example, churches. This involves very strict conditions. 
  • publicly available information | This exception only applies if the person has made the information public themselves. In the case of children, however, this does not apply: children cannot yet foresee the consequences of such disclosure. If the parent or guardian has made the information public, one must consider whether it is (or was) in the best interest of the child to make this information public in the first place and then to process it further. 
  • legal claims | For seeking legal advice, for court proceedings or otherwise exercising or defending legal rights. 
  • compelling public interest | For example, national security, public safety, or the economic well-being of the country.
  • health or social care | This only applies if the personal data is processed by a professional who is bound by a professional secrecy obligation. 
  • public health | For example, pandemics, public vaccination programs, and clinical trials. 
  • archiving, research, and statistics | The processing must be necessary and in the public interest. 

In conclusion, online processing of children’s health data requires 1) a legal basis and 2) an exception to the processing prohibition. When considering the extent to which providers of online healthcare solutions can process children’s personal data, many of the bases and exceptions in the above lists drop out. At its core, it will often come down to the fact that processing personal health information online requires explicit consent from a parent or guardian. 

When providing services to children, the best interests of the child must be paramount. This, of course, also applies to providing online services to children. At a minimum, consider the needs of children when designing and developing a digital health solution. Where possible, involve children and their parents in the design of the solution. 

The general principles and obligations that follow from the GDPR for digital health solutions (for adults) also apply to solutions for children. However, due to their vulnerable position, children deserve specific, additional safeguards. 

At a minimum, the following considerations should recur during the design and development process of your digital health solution for children: 

1. Data Protection Impact Assessment (DPIA) 

Through a DPIA, a “data protection by design” approach can be adopted. It is a way of assessing and documenting whether, and if so how, specific data protection requirements are met. 

Performing a DPIA is mandatory when, among other things, children’s personal data are processed to offer online services directly to children. The (large-scale) processing of health data also triggers the obligation to perform a DPIA. 

While performing the DPIA, risks to the child are identified. These risks must be assessed, after which mitigation measures can be taken.  
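As a rough illustration of what “identify, assess, mitigate” can look like in a development team’s own records, a DPIA risk register entry could be modelled along the following lines. The fields, scales, and example risk are illustrative assumptions; an actual DPIA should follow the guidance of your supervisory authority.

```typescript
// Minimal sketch of a DPIA risk register for a children's digital health solution.
// Field names and scales are illustrative assumptions, not a prescribed DPIA format.

interface DpiaRisk {
  description: string;           // the risk to the child being assessed
  likelihood: 1 | 2 | 3;         // 1 = low, 3 = high
  impactOnChild: 1 | 2 | 3;      // severity of the impact on the child
  mitigation: string;            // measure taken to reduce the risk
  residualRiskAccepted: boolean; // documented decision after mitigation
}

const riskRegister: DpiaRisk[] = [
  {
    description: "Profile containing health data is discoverable by other users",
    likelihood: 2,
    impactOnChild: 3,
    mitigation: "Sharing disabled by default; parental approval required to enable it",
    residualRiskAccepted: true,
  },
];

// Simple prioritisation: address the highest combined scores first.
riskRegister.sort((a, b) => b.likelihood * b.impactOnChild - a.likelihood * a.impactOnChild);
```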

2. Privacy by default 

Every application must be privacy-enhancing. This means that the following settings are disabled by default, until they are enabled by the user: 

  • sharing children’s personal data with other users of the service, 
  • using children’s personal data beyond what is necessary for the core activities, and 
  • collecting more personal data of children than necessary for the core activities. 

If the settings are changed anyway, it should be possible to return them to the default setting at any time. At the end of the session in which the changes were made, the choice must be presented to either let the changes be permanent, or reset to the default. 
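To make this concrete, the privacy-by-default behaviour could be modelled roughly as in the sketch below. The setting names are hypothetical; which settings actually exist depends on your solution.

```typescript
// Minimal sketch of privacy-by-default settings for a children's health app.
// Setting names are hypothetical; the point is that every privacy-relevant option starts OFF.

interface PrivacySettings {
  shareDataWithOtherUsers: boolean;     // sharing children's data with other users of the service
  useDataBeyondCoreActivities: boolean; // use beyond what the core activities require
  collectOptionalData: boolean;         // collection beyond what the core activities require
}

const DEFAULT_SETTINGS: PrivacySettings = {
  shareDataWithOtherUsers: false,
  useDataBeyondCoreActivities: false,
  collectOptionalData: false,
};

class PrivacyPreferences {
  private current: PrivacySettings = { ...DEFAULT_SETTINGS };

  update(changes: Partial<PrivacySettings>): void {
    this.current = { ...this.current, ...changes };
  }

  // The user can return to the defaults at any time.
  resetToDefaults(): void {
    this.current = { ...DEFAULT_SETTINGS };
  }

  // At the end of a session in which settings were changed,
  // ask whether to keep the changes or reset them to the defaults.
  endSession(keepChanges: boolean): PrivacySettings {
    if (!keepChanges) this.resetToDefaults();
    return this.current;
  }
}
```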

In the case of children, nudging techniques should never be used to entice them into making choices that are detrimental to their privacy. Where possible, consideration should be given to nudging young children in the privacy-friendly direction. 

As the target group gets older, children gain a better understanding of the consequences of the choices they make. In those cases, an explanation of the risks may suffice, and pro-privacy nudging is not strictly necessary.  

3. Transparency 

For any processing of personal data, sufficiently clear information must be given to the user. Based on this information, the user must be able to understand that their personal data are being collected, and how and why. 

If children are the users, all information must be adapted accordingly. Since children are not yet able to foresee the possible consequences of data processing, clear and comprehensible information about those consequences must also be provided to the child. Note that there can be considerable differences between the age categories of minors: the information for a 5-year-old, for example, will need to be drafted differently than for a 14-year-old.  

Provide “bite-sized” explanations whenever you start collecting personal data. This can be at different times, for example by using pop-up notifications and push notifications. Make sure that the information in the notification is written in understandable language. 

Drawings and videos can be used to make the information understandable. If the intended user groups vary, it would be possible to include privacy statements of different levels in the solution. A complete and detailed privacy statement, along with the general terms and conditions, can be made available to parents (through a parent account or otherwise). 
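One way to approach such layered, age-appropriate information is sketched below. The age bands, notice texts, and function names are illustrative assumptions; real notices should be tested with the actual target group.

```typescript
// Minimal sketch of layered, age-appropriate privacy notices.
// Age bands and wording are illustrative assumptions, not validated child-friendly copy.

type NoticeLayer = "child-simple" | "teen" | "parent-full";

function noticeLayerFor(age: number): NoticeLayer {
  if (age < 9) return "child-simple"; // drawings, short sentences, possibly video
  if (age < 16) return "teen";        // plain language, bite-sized explanations
  return "parent-full";               // complete privacy statement and terms
}

// Show a bite-sized, just-in-time explanation whenever a new category of data is collected.
function justInTimeNotice(age: number, dataCategory: string): string {
  const layer = noticeLayerFor(age);
  if (layer === "child-simple") {
    return `We want to save your ${dataCategory} so the app can help you. Ask a parent or guardian if this is OK.`;
  }
  if (layer === "teen") {
    return `We are about to collect your ${dataCategory}. Here is why we need it and how long we keep it.`;
  }
  return `Full details on the processing of ${dataCategory} are in the complete privacy statement for parents.`;
}

console.log(justInTimeNotice(7, "sleep data"));
```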

Children are vulnerable, and so is health data. Children’s health data should be handled with care. In fact, in the digital world, one should try to teach children about their rights and privacy. 

However, it need not be the case that the generation that lives more online than offline ends up with less access to digital health care because developers fear privacy violations. Quality and safety are uppermost in our minds, and a practical interpretation of these can ultimately lead to more appropriate pediatric health care. 

Would you like to discuss this further, or find out what we can do for you? Feel free to contact us.

Please note that all details and listings do not claim to be complete, are without guarantee and are for information purposes only. Changes in legal or regulatory requirements may occur at short notice, which we cannot reflect on a daily basis. 

