Children’s Personal Data: The Achilles Heel of GDPR Compliance for Social Media Platforms

Social media platforms play an important role in children’s lives, yet they struggle to meet the GDPR’s special protection requirements for processing children’s personal data. What challenges need to be overcome, and how can compliance be enhanced?

Introduction

The European Union’s General Data Protection Regulation (GDPR) has significantly reshaped the data protection landscape, putting individual privacy front and centre and empowering users with control over their personal information. While its principles are clear, compliance becomes a complex puzzle for social media platforms when they process children’s personal data.

Research published by the UK’s Ofcom in 2022 found that one in three children lie about their age to access social media content. Along the same lines, recent research by a Dutch VPN company found that one third of the GDPR fines imposed on social media platforms were connected to the processing of children’s personal data.

In a noteworthy case, Ireland’s Data Protection Commission fined TikTok €345 million in September 2023 for GDPR violations related to children’s accounts. The proceedings raised concerns about default account settings, family pairing, lack of transparency, and age verification. In particular, children’s accounts being set to ‘public’ by default proved problematic, as did the family pairing mechanism, which is meant to link a child’s account with an adult account to ensure supervision: the company failed to verify the adult accounts, leaving the child’s profile exposed once linked. Transparency issues were also raised, particularly regarding the information provided to child users about default settings. Despite the proceedings, there was no conclusive determination on whether TikTok’s age verification methods were GDPR-compliant, which highlights the complexity of the matter.

Informed Consent

Obtaining valid consent is a cornerstone of the GDPR. However, when consent is the legal basis for processing children’s personal data, achieving truly informed consent becomes a multifaceted challenge for social media platforms. Although the GDPR (Art. 8) sets the age of consent at 16 for the offer of information society services directly to a child, Member States may set a lower threshold, provided it is not below 13. In Portugal, for instance, Law 58/2019, which implements the GDPR, allows the processing of children’s personal data based on consent from the age of 13. For children under this threshold, a parent or legal guardian must consent on their behalf to the processing of their personal data.

The GDPR only requires companies to make ‘reasonable efforts’ to verify that consent was indeed given by a parent or guardian, leaving considerable flexibility in the verification process. This flexibility reflects the limited availability of accurate, practical, and privacy-friendly methods for confirming parental consent. Requesting proof of guardianship, such as a birth certificate or a court order, would be disproportionate for information society service providers such as social media platforms; on the other hand, simply asking for the parent’s email address is an approach that can be easily circumvented.
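By way of illustration only, the sketch below shows what a bare-bones email-confirmation flow might look like in practice. Every name and data structure in it (ConsentRequest, the one-time token, the 48-hour expiry) is hypothetical rather than drawn from any platform or standard, and, as noted above, such a mechanism remains easy to circumvent by a child who controls the parent’s mailbox.

```python
# Illustrative sketch only: a minimal email-based parental-consent flow,
# one possible form of the "reasonable efforts" the GDPR refers to.
# All names here are hypothetical; the email-sending layer is not shown.
import secrets
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ConsentRequest:
    child_account_id: str
    parent_email: str
    token: str = field(default_factory=lambda: secrets.token_urlsafe(32))
    expires_at: datetime = field(
        default_factory=lambda: datetime.utcnow() + timedelta(hours=48)
    )
    confirmed: bool = False

def start_parental_consent(child_account_id: str, parent_email: str) -> ConsentRequest:
    """Create a pending request; a one-time link would be emailed to the parent."""
    request = ConsentRequest(child_account_id, parent_email)
    # send_confirmation_email(parent_email, request.token)  # hypothetical delivery step
    return request

def confirm_parental_consent(request: ConsentRequest, token: str) -> bool:
    """Record consent only if the emailed token matches and has not expired."""
    if request.confirmed or datetime.utcnow() > request.expires_at:
        return False
    if secrets.compare_digest(request.token, token):
        request.confirmed = True
    return request.confirmed
```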

In addition to these challenges, ‘consent fatigue’ sets in when parents, overwhelmed by a high volume of consent requests, tend to approve them all without thorough examination. As a result, guaranteeing that the parent has indeed given informed consent is often an unattainable goal.

Nevertheless, the difficulties of obtaining informed consent do not diminish when the child is old enough to provide it independently. The inherent complexity lies in the fact that children may lack a comprehensive understanding of the intricacies of data processing, making it hard to verify the true nature of their consent. Consequently, the content and language of privacy information must be tailored to the comprehension level of the intended audience, which requires data controllers to walk a fine line between clarity and accessibility on the one hand and legal accuracy on the other.

Age Verification Dilemma

To align with GDPR requirements, there is an implicit need to verify users’ age, and content and platform providers must implement effective age verification mechanisms. Nonetheless, selecting an appropriate method can be challenging. With the rise of AI technologies, age assurance tools are becoming more prominent in online services. Age assurance is a broad term covering a variety of mechanisms, such as age self-declaration, hard identifiers (such as passports), and verification or age estimation through AI-based facial recognition systems. Simply asking for a birthdate cannot guarantee that the information given is true, while requesting an ID or adopting facial recognition technologies is the opposite of a privacy-friendly mechanism. So how can a data controller choose the best approach to comply with data protection law?

The answer is that there is no one-size-fits-all solution when it comes to age verification mechanisms. To be fully effective, any mechanism must be combined with a privacy-by-design framework. In addition, when processing children’s data, it is advisable to conduct a data protection impact assessment (DPIA), which will help gauge the extent of the risks involved in the processing.
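As a purely illustrative sketch, a proportionate, layered approach might look something like the following, escalating from self-declaration to stronger evidence only where the risk justifies it. The thresholds and the AssuranceLevel categories are assumptions made for the example, not requirements of the GDPR or of any supervisory authority.

```python
# Illustrative sketch only: a layered ("waterfall") age-assurance decision,
# choosing the least intrusive level of evidence consistent with the risk.
# The categories and thresholds below are hypothetical.
from enum import Enum, auto

class AssuranceLevel(Enum):
    SELF_DECLARED = auto()      # lowest friction, weakest evidence
    PARENTAL_CONSENT = auto()   # verified consent from a parent or guardian
    STRONGER_EVIDENCE = auto()  # e.g. age estimation or a hard identifier, used sparingly

def required_assurance(declared_age: int,
                       accesses_age_restricted_content: bool,
                       digital_consent_age: int = 13) -> AssuranceLevel:
    """Pick the least intrusive assurance level consistent with the declared age."""
    if declared_age < digital_consent_age:
        return AssuranceLevel.PARENTAL_CONSENT
    if accesses_age_restricted_content:
        # A DPIA may justify stronger, proportionate evidence for restricted features.
        return AssuranceLevel.STRONGER_EVIDENCE
    # For other cases, self-declaration plus protective defaults may be deemed sufficient.
    return AssuranceLevel.SELF_DECLARED
```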

Dark Patterns Exposed

There is a growing presence of manipulative UX practices on online platforms. Often called ‘dark patterns’, these are techniques used to steer users towards choices that might not align with their true interests. They are particularly harmful to children, who are often unaware of their privacy rights and of the implications of their choices, making them more susceptible to these deceptive tactics. Dark patterns can be used to encourage children to provide excessive information or to turn off high privacy settings.

Some of the most common dark patterns include: i) roach motel (making it easy to sign up, yet hard to close an account); ii) forced continuity (a subscription that automatically continues after the free trial expires); iii) preselection (steering users into unfavourable preselected choices); and iv) Privacy Zuckering (coaxing users into sharing more personal data than they intended). Dark patterns can have serious consequences for children. By employing these deceptive tactics, platforms exploit children’s still-developing cognitive abilities, hindering their capacity to make well-informed decisions about privacy settings, data sharing and online interactions. It is therefore essential to address dark patterns to protect children and ensure ethical and sustainable user interactions.

Privacy-by-design, Security and Awareness

What does this mean in practice? Children’s privacy must be considered at every step of product development and design for online services that target both children and adults. Since providers cannot always confirm a user’s real age, protective measures should be applied across the service. In concrete terms, online platforms must provide a clear, age-appropriate privacy notice for children and set strong privacy settings by default.
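As a minimal sketch of what such defaults could look like (the field names are invented for the example and do not reflect any specific platform), privacy-by-default means that the most protective configuration applies whenever the account may belong to a child:

```python
# Illustrative sketch only: privacy-by-default account settings that fall back
# to the strictest configuration unless adulthood has been confirmed.
# Field names are hypothetical and not taken from any real platform.
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacySettings:
    profile_public: bool
    messages_from_strangers: bool
    personalised_advertising: bool
    location_sharing: bool

def default_settings(is_confirmed_adult: bool) -> PrivacySettings:
    """Apply the strictest defaults unless the user is confirmed to be an adult."""
    if is_confirmed_adult:
        # Adult defaults are a product decision, shown here only for contrast.
        return PrivacySettings(profile_public=True,
                               messages_from_strangers=True,
                               personalised_advertising=True,
                               location_sharing=False)
    # Child (or unverified) accounts: private by default, no targeting, no location.
    return PrivacySettings(profile_public=False,
                           messages_from_strangers=False,
                           personalised_advertising=False,
                           location_sharing=False)
```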

Moreover, organisations must implement robust security measures to prevent data breaches, such as multi-factor authentication, data encryption, and access and parental controls. In the event of a data breach, timely notification is crucial, yet the process grows more intricate when minors are involved. Balancing the need for rapid action with the need to protect vulnerable individuals adds an extra layer of complexity to compliance efforts.

Furthermore, compliance with the GDPR demands a collaborative approach from all stakeholders involved in processing children’s personal data. Educating parents, guardians, teachers, and children themselves about privacy rights, potential risks, and the responsible use of technology is an ongoing challenge, and bridging this knowledge gap requires sustained effort from regulatory bodies and organisations alike.

Conclusion

As children are more vulnerable and less aware of the potential risks and outcomes of online activities, they are entitled to specific protection regarding their personal data. Balancing age verification, informed consent, data protection principles and ongoing collaborative educational efforts is crucial. Dark patterns further emphasise the ethical imperative of protecting children from manipulative practices. As technology advances, a commitment to safeguarding children’s privacy becomes vital. This commitment will help shape a digital environment that not only respects but actively protects the youngest members of society, thereby laying a strong foundation for the societies of the future.



