DARK PATTERNS AND THE LAW: PRESENT AND FUTURE OF USERS’ AUTONOMY

An analysis of Dark Patterns under the EU consumer and data protection legal framework.

By creating manipulative menus, erecting obstacles to unsubscribing, and using pre-checked boxes and particular colours and graphics in the user interface, many online platforms steer users into decisions that benefit the platform at the expense of users’ own interests. These manipulative strategies are known as Dark Patterns and can be defined as “interface design choices that benefit an online service by coercing, steering, and/or deceiving users into making decisions that, if fully informed and capable of selecting alternatives, they might not make”. These patterns can be grouped into several categories according to their design and effect.

A 2020 study analysed 240 popular mobile apps and ran an online experiment with 589 users on how they perceive Dark Patterns in those apps. It concluded that 95% of the apps contained Dark Patterns, most of which users could not recognise. After being informed about the issue, users performed better at identifying malicious designs. The study shows that Dark Patterns are widespread, hard to detect, and powerful enough to prevent users from making informed choices.

Though these manipulative strategies are already a significant issue, the problem is likely to grow. The development of smart cities, the popularisation of smart homes, and the establishment of a metaverse may expose people to Dark Patterns even more often, deeply affecting their autonomy and privacy. Thus, considering the impacts of Dark Patterns on users and the digital market, it is important to reflect on how the regulatory framework of the European Union (EU) addresses this matter.

Legal research on this topic can follow different perspectives. This Insight focuses on the EU data protection and consumer protection frameworks, particularly because Dark Patterns constantly affect consumer relations and involve the processing of users’ data. For example, when a user agrees to an online service’s terms and conditions that include the processing of personal data, both regimes apply, as the user is both a ‘consumer’ and a ‘data subject’.

GENERAL DATA PROTECTION REGULATION (GDPR)

Most commonly used Dark Patterns aim to nudge users (i.e. data subjects) into consenting to the processing of their data by platforms (i.e. data controllers). One example is the cookie consent pop-up shown on many websites, where the button to agree is brightly coloured while the option to ‘set your preferences’ (or ‘show purposes’) is greyed out. Even when users escape this first design trap, they frequently encounter obstacles when setting their privacy preferences, such as confusing language and pre-checked boxes.

The GDPR requires personal data to be processed “lawfully, fairly and in a transparent manner in relation to the data subject” (Article 5(1)(a)): to be lawful, one of the six grounds provided in Article 6(1) of the regulation must apply to the processing; to be transparent, “any information and communication relating to the processing of those personal data [must] be easily accessible and easy to understand” (Recital 39, GDPR); and lastly, to be fair, the collection and processing of personal data cannot occur through unfair methods, by deception, or without the data subject’s knowledge. The fairness principle is key to rectifying the imbalance between data subjects and data controllers.

To give effect to those principles, the regulation requires controllers to implement privacy by design and by default (Article 25 GDPR). This means that platforms must implement effective measures to comply with the requirements of the GDPR and to protect data subjects’ rights, and they must “ensure that, by default, only personal data which are necessary for each specific purpose of the processing are processed”.
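The privacy-by-default rule can be illustrated with a short sketch: a consent dialog complies only if every processing purpose that is not necessary for the service starts unticked, so that no optional data is processed unless the user actively opts in. The names below (`ConsentPurpose`, `isPrivacyByDefault`) are hypothetical, chosen for illustration, and the check is a simplification of Article 25(2), not a legal test.

```typescript
// Hypothetical model of the purposes listed in a consent dialog.
interface ConsentPurpose {
  name: string;
  necessary: boolean;   // strictly needed to provide the service
  preChecked: boolean;  // ticked before the user interacts
}

// Simplified privacy-by-default check: every purpose that is not
// necessary for the service must start unticked. A pre-checked
// optional box is the classic Dark Pattern this rule forbids.
function isPrivacyByDefault(purposes: ConsentPurpose[]): boolean {
  return purposes.every((p) => p.necessary || !p.preChecked);
}

const darkPatternDialog: ConsentPurpose[] = [
  { name: "session cookies", necessary: true, preChecked: true },
  { name: "ad personalisation", necessary: false, preChecked: true }, // pre-checked optional box
];

const compliantDialog: ConsentPurpose[] = [
  { name: "session cookies", necessary: true, preChecked: true },
  { name: "ad personalisation", necessary: false, preChecked: false },
];

console.log(isPrivacyByDefault(darkPatternDialog)); // false
console.log(isPrivacyByDefault(compliantDialog));   // true
```

The point of the sketch is that the rule turns on defaults, not on what the user can eventually configure: the same dialog with the optional box unticked would pass.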

However, as noted above, Dark Patterns are intentionally obscure and designed to mislead users. Consequently, they violate the GDPR’s lawfulness, transparency, and fairness principles. Firstly, they put users at a disadvantage by denying them access to clear and neutral information about the processing of their data, which is neither transparent nor fair. Secondly, although platforms often obtain users’ consent, which is one of the legal grounds provided in Article 6(1), such consent must be informed and freely given (Recital 32) to be valid; if it is obtained through Dark Patterns (as in the cookie pop-up example above), the processing is unlawful. Finally, because Dark Patterns violate the GDPR’s core principles, they are per se contrary to privacy by design. Moreover, when they induce users to authorise the processing of more data than the service needs (for instance, through pre-checked boxes), they also violate the privacy-by-default rules.

In brief, Dark Patterns are often attempts to circumvent data protection rules, and they rarely, if ever, comply with the GDPR’s principles.

UNFAIR COMMERCIAL PRACTICES DIRECTIVE (UCPD)

Although the EU consumer protection framework comprises many Directives that may apply to Dark Patterns, the UCPD has the strongest connection with these design strategies. The Directive emphasises fairness in the pre-contractual environment by imposing transparency requirements on information and banning practices that exploit consumers’ recognised cognitive shortcomings.

According to the UCPD, traders (i.e. platforms) must give consumers (i.e. users) enough accurate information to allow them to make an informed decision (that is, a decision based on facts). Platforms must therefore not use unfair, misleading, or aggressive commercial practices (Article 5) to manipulate users’ decisions. To determine whether a commercial practice is unfair, the UCPD uses the ‘average consumer’ as a benchmark: a “reasonably well-informed and reasonably observant and circumspect” consumer (Recital 18). Notably, Dark Patterns exploit psychological mechanisms that affect most people’s behaviour without being noticed, and this will usually include the ‘average consumer’.

A case-by-case analysis is crucial when applying the UCPD: different types of Dark Patterns, depending on their design and effect, fall within different categories of unfair commercial practices under the directive. For example, some platforms use Dark Patterns to induce users to agree to terms and conditions without reading them. This practice is unfair under Article 5 of the UCPD because it is likely to distort the user’s decision to enter the agreement, and it reflects a careless attitude towards users that is contrary to the required professional diligence (Article 2(h)).

In general terms, Articles 6 and 7 of the UCPD establish that traders must not mislead consumers on aspects likely to affect their ‘transactional decisions’. A design strategy that withholds material information from users may therefore amount to a ‘misleading omission’ (Article 7) or even fall within the scope of No. 22 of Annex I of the UCPD (the so-called “blacklist”). Conversely, when platforms use design strategies to make it appreciably harder for consumers to terminate or withdraw from a service, this constitutes an ‘aggressive’ commercial practice (Articles 8 and 9(d)).

In short, a Dark Pattern will rarely pass the UCPD’s fairness tests. If its design is not blacklisted under Annex I, nor misleading or aggressive under Articles 6, 7, and 8, it will likely fail the general test established in Article 5. That is because manipulating consumers’ choices, which is the intention behind Dark Patterns, is broadly prohibited by Article 5 of the UCPD and specifically prohibited by other articles of the directive.

FINAL THOUGHTS

Dark Patterns aim to manipulate users and are pervasive in the online environment. These design strategies reduce users’ autonomy and privacy and directly violate the core principles of the GDPR and the UCPD. Notably, the Digital Services Act brings some hope for a safer digital space: the current version of the text, approved by the EU Parliament in January this year, contains specific provisions prohibiting platforms from using Dark Patterns (Recitals 39a and 62). However, as shown here, these manipulative strategies are already unlawful under existing legislation. Therefore, to avoid repeating mistakes, it is important to understand why the current rules have not been enough to ban manipulative designs. One good starting question is: are Dark Patterns widely used because specific legal provisions are lacking, or because enforcement is?

The Insights published herein reproduce the work carried out for this purpose by the author and therefore maintain the original language in which they were written. The opinions expressed within the article are solely the author’s and do not reflect in any way the opinions and beliefs of WhatNext.Law or of its affiliates. See our Terms of Use for more information.
