In 2022, average social media usage worldwide amounted to 2.45 hours per day per user. Gamers spent an average of 8.45 hours per week playing video games in 2021. Numerous surveys and studies suggest that upwards of 25% of people use their phones and devices to the point of addiction. Corroborating these significant figures, informal empirical observation shows that people of all ages are glued to their smartphones: on public transport, in restaurants, parks and gyms, at family dinners, and even at school or work. But what makes online activities so much more engaging than real life?
The current economic model creates fierce competition among online businesses for people’s attention. Attention means data and money for businesses. However, attention also means time, and both time and attention are limited resources. Thus, to remain competitive, online platforms have developed strategies to attract and hold users’ attention, resulting in a scarcity of attention and, consequently, less time for users to devote to other meaningful activities.
I call these strategies Hyper-engaging Mechanisms (HEMs). HEMs form a complex toolbox that exploits users’ cognitive vulnerabilities, eroding their rationality and self-control and shaping their behaviour according to the business’s interests. These mechanisms usually involve adaptive algorithms, built on machine learning and personal data processing, that personalise content and design; strategies that affect the allocation of users’ attentional resources; methods aimed at reinforcing behaviour by interacting with users’ dopaminergic system; and interface design strategies that take advantage of users’ desires and natural cues.
HEMs potentially increase the time and frequency with which users access online platforms. These effects, together with the neurological impacts of these mechanisms, are directly associated with internet addiction and other disorders. Furthermore, HEMs manipulate users, covertly directing their behaviour towards what is more profitable for the businesses. Their method is manipulation because it exploits cognitive vulnerabilities to guide users’ decisions and shape their behaviour. Accordingly, it is neither persuasion (a direct appeal to choose) nor coercion (the restriction of acceptable options). Since all humans have cognitive vulnerabilities, anyone can be affected by this kind of online manipulation (albeit to different degrees).
HEMs are worrying and, arguably, morally reproachable. But are they also unlawful? This blog post summarises the answer to this question based on my Master’s Thesis “HYPER-ENGAGING MECHANISMS AND THE EU LEGAL FRAMEWORK: an analysis of the lawfulness of practices that increase the time and frequency of use of online platforms”, submitted to NOVA School of Law in partial fulfilment of the requirements for obtaining the Master’s Degree in Business Law and Technology. In the thesis, I analyse three EU legal instruments that could potentially protect users against HEMs: the Unfair Commercial Practices Directive (UCPD), the Digital Services Act (DSA) and the proposal for the Artificial Intelligence Act (AI Act).
Unfair Commercial Practices Directive
Although the UCPD provisions were developed before the emergence of many technologies and online practices, they are flexible and broad enough to cover HEMs. The UCPD prohibits all unfair commercial practices (Art. 5), particularly if they are ‘misleading’ (Arts. 6 and 7), ‘aggressive’ (Arts. 8 and 9), or included in the blacklist (Annex I). Except for the blacklisted practices, for a practice to be judged unfair, it must cause or be likely to cause the ‘average consumer’ to take a ‘transactional decision’ that they would not otherwise have taken.
Considering that all humans, including the ‘average consumer’, have cognitive vulnerabilities that can be exploited through manipulation, HEMs are likely to significantly impair any user’s ability to make a free and informed decision, thereby causing them to use a platform more often and for longer periods than planned.
According to Art. 8 UCPD, a commercial practice is ‘aggressive’ if: (i) it involves harassment and/or coercion and/or undue influence; (ii) it significantly impairs or is likely to significantly impair the average consumer’s freedom of choice; and (iii) it causes or is likely to cause the average consumer to take a transactional decision they would not otherwise have taken. HEMs harm users’ autonomy, impairing the average consumer’s freedom of choice and conduct.
‘Undue influence’, pursuant to the UCPD, occurs when the trader exploits its position of power to put pressure on the consumer, and such pressure significantly limits the consumer’s ability to make an informed decision. HEMs are deployed in a context where online platform providers are in a position of power relative to users due to digital asymmetry. These mechanisms take advantage of this power asymmetry to manipulate users by exploiting their cognitive vulnerabilities, which is a form of psychological pressure. This manipulation is likely to significantly limit users’ ability to make a rational and informed decision because, even if they received the necessary information, their ability to act rationally would be compromised and they would not be able to circumvent these practices anyway. Therefore, HEMs constitute a form of undue influence, and deploying them is prohibited as an aggressive commercial practice.
Art. 5(2) UCPD establishes that a commercial practice is unfair if it is contrary to the requirements of professional diligence and materially distorts or is likely to materially distort the economic behaviour of the average consumer. Considering the functioning of HEMs and their consequences for users, it is reasonable to say that when a trader deploys them, it is not acting with the required professional diligence. Also, the exploitation of users’ cognitive vulnerabilities through HEMs can shape (and distort) users’ behaviour. These effects can potentially affect anyone, including the average consumer. Therefore, these practices are also prohibited under Art. 5(2) UCPD.
Digital Services Act and Artificial Intelligence Act
Unlike the UCPD, the DSA was developed with a focus on online practices and technological advancements. Indeed, Art. 25(1) DSA directly tackles HEMs. This Article prohibits online platforms from designing their online interfaces in a way that manipulates or otherwise materially distorts or impairs users’ ability to make free and informed decisions, which is precisely what these mechanisms do. However, since Art. 25(2) DSA establishes that this prohibition shall not apply to practices already covered by the UCPD, and given that the UCPD already prohibits these practices, by the letter of the law the DSA will (surprisingly) not apply to HEMs.
The AI Act is still a work in progress and may change significantly before its enactment. Notwithstanding this, the current version prohibits AI systems that deploy subliminal techniques beyond a person’s consciousness to materially distort a person’s behaviour in a manner that causes or is likely to cause that person physical or psychological harm (Art. 5(1)(a)). Machine learning, which falls within the Act’s definition of AI systems, is a fundamental part of HEMs. Thus, under the current proposal for the AI Act, the use of AI to build HEMs will be prohibited, as these mechanisms manipulate users by distorting their behaviour. Since the AI Act does not clarify what it means by ‘psychological harm’, it remains to be seen how high the bar will be set to satisfy this requirement.
Despite the existence of a robust EU legal framework, thanks in particular to the UCPD, some adjustments may be necessary to better respond to HEMs. Some tests required by the UCPD are not a perfect fit for online manipulative practices. In view of that, the inclusion of these practices in Annex I UCPD is an option. Also, there is a noticeable lack of enforcement of the Directive’s provisions regarding online manipulation, which may be overcome by the recent introduction of Art. 11a UCPD. This Article provides individual remedies for consumers, who can now claim compensation, which may encourage them to report unfair practices more frequently.
The DSA and the AI Act demonstrate that the EU legislator is aware of HEMs and online manipulation more generally, and is trying to address them. Nevertheless, the DSA, which could be the perfect Regulation to ban these practices, unreasonably carves out an exception for practices already covered by the UCPD.
The Information Age represents a major step in humanity’s history. It has boosted efficiency, opened new business opportunities, and facilitated trade. It has also contributed to societies by accelerating the sharing of knowledge and facilitating access to information. The competition for consumers’ attention long predates the attention economy. As such, the use of attention and data is not the issue per se; the issue is the limitless strategies that have been applied in the race for market and informational power, with negative consequences for users’ wellbeing and autonomy. This Insight did not aim to cover all the complexities of HEMs, nor to provide definitive legal solutions, but rather to call everyone’s attention to this problem, demonstrate its relevance, and contribute to the discussion on the role of EU law in protecting humans’ autonomy and health in the fast-developing attention economy. Hopefully these goals have been achieved.