METAVERSE: THE FUTURE OF ADVERTISING UNDER THE EU LEGAL FRAMEWORK

Introduction: the numerous opportunities for advertising in the metaverse

Metaverse: is it the future of digital marketing? The metaverse can be defined as an immersive 3D virtual space where users can interact with each other for different purposes, including social networking, playing online games, working and learning. Even though its widespread adoption is likely to take years, leading brands are already exploring the marketing possibilities it presents, such as on-platform product placement, direct-to-avatar sales and sponsorship opportunities for sporting and music events. A successful example is Nikeland, the metaverse space created by Nike on the Roblox platform, which has already attracted 7 million visitors. It enables users to purchase exclusive Nike digital goods, which can be worn by their avatars and displayed within their personal section of Nikeland.

Nonetheless, while the metaverse creates great opportunities for businesses to advertise and promote their products and services, it also raises significant challenges for data protection and children’s safety.

Privacy concerns and conflicts with the General Data Protection Regulation (GDPR)

Behavioural advertising is a form of targeted advertising that requires the processing of personal data in order to tailor ads, products and services to the target audience. This practice is likely to become an issue in the metaverse, as it allows the analysis, through complex algorithms, of users’ emotional responses to infer their desires, preferences and even the prices they would be willing to pay for a product or service. In other words, it enables companies to offer users products based on their behaviours and reactions.

This form of advertising not only creates trends and desires that did not previously exist, but may also directly conflict with the GDPR: under Article 22, the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

In a recent decision, the Irish Data Protection Commission (DPC) fined Meta a total of 390 million euros for unlawfully forcing Facebook and Instagram users to accept personalised ads. Although this case did not arise in the metaverse specifically, the GDPR could apply there as well, and one could easily imagine a similar case in which metaverse providers make users’ consent to personalised ads a mandatory condition for accessing the metaverse.

In this context, to create ads that are effective yet still compliant with the GDPR, advertisers must develop non-intrusive ads that provide enough information for users to make informed decisions.

Influencer-generated content

Companies must be careful not to blur commercial and non-commercial content: in the metaverse, brands must clearly differentiate between entertainment and advertising content. Influencer marketing is a popular brand strategy for targeted advertising, as influencers exert great influence over fashion and culture on online platforms, shaping consumer behaviour. Virtual influencers such as Lil Miquela and Shudu are already working with some of the world’s biggest brands. As brands begin employing avatar-based influencers to engage in activities that collect personal data from metaverse users, it becomes necessary to adopt strict and clear privacy standards to protect users’ rights.

In terms of regulation, the EU framework on commercial practices and media rules for influencers and advertisers is expected to apply in the metaverse as well. Influencers must identify paid partnerships and the brands for which they are working, in accordance with the e-Commerce Directive (Art. 6) and the Audiovisual Media Services Directive (AVMSD). In addition, even though the proposed Digital Services Act (DSA) does not specifically address the metaverse or any kind of virtual reality, it is possible that it will apply to providers operating in the metaverse. The DSA regards influencers as content creators and attributes more accountability to them for the content they post online, in that they must ensure that the public can clearly identify their content as advertising. At the same time, social media platforms will be expected to monitor and remove inappropriate or illegal content and could even suspend accounts.

Impacts on children

It is essential to make any content in the metaverse (advertisements included) age appropriate, particularly because children can access and engage with advertisements that are not directly targeted at them. For example, when a website initially designed for adults ends up with an audience composed of children, the operator is responsible for ensuring that the advertising shown is appropriate for that audience. Moreover, the operator must consider the possibility that its advertising content could harm children’s health or well-being.

Children are particularly vulnerable because of their still-developing sense of reality and accountability. For this reason, there are concerns that the metaverse may not be a safe environment for children, given that they might be exposed to bullying, abuse, harassment, racism and pornographic content. It is therefore fundamental to keep content age appropriate. Even though the dangers present in the metaverse are similar to those found in any other virtual space, the metaverse’s heightened realism may make their impact more profound. Thus, experiencing these abuses in the metaverse has the potential to be even more traumatic than in other online formats.

However well-intentioned safety policies may be, they will be hard to monitor and enforce in the metaverse, particularly because exposure to harassment, hate speech and bullying can happen in the blink of an eye. Safety must therefore be a prime concern when designing a metaverse system, and tools such as parental controls can help parents monitor their children’s activities and restrict the content they access. Some virtual reality (VR) headsets already include these tools. Furthermore, minors’ access to the metaverse could be limited by adopting, at least in the EU, the proposed European Digital Identity Wallet, with which EU citizens and residents will be able to access online services using their national digital identification, allowing a zero-knowledge proof of the user’s age.
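To give a flavour of the idea behind age verification by selective disclosure, the short sketch below is a simplified, hypothetical illustration (not the actual EUDI Wallet protocol or any real API): the service receives only a signed "over 18" claim and never the user's date of birth. A real wallet would rely on public-key credentials and genuine zero-knowledge proofs; a shared-key signature stands in for the issuer's signature here purely for brevity.

    # Illustrative sketch only: NOT the EUDI Wallet protocol or a real API.
    # The service learns a signed boolean claim ("over 18"), never the birthdate.
    import hmac, hashlib, json

    ISSUER_KEY = b"demo-issuer-key"  # hypothetical stand-in for the issuer's signing key

    def issue_age_attestation(over_18: bool) -> dict:
        """The identity issuer signs only the derived claim, not the date of birth."""
        claim = json.dumps({"over_18": over_18}, sort_keys=True)
        tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
        return {"claim": claim, "signature": tag}

    def service_grants_access(attestation: dict) -> bool:
        """The metaverse service verifies the signature and checks the boolean claim only."""
        expected = hmac.new(ISSUER_KEY, attestation["claim"].encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, attestation["signature"]):
            return False  # claim was tampered with or not issued by a trusted party
        return json.loads(attestation["claim"]).get("over_18") is True

    print(service_grants_access(issue_age_attestation(over_18=True)))   # True
    print(service_grants_access(issue_age_attestation(over_18=False)))  # False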

Other potential impacts of the Digital Services Act

The DSA imposes a ban on targeted advertising based on special categories of data (such as gender, ethnicity, sexual orientation, religion, political views, and genetic or biometric data). Another interesting feature of this regulation is the obligation of advertising transparency, which requires that meaningful information is provided to explain why a user was targeted with a specific ad.

The DSA will also promote the removal of illegal online content, such as hate speech, sexual harassment, terrorist content or any other illegal content (which the DSA does not itself define, leaving that determination to the discretion of each Member State). Notice-and-action mechanisms will provide better protection for victims of online abuse.

Furthermore, the EU DSA introduces mandatory annual risk assessments and mitigation measures concerning systemic risks, such as the dissemination of illegal content and negative effects on users’ fundamental rights, including privacy and freedom of expression and information.

The EU DSA will apply to all intermediary providers from 17 February 2024. All the new obligations introduced by this regulation would be meaningless without proper enforcement, which is crucial to guarantee consumers’ and children’s rights in a digital environment such as the metaverse. Unsurprisingly, infringements of the DSA can result in large fines of up to 6% of a provider’s total worldwide annual turnover. Users may also be entitled to compensation for damage or loss suffered due to a provider’s non-compliance with its obligations under the DSA.

Conclusion

The metaverse brings numerous advertising possibilities, but also legal risks. For the time being, there are no specific new regulations targeting the metaverse, as it is still evolving and the current EU legal framework remains applicable. Nonetheless, it should be noted that the European Commission is expected to discuss the future of virtual worlds and metaverse regulation in May 2023. Lastly, considering all the risks involved in the metaverse and children’s inherent vulnerability, companies must design metaverse systems (both hardware and software) with children’s safety as a priority. Similarly, policymakers should work to prevent metaverse providers from manipulating children online by enforcing baseline privacy requirements.

