Publications

Metaverse and personal data: beware of today’s future

Published on 14 March 2023

The word "metaverse" is a contraction of "meta" and "universe": it offers us another universe into which it becomes possible to escape. According to Webedia, it can be defined as "the meeting place, stable in time, of a community of users wishing to engage in a wide variety of online social activities, in immersion". Sociability and immersion become the key words.

Today, the metaverse is in the process of acquiring its status in the modern world. It already existed in games such as Second Life, The Sims, Animal Crossing, Roblox and Fortnite, where Travis Scott was able to give a concert on the island of Sweaty Sands. However, Facebook's decision to rename itself "Meta" marked a turning point. Mark Zuckerberg sees the metaverse as a godsend, an opportunity for his company to reinvent itself. Thus, no less than $10 billion has been invested since 2021 in the development of "Reality Labs", and an additional 10,000 jobs are planned in Europe within five years.

Facebook was right: the uses of the metaverse are constantly growing. It is becoming possible to shop at Walmart, to buy a 36-hectare plot of land for 120 ethereum as Carrefour claims to have done, to raise the dead, or to buy a weapon as an NFT to defend oneself against virtual sexual assaults, which have already been observed. The uses of the metaverse can also have more commercial aims, in particular by adapting it to the professional world: meetings, connected seminars, etc. Indeed, the lockdown period showed the ability of digital tools to keep us connected through communications; in the future, the metaverse will allow us to meet in an augmented 3D world. In total, the growth rate of the metaverse is estimated at 39.4% from 2022 to 2030: it has not finished surprising us.

However, these new uses inevitably lead to an extension of the field of personal data collection by private-sector actors. Indeed, nearly 40% of American consumers expect to spend between three and four hours a day in the metaverse, and 40% expect to spend one to two hours there. Only 7% do not plan to use it every day. That is a lot of data collected per hour of play.

This is where the General Data Protection Regulation (GDPR) comes in. Thanks to its broad scope, it can be applied to this emerging technology that is the metaverse. The obligations of the data controller remain the same, as do the various principles governing data processing that must be applied.

Thanks to the virtual reality headsets that transport us into the metaverse, all our biometric data can be collected: the color of our skin, our pores, our eyes, whose iris contributes to the uniqueness of our person… Furthermore, behavioral data can be collected when we are faced with a situation: an increase in heart rate, pupil dilation, and so on. The operator can use these data to infer all our emotions and to match them with whatever consequences it wants. This is emotional marketing.

The other possible application in the metaverse would be geolocation marketing, designed to attract people to a particular place at a particular time in the real world. A typical example would be a product placed on a shelf at eye level in Walmart's metaverse store, so that the customer would be tempted to buy the same product in real life.

All of these uses can go astray. It is therefore important to assess the potential place of data protection regulations in the metaverse: virtuality or reality? Should current legislation be considered inert with regard to such an innovative technology, or is it really intended to apply to this new use, thanks to a scope designed to capture any new technology?

1. The application of GDPR: adapting the regulation to the metaverse

The territorial scope of GDPR is intended to cover both companies established in the European Union and companies collecting data from individuals located in EU Member States. Similarly, its material scope is also intended to encompass data from the metaverse: the definition is sufficiently broad for metaverse data to fall under the yoke of this regulation. Thus, given these permissive criteria, GDPR is intended to apply to all processing carried out there. Let us briefly review the compliance points to be checked.

1) Collecting metaverse data under GDPR

When collecting data in the metaverse, it is important that this is done for "specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes" (Article 5.1.b). In this sense, processing takes place within the meaning of GDPR and it is the responsibility of the designers to limit the purposes of that processing. This ensures that the data are used only for the purposes envisaged for the operation of the metaverse technology. It prevents the operator, as data controller, from using the data for other purposes, commercial ones in particular, if these were not specified beforehand.
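By way of illustration only, purpose limitation can be approached "by design": in the minimal sketch below (the record structure, field names and purposes are hypothetical, not taken from any real metaverse platform), every piece of data collected carries the purposes declared to the user, and any processing requested for an undeclared purpose is refused.

```python
from dataclasses import dataclass, field


@dataclass
class CollectedRecord:
    """A piece of data collected in the virtual world, tagged at collection time."""
    subject_id: str
    category: str                      # e.g. "avatar_position", "email"
    value: object
    declared_purposes: frozenset = field(default_factory=frozenset)


def process(record: CollectedRecord, purpose: str):
    """Refuse any processing whose purpose was not declared at collection (Art. 5.1.b)."""
    if purpose not in record.declared_purposes:
        raise PermissionError(
            f"purpose '{purpose}' was never declared for '{record.category}'"
        )
    # ... the processing envisaged for the operation of the metaverse goes here ...


# A position collected only to render the world cannot feed an advertising profile.
position = CollectedRecord("user-42", "avatar_position", (12.3, 4.5),
                           frozenset({"world_rendering"}))
process(position, "world_rendering")          # allowed
# process(position, "targeted_advertising")   # raises PermissionError
```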

In addition, the purposes must be included in the information to be provided at the time of collection (Article 13). Even in the case of a game, the metaverse operator must be able to provide this information to the data subject; one could, for example, imagine privacy notices displayed when the account is created, explaining what will become of the surname, first name, e-mail address, etc. Indeed, this new virtual world is not a lawless zone. The data processed are protected, if only because the person concerned is established on European territory. Thanks to its extended material and territorial scope, GDPR is intended to apply to the metaverse.

2) Processing metaverse data under GDPR

The metaverse, although by definition virtual, almost unreal, cannot be exempted from the constraints of our reality, whether legal or legislative. It remains subject to the cardinal principles of lawfulness, fairness and transparency (Article 5.1.a). As such, a user of the metaverse must be able to consult the policy implemented to process their data.

Furthermore, the issue of user consent must not be overlooked. On the one hand, it can be problematic because it could be given by a minor. Indeed, the consent of a minor under 15 years of age is not valid on its own in France: such a minor cannot consent alone to certain processing. Elsewhere, this age of digital majority differs, which could imply designing a metaverse specific to the national regulations of each country, far from its original idea of "universality". For a minor, consent must come from the legal guardian. In practice, however, it is hardly convenient to take off one's virtual reality headset and controllers in order to hand them to an adult capable of giving valid consent. More generally, it is difficult to obtain free, informed, unambiguous, specific and explicit consent from a child, or even from an adult, when consent is a condition for accessing a game… Such consent may be biased and its validity questionable. In this sense, basing data collection on consent does not seem rational: other legal bases should be considered.
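To make the point concrete, here is a minimal sketch of an age-gating check at account creation; the French threshold of 15 comes from the discussion above, the default of 16 reflects Article 8 GDPR (Member States may set the age between 13 and 16), and the country codes and structure are purely illustrative.

```python
# Age of digital consent per country (illustrative values only; Article 8 GDPR
# lets Member States set it between 13 and 16, the default being 16).
DIGITAL_CONSENT_AGE = {
    "FR": 15,  # France, as discussed above
}
DEFAULT_CONSENT_AGE = 16


def needs_parental_consent(age: int, country: str) -> bool:
    """True when the consent of the holder of parental responsibility is required."""
    threshold = DIGITAL_CONSENT_AGE.get(country, DEFAULT_CONSENT_AGE)
    return age < threshold


print(needs_parental_consent(14, "FR"))  # True: a parent must consent
print(needs_parental_consent(15, "FR"))  # False: the minor may consent alone
```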

However, Facebook, a pioneer of the metaverse, has assured that it wants to build this world in a responsible and reasoned way ("[…] minimize the amount of data that's used, build technology to enable privacy-protective data uses and give people transparency and control over their data"). This opportunity to control one's own data is a real and ambitious challenge for Web 3. Yet this prerogative still seems difficult to achieve in the metaverse, all the more so as Facebook's own engineers have admitted that they are unable to know what the data will be used for in the future, or even to locate the data collected. It remains to be seen whether the promise will be kept.

2. The application of GDPR to the collection of sensitive data in the metaverse

Of the 1,200 Americans surveyed who were interested in the metaverse, 77% said they were concerned about Facebook holding their data. And rightly so, as the data that can be collected for the metaverse are numerous, and some of them are identifying. While the unprecedented nature of this technology could have created new dangers for users, in particular through data not yet covered by existing texts, this is not the case: the data collected can be qualified as sensitive within the meaning of GDPR. As an illustration, The Financial Times inspected the patents filed by Meta and discovered its intention to collect biometric and behavioral data.

1) Physical biometric data

The metaverse becomes an extension of the self, a home of sorts. The same goes for the avatar, which is built according to my ideal. Thus, to personalize my avatar, it becomes possible to scan my photo in order to replicate the color of my skin, the movement of my hair and even the smallest pore. In the same way, the VR headset is able to capture and photograph the player's pupil, identifying them personally.

This is the problem with physical biometric data, which are unique and immutable. Our eyes are permanently linked to our person and are part of our identity. For this reason, the manipulation of these data directly compromises us. Are we ready, then, to hand over our identity and our body to private actors whose security measures and ambitions are sometimes unclear, and whose collection threatens to constitute the beginnings of a vast profiling?

In this respect, the existing texts would deserve some clarification. They advocate the protection of personal data, but should include more explicit guarantees with regard to these "physical" biometric data. These data are already used by both private and public actors: facial recognition to unlock smartphones, national digital identity cards that can use fingerprints, etc. These applications and their use are rightly the subject of debate, the outcome of which represents a real challenge for society, particularly with regard to the development of AI, which is capable of capturing and therefore authenticating any natural person on the basis of these data. This fear of the manipulation of identifying data will continue to grow in a digital world in constant innovation.

2) Behavioral and physiological biometric data

In the same context, headsets and controllers also capture all our movements. It becomes possible to analyze how a human being functions: a gaze that lingers on a product, an increase in heart rate picked up by sensors, a smile, a frown, etc.

Here again, behaviors make it possible to identify or even authenticate a person, in a more subtle way. This presents risks identified by GDPR. In this sense, none of these sensitive data may be collected, in any way whatsoever, unless justified by a public interest, explicit consent or one of the other grounds listed in Article 9.2 of GDPR. For Noëlle Martin, we are very close to transhumanism: "The goal is to create 3D replicas of people, places and things, so hyper-realistic and tactile that it is impossible to distinguish them from what is real, and then to serve as an intermediary for a whole series of services, etc.".
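Purely as an illustration of how a designer might encode this prohibition, the sketch below blocks any processing of special-category data unless an Article 9.2 ground has been recorded; the category and ground labels are simplified assumptions, not wording taken from the regulation.

```python
# Special categories of data (Art. 9.1 GDPR) that a VR headset might produce, and
# the Art. 9.2 grounds that can lift the prohibition (labels are simplified).
SPECIAL_CATEGORIES = {"biometric_identification", "health", "emotional_state"}
ARTICLE_9_2_GROUNDS = {"explicit_consent", "substantial_public_interest"}


def may_process(category, ground=None):
    """Processing of special-category data is forbidden unless an exception applies."""
    if category not in SPECIAL_CATEGORIES:
        return True                     # ordinary personal data: other rules apply
    return ground in ARTICLE_9_2_GROUNDS


print(may_process("avatar_position"))                        # True
print(may_process("emotional_state"))                        # False: prohibited
print(may_process("emotional_state", "explicit_consent"))    # True: Art. 9.2(a)
```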

These data can be likened to sensitive data, such as health data or genetic and biometric data, but they are not subject to specific regulation. A difference of interpretation could therefore allow them to be qualified as ordinary personal data, opening the way to excessive processing. Yet the quality and quantity of information that behavioral data can provide remain problematic and threatening as soon as they are analyzed and interpreted by a third party. As the metaverse develops, legislators will have to consider whether specific guarantees should be added for these data.

3. Regulating the strategies used in the metaverse: applying the regulations related to GDPR

The collection of these biometric data makes sense when we talk about marketing. In this case, the aim is to capture the slightest particularity of a person in order to get closer to their tastes and then steer them towards a purchase.

1) Location-based marketing

The CNIL's Digital Innovation Laboratory (LINC), a think tank on emerging digital uses, writes that "data is not necessarily desired for its own sake, but because it is the best (or least bad) transcription of players' activities and behaviors (i.e., where they go, how they move, whether they are attentive/reactive, etc.)". In the end, location-based marketing will use this information to attract people to a particular place at a particular time. The typical example is the one already mentioned: placing a product in the metaverse to entice people to buy it in real life.

GDPR regulates marketing practices in general. But it is worth considering here the ePrivacy Directive, transposed in particular by Article L34-1 of the French Code of Post and Electronic Communications, which confers special protection on privacy in the electronic communications sector. Under these texts, data relating to communication traffic (routing, protocol, etc.) are confidential unless the data subject consents to their processing. The user's terminal is also protected, because it can be a vector of invasion of privacy. Could such protection be transposed into the metaverse? Would it be possible to envisage a "confidential" sphere around the individual, protecting them from commercial interests and from any data collection of this kind? The scope of application of this directive remains open. In any case, such a limit has already been envisaged in the form of a security "perimeter" around the avatar in order to protect it from virtual aggression.
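As a thought experiment on this "confidential sphere", the following sketch (the radius, coordinate format and function names are hypothetical) simply drops any behavioral sample captured inside a protective perimeter around the avatar instead of storing it.

```python
import math

PRIVACY_RADIUS = 2.0  # meters in world coordinates; purely illustrative


def within_private_sphere(observer_pos, subject_pos, radius=PRIVACY_RADIUS):
    """True when the observer stands inside the subject's confidential perimeter."""
    return math.dist(observer_pos, subject_pos) < radius


def capture_behavioral_sample(observer_pos, subject_pos, sample):
    """Drop samples captured inside the protected sphere instead of storing them."""
    if within_private_sphere(observer_pos, subject_pos):
        return None        # no collection inside the perimeter
    return sample          # outside the sphere, ordinary (lawful) processing applies


print(capture_behavioral_sample((0.0, 0.0, 0.0), (1.0, 1.0, 0.0), {"gaze": "product_A"}))  # None
print(capture_behavioral_sample((0.0, 0.0, 0.0), (5.0, 0.0, 0.0), {"gaze": "product_A"}))  # stored
```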

In addition, location data are protected by the same directive. As a matter of principle, no processing of these data may be carried out unless they have been anonymized or the consent of the data subject has been obtained. We could just as well imagine a metaverse in our glasses, in the same way as Snapchat's Spectacles, showing us images of romantic architecture or a Google Maps overlay directly indicating the street to take. Here again, the regulations must apply. Data indicating geographical position will have to be kept for a limited period of time (CJEU, 21 December 2016, joined cases Tele2 Sverige AB (C-203/15) and Watson and Others (C-698/15)), the data subject will have to be informed and given the opportunity to withdraw consent or temporarily object to such processing, and access to these data will have to be restricted to a limited number of persons. It also seems possible, however, to exploit emotions for marketing purposes.
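A minimal sketch of these safeguards for location data, assuming hypothetical record fields and an arbitrary retention period, could look like this: positions are only retained if they are anonymized or covered by a consent that can be withdrawn at any time, and they are purged once the retention period has elapsed.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # illustrative period, not a figure taken from the case law


def purge_location_data(records, consents, now=None):
    """Keep only the location records that remain lawful to hold.

    Each record is a dict such as {"user": "...", "position": (lat, lon),
    "collected_at": tz-aware datetime, "anonymized": bool}; `consents` maps user
    ids to True/False and can be updated whenever someone withdraws consent.
    """
    now = now or datetime.now(timezone.utc)
    kept = []
    for record in records:
        expired = now - record["collected_at"] > RETENTION
        lawful = record["anonymized"] or consents.get(record["user"], False)
        if lawful and not expired:
            kept.append(record)
    return kept
```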

2) Emotional marketing

The metaverse has enormous potential. Because it immerses us, it captures our finest senses: sight, hearing… to the point where one can imagine it acting on our brain plasticity. In this respect, virtual reality has already been used to divert patients' attention so that they can manage pain more easily, especially during an operation.

In the same vein, once headsets are equipped with sensors, it becomes possible to measure stress at work or emotional peaks such as anger, depression or anxiety, as already happens in China. Artificial intelligence makes it possible to detect such emotions, notably by measuring brain rhythms that indicate the brain's level of activity. AI therefore looks set to be a key player in the metaverse: it will be able to interpret data in real time to produce exploitable results, notably for marketing purposes. However, these AI applications are problematic, and it is for this reason that they are the subject of forthcoming regulation at European level.

The regulation of artificial intelligence is still in its infancy. However, the European Union is working on a draft regulation: the Regulation of the European Parliament and of the Council laying down harmonized rules on artificial intelligence and amending certain Union legislative acts, also known as the "Artificial Intelligence Act". It prohibits the development of AI systems that employ techniques that manipulate human behavior, result in physical or psychological harm, or exploit the vulnerability of a group of people. Since the metaverse is limited only by the imagination of its designers, they will nevertheless have to take such prohibitions into account. It remains to be seen how much weight they will give to regulatory imperatives perceived as restrictive. This is where the inexhaustible question arises of reconciling the economy with the law, perceived as a brake on innovation and design. For the moment, the legal framework is progressively imposing itself in order to combat such abuses in the use of personal data for purely economic ends.

In addition, the draft subjects the development of AI systems for emotion recognition and biometric categorization, as a matter of principle, to specific requirements, in addition to the systems qualified by the Commission as "high risk": transparency, accuracy, supervision by a natural person, and information about the machine's intervention are required. However, these obligations remain merely indicative until the adoption of the regulation, which is expected within the next four years. In the same way, our assumptions about this intrusive metaverse remain conditioned by the choices made by its designers.

In conclusion, the metaverse seems capable of becoming a new El Dorado, a space for escape, far from the economic and geopolitical problems that threaten our world and lock us in a dull reality. However, the playful aspect of this technology could turn out to be insidious, as private actors such as Facebook, now Meta, would appropriate personal data that are unique to us: our biometric data. This is yet another category of data to add to the list of those already held by these giants. This technology thus gives them the opportunity to accumulate even more data about us and to consolidate their hegemonic position. By way of example, Oculus, a pioneering virtual reality headset company, was bought by Facebook as early as 2014. It is therefore easy to understand that Big Tech companies are becoming the main actors of this hypothetical transition.

Against them stands regulation. As Margrethe VESTAGER, vice-president of the European Commission, wished, the European Union has initiated a process of regulating digital actors that implicitly targets Big Tech. In this respect, the legislation in force, GDPR and the ePrivacy Directive, seems to be an effective bulwark against potentially excessive data collection. Moreover, content published in the metaverse will fall under the yoke of the moderation driven by the recent Digital Services Act (DSA). The institutions are thus clearly trying to draft timeless texts, aware that legislation constantly lags behind the digital world. It is a gamble that seems to be paying off with regard to the metaverse, which finds itself constrained by these texts. At the same time, other metaverses still in gestation are being designed around another emerging technology: it is becoming possible to include artificial intelligence modules, which will be subject to the draft regulation on artificial intelligence. This attests to an issue that is still recent and that the law, whose contours remain abstract, is gradually coming to grips with.

The existing legislative arsenal allows us to navigate the metaverse with some serenity. In practice, the legislation in force is sufficiently broad to provide a general framework for the metaverse. However, deviations are always possible, because where the law may admit limits, the human mind knows none. Indeed, the application of these texts by design remains conditional on the will of the original designer. To mitigate this margin of interpretation, it might seem relevant to adopt specific legislation for this type of flourishing and potentially dangerous technology.

Faced with the development of the metaverse, it will be a matter of observing whether the courts choose to rely on the existing legislative arsenal or to create, together with the legislator, a sui generis body of law suited to its originality. Faced with all these issues, one question remains: is the metaverse a dream of escape, or confinement in a prison?

– Eléna LOPEZ