Recent moves to promote the metaverse environment have raised new concerns about the lack of accountability and poor oversight of the gathering and use of personal data by platform providers. While it could be argued that the metaverse is an aspiration rather than an actuality, Facebook’s decision in 2021 to change its name to ‘Meta Inc.’ has generated a lot of interest and discussion (Meta Inc., 2021).
The term ‘Metaverse’ is thought to have originated in Neal Stephenson’s novel ‘Snow Crash’, in which he describes a dystopian world where it becomes difficult to distinguish between the real and the virtual world (Stephenson, 1992). This idea has been carried forward to describe closer integration between the digital world and the real world through the development of interactive, 3D digital environments. It could be argued that there is not one metaverse, but a variety of metaverses operated by different platform providers.
At Edinburgh Napier University we recently conducted a national survey of metaverse usage to explore differences in attitudes towards it among different demographic groups. One unexpected result was the lack of concern about safety and privacy among users. There was no significant difference between age groups, nor between men and women. The population of metaverse users is small relative to the population of social media users (Szaniawska-Schiavo, 2022), which means that the body of experience and the number of reported instances of abuse are relatively small. Despite the limited reporting of abuse in the metaverse, concerns arise in several areas, which I will consider in turn (Sebastian, 2023).
Platform providers in the metaverse have the potential to gather very large volumes of detailed personal data. At a basic level this includes records of the sites visited. A richer level of data includes interactions with other users and with the environments offered by the platforms (Uberti, 2022).
Then there are the headsets, gaming consoles and haptic devices used to interact with the metaverse environment. They can gather information on eye movements and body posture, as well as behavioural characteristics such as gait, and biometric data such as iris patterns, retinal images and facial features. Devices linked by Bluetooth or insecure Wi-Fi connections could also be hacked (Di Pietro & Cresci, 2021; Wang et al., 2022).
Privacy may be invaded by other users. For instance, it is possible to eavesdrop on conversations in the metaverse by posing as an inanimate object such as a fire hydrant.
In another scenario, an individual visiting a virtual storefront (Shein, 2022) may be giving away a lot of personal data. For instance, the retailer could detect any branded items being worn and may have the ability to edit or suppress the rendering of competitors’ products while the individual is in store. This raises the question of personal autonomy and whether changing someone’s appearance is an abuse of their identity and personal integrity. What would happen if the same principle was applied to exclude individuals on the basis of perceived gender, race or age? There might, for instance, be strong arguments for limiting access to content on the basis of whether it is age appropriate. These measures would depend on being able to verify the ‘true’ identity of users.
There is ongoing concern about accountability and the association of a presence in the metaverse with a verified identity. This is open to exploitation by authoritarian regimes or by large corporations (Zuboff, 2019). Platform providers can be put under pressure to reveal the identities of protesters. Another perspective is that giving up some privacy may be the price to pay for prevention of terrorist attacks. Commentators such as Solove (2011) have made a strong case for the preservation of privacy, suggesting that mass surveillance is not a particularly effective prevention tool. This is primarily because of the large volume of resulting data that has to be sifted through. There is also the argument that bypassing privacy protection undermines the very values upon which a free and open society is based.
When we talk about identity, which identity are we talking about? Your avatar may have a distinct presence that is unlike your day-to-day appearance. That is almost certainly the case in a gaming environment, where an avatar is likely to have additional powers. Is it an idealisation of reality or an alternative expression of your personality? Is this private? Should you be required to associate an avatar with your physical identity? What if someone interferes with your identity? They might steal your appearance or hijack your avatar. Impersonation could be used to embarrass or disrupt your activity. The resulting harm to the individual could include loss of autonomy, psychological harm from bullying or loss of property, money and digital assets.
Anonymity can give licence to abusive behaviour. Trolling and extremist views often lurk behind false or anonymous identities. However, social media sites such as Facebook allow the use of authentic names, which is particularly important for individuals who may be transitioning or who do not wish to use cis-gendered labels. With the widespread use of avatars, it is not clear that the same principles would apply on many metaverse platforms. Trust in identity is essential for purchasing products online. Banks need to verify purchasers’ identities to release funds to retailers.
Ownership in the metaverse will become more lucrative as enterprises seek to monetise interactions (Zhou et al., 2018). Individuals can pay a lot to own ‘real estate’ or to enhance the appearance and capabilities of their avatars. NFTs (non-fungible tokens) are a popular way of owning prestige items. What protections are in place against appropriation of these assets? What remedies do individuals have if someone steals virtual properties? Do existing property laws cover this, and how are these laws enforced across national boundaries?
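To make concrete what ‘owning’ an NFT amounts to in practice, the sketch below is a minimal, hypothetical example (the node URL, contract address and token ID are placeholders invented for illustration). It uses the web3.py library to ask a public blockchain who currently owns a particular ERC-721 token: ownership is simply an entry on a shared ledger, and it is that entry which any protection or remedy would ultimately have to act upon.

    # Hypothetical illustration: reading the recorded owner of an ERC-721 token.
    # The node URL, contract address and token ID are placeholders, not real assets.
    from web3 import Web3

    ERC721_OWNER_OF_ABI = [{
        "name": "ownerOf",
        "type": "function",
        "stateMutability": "view",
        "inputs": [{"name": "tokenId", "type": "uint256"}],
        "outputs": [{"name": "owner", "type": "address"}],
    }]

    w3 = Web3(Web3.HTTPProvider("https://example-node.invalid"))  # placeholder node
    nft = w3.eth.contract(
        address=Web3.to_checksum_address("0x0000000000000000000000000000000000000000"),
        abi=ERC721_OWNER_OF_ABI,
    )

    # Anyone can read this: 'ownership' is a public ledger record,
    # not possession of the underlying image or asset itself.
    print(nft.functions.ownerOf(1234).call())

If that ledger record is transferred away, whether legitimately or through a compromised wallet, the previous holder has no technical means of recovery, which is why the legal questions above matter.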
Intellectual property is one aspect of ownership. How do I protect a design, a skin or a capability? Although there are international treaties on copyright, there are still many differences in approach to software, for instance. To some extent this will come under the control of the platform providers, and it would be interesting to investigate the protections afforded to asset owners. Facebook’s and other social media providers’ terms of use mean that any images posted on their platforms can be freely used and exploited by the platform, and the provider is under no obligation to share any profits or royalties resulting from that use (Who Owns Your Online Photos – Which Computing Helpdesk, n.d.). What about the terms and conditions of other platform providers? What assets do they ‘own’ and what rights do users have?
There are many well-developed markets for the sale of personal data exfiltrated following data breaches (Burkey, 2022). For the most part the exploitation of that data is unsophisticated and thwarted by multi-factor authentication (MFA). The richness of personal data in the metaverse could lead to more sophisticated fraud based on detailed profiles and identity theft. Passing off as an individual could allow access to digital assets that could be traded for cryptocurrency, which might itself be part of a money-laundering operation.
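As an illustration of why MFA blunts the crude exploitation of breached credentials, the sketch below is a minimal, hypothetical example using the pyotp library: the service accepts a login only when a time-based one-time code, generated from a secret held on the user’s own device, accompanies the password.

    # Minimal sketch of a time-based one-time password (TOTP) check with the pyotp library.
    # The shared secret is enrolled once and held on the user's device; a breached password
    # alone cannot reproduce the current six-digit code.
    import pyotp

    secret = pyotp.random_base32()        # stored by the service and the user's authenticator app
    totp = pyotp.TOTP(secret)

    code_from_device = totp.now()         # what a legitimate user's authenticator app would display
    print(totp.verify(code_from_device))  # True: password plus a valid, current code

    print(totp.verify("000000"))          # (almost certainly) False: stolen credentials alone fail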
Pyramid selling schemes translate very effectively into the metaverse – creating desirable artefacts or NFTs for sale on the basis that individuals make money from resale. Gaming platforms encourage skilled gamers (often from impoverished backgrounds) to purchase enhanced features in order to sell their expertise to other gamers. The tragedy is that individuals either borrow money or invest their own sparse resources in a scheme that is very volatile and where they could lose all their money. Trading on the aspirations of those who do not have very much becomes a moral issue.
Anonymity makes abuse all too easy, because abusers believe that they can behave in a way that they would be reluctant to if their identity were known. Sadly, abuse based on gender is all too common on social media (Levey, 2018). Cyberbullying is another manifestation of online abuse. Sometimes it can be difficult to pin down. Intention plays a part, so that highly directed, personalised attacks are often unambiguously abusive. However, a general disparaging remark can also be harmful. The consequences of abuse vary enormously from mild irritation to self-harm or even suicide. The state of the individual and their situation also determine the degree of harm.
With increasingly sophisticated means to enhance the experience of the metaverse, actions in the metaverse can be translated into physical sensations (such as the weight and feel of an object). Safeguards could be bypassed to cause pain or even injury to an individual.
We can start to think about some of these ethical issues in terms of regulation. Existing regulation of social media will apply to the metaverse. However, it is likely that additional measures will be needed to protect the identity and autonomy of individuals in the metaverse. Enforcement of regulation and compliance then become governance issues.
I have tried to outline some of the ethical challenges faced in the metaverse. There is a considerable literature attempting to map the potential problems, and we are now at a stage where user behaviour and attitudes need to be investigated (Di Pietro & Cresci, 2021; Fernandez & Hui, 2022; Sebastian, 2023; Wang et al., 2022; Zhao et al., 2021). A better understanding of the ethical issues faced by users and service providers places us in a better position to start thinking about regulation.