Digital Extended Reality

In short

Digital Extended Reality technologies combine advanced computing systems (hardware and software) that can change how people connect with each other and their surroundings, and that can influence or manipulate human actions through interactions with virtual environments.

Key ethical concerns surround cybersecurity and how these technologies may affect human behavioural and social dynamics. For example, technology that mimics human responses may prompt users to react to it as though it were actually human, while developments in Extended Reality may expose users to undue influence through ‘nudging’ techniques.

More about Digital Extended Reality

Digital Extended Reality could change how people connect with each other and their surroundings in physical and virtual settings.

We include two main technologies in this family: Extended Reality (XR), which relates to virtual and simulated experiences using digital technologies, and Natural Language Processing (NLP), which allows computer systems to process and analyse vast quantities of human natural language information (e.g., voice, text, images) and generate text in natural or artificial languages. These two technologies can stand alone or be combined in certain devices. You can explore specific examples that fall into these categories below.

Potential ethical repercussions of such technologies include cognitive and physiological impacts as well as effects on behavioural and social dynamics, such as influencing users’ behaviour or monitoring and supervising people.

  • XR: Virtual Reality

    A virtual reality (VR) environment is completely simulated by digital means for its user. Current VR simulation focuses on visual aspects, but other senses are also being incorporated into these experiences.
  • XR: Augmented Reality

    Augmented Reality (AR) combines elements of real and virtual environments instead of trying to achieve complete immersion in virtual reality. Users can see the real world, with virtual objects superimposed upon or combined with the real environment.
  • XR: Avatars and the metaverse

    A metaverse emphasises the social element of XR: multiple users can interact in one virtual or augmented environment. Avatars usually represent real people (or at least an animated version of them) and can be customised to some extent according to users’ preferences.
  • XR: Digital Twins

    These are digital replicas of physical objects that can possess dynamic features, such as data synchronisation between the physical twin and the digital twin, used to monitor, simulate and optimise the physical object.
  • NLP: Text generation and analysis

    Learning procedures applied to large datasets of original text have allowed large language models (LLMs) to generate text at a level close to that of humans. In addition, techniques can analyse language content for the sentiments or opinions it expresses, revealing how the general public or a specific group feels about issues, events or topics (see the sketch after this list).
  • NLP: Chatbots

    Conversational agents, or chatbots, use NLP to interact with users, orally or in writing. They already provide a wide array of services, for example in customer support or as voice assistants.
  • NLP: Affective Computing

    Affective computing refers to systems that can recognise, interpret or respond to human emotions. Through subtle psychological strategies in dialogue, such as prioritising certain topics or steering the conversation in a particular direction, a chatbot can influence what a person thinks or believes. Ultimately, this can prompt users to change their behaviour without forcing them, a practice known as nudging.

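    As an illustration of the sentiment analysis mentioned under ‘NLP: Text generation and analysis’ above, the minimal Python sketch below classifies a few opinions with a pre-trained model. The choice of the open-source Hugging Face transformers library and the example texts are our own illustrative assumptions, not part of the TechEthos analysis.

```python
# Minimal sketch of automated sentiment analysis, assuming the Hugging Face
# `transformers` library (with a backend such as PyTorch) is installed.
from transformers import pipeline

# Hypothetical opinions about an XR product; not real survey or media data.
opinions = [
    "The new VR headset made remote meetings feel surprisingly natural.",
    "I felt dizzy and uncomfortable after ten minutes in the simulation.",
]

# The default sentiment-analysis pipeline downloads a pre-trained model on
# first use and returns a label (e.g. POSITIVE/NEGATIVE) with a confidence score.
classifier = pipeline("sentiment-analysis")

for text, result in zip(opinions, classifier(opinions)):
    print(f"{result['label']:>8} ({result['score']:.2f})  {text}")
```

    Aggregating such labels over many posts or articles is what allows analysts to gauge how a group feels about an issue, event or topic.
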
    Ethical analysis

    ‘Ethics by design’ is at the core of TechEthos. We needed to identify the broad array of values and principles at stake in Digital Extended Reality so that they can be taken into account from the very beginning of the research and development process. Based on our ethical analysis, we will propose how to enhance or adjust existing ethical codes, guidelines or frameworks.

    XR and NLP are treated as self-sufficient, standalone technologies in the analysis below, but our in-depth reports also look at the ethical issues raised by their combination.

    Core ethical dilemmas in XR

    Is there a preference for material reality?

    The emergence of virtual reality prompts the question of whether virtual experiences mediated via XR are equivalent to experiences gained in the real world: do they evoke similar emotions, behaviours or judgements?

    Mode of being of virtual objects

    Digital objects are the types of things we experience in the digital world, like “an image” or “a video”. However, it is not clear how they can be individual objects if all they consist of is digital data. The philosophy of digital objecthood features several positions. A moderate one holds that digital objects exist insofar as they are experienced and conceptualised by a digital mind. A more radical position claims that virtual objects and environments are of the same nature as material objects and environments.

    Value of virtual objects

    If a distinction between virtual objects and material objects is maintained, the consequences of actions in material reality certainly do not equal the consequences of the same actions in virtual reality. For example, driving fast in virtual reality does not carry the same risk as driving fast on a material road.

    Nevertheless, some scholars have argued that virtual objects do retain some ethical value, not because the consequences are equivalent, but because values or behaviour patterns formed in XR can influence behaviour in the real world, for example by encouraging speeding on a real road in one’s actual car, with negative consequences.

    More core ethical dilemmas are tackled in the ‘Analysis of Ethical Issues’ report.

    Read the report

    Applications and use cases in XR

    Training: knowledge transfer and qualia

    One of the most established applications of XR is in training skills. XR training applications tend to have the greatest impact where real-world training conditions are high-risk or costly, such as training for pilots and surgeons. Are skills acquired via virtual experiences equivalent or transferable to material conditions?

    Remote work: long-term effects on workers and the job market

    XR work environments are available on the market and allow coworkers to host meetings and interact at a distance, sometimes using avatars. The ethical challenges associated with their use include the potential overuse of this always-accessible mode of work, impacts on local job markets, and the collection of workers’ data, among others.

    More applications and use cases are tackled in the ‘Analysis of Ethical Issues’ report.

    Read the report

    Core ethical dilemmas in NLP

    NLP systems lack human reasoning

    Today most chatbots are deterministic models without machine learning. They take the user down a decision tree in a predetermined way. However, the most advanced NLP techniques, capable of varied conversation on many topics with nearly human-level outputs, rely on statistical linguistic analysis. They do not involve any understanding of meaning or semantics. Void of intention and disconnected from action and responsibility, they cannot be considered on a par with language produced by human speakers. However, humans might take the chatbot’s language to be meaningful and react to its semantic content.
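
    To make the contrast concrete, the following sketch shows what a deterministic, decision-tree chatbot amounts to. The dialogue tree and its wording are hypothetical examples of ours; a statistical, LLM-based system would replace this fixed structure with learned text generation.

```python
# Minimal sketch of a deterministic, decision-tree chatbot: the user is taken
# down a predetermined tree, with no learning and no understanding of meaning.
DIALOGUE_TREE = {
    "start": {
        "prompt": "Do you need help with (1) billing or (2) a technical issue?",
        "options": {"1": "billing", "2": "technical"},
    },
    "billing": {
        "prompt": "Is your question about (1) an invoice or (2) a refund?",
        "options": {"1": "end", "2": "end"},
    },
    "technical": {
        "prompt": "Have you tried restarting the device? (1) yes (2) no",
        "options": {"1": "end", "2": "end"},
    },
    "end": {"prompt": "Thank you, an agent will follow up.", "options": {}},
}


def run_chatbot() -> None:
    """Walk the user through the predetermined tree of prompts."""
    node = "start"
    while DIALOGUE_TREE[node]["options"]:
        answer = input(DIALOGUE_TREE[node]["prompt"] + " ").strip()
        # Unrecognised input simply repeats the current node: the bot matches
        # predefined keys and cannot interpret the meaning of free text.
        node = DIALOGUE_TREE[node]["options"].get(answer, node)
    print(DIALOGUE_TREE[node]["prompt"])


if __name__ == "__main__":
    run_chatbot()
```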

    Artificial emotions influence human users

    Some applications use conversational agents to influence their users through the architecture or language of the dialogue. Manipulation by a conversational agent can be direct (providing inaccurate or skewed information) or indirect, using ‘nudging’ strategies.

    More core ethical dilemmas are tackled in the ‘Analysis of Ethical Issues’ report.

    Read the report

    Applications and use cases in NLP

    Human resources: gender bias, data protection and labour market

    Chatbots are used by human resources managers for recruitment as well as for career follow-up and employee training. The training data used have been found to be biased, especially against marginalised populations. This can lead to different types of harm, in terms of how these populations are represented and what resources or opportunities, such as jobs, are allocated to them.

    Creativity: authenticity

    NLP can be used to generate seemingly creative or poetic text that has no human creative input or that relies on prior creative work. If such applications were used at scale, they might reduce the profitability of creative or innovative work.

    More applications and use cases are tackled in the ‘Analysis of Ethical Issues’ report.

    Read the report

    Legal analysis

    While no international or EU law directly addresses or explicitly mentions Digital Extended Reality, many aspects are subject to international and EU law. Below, you can explore the legal frameworks and issues relevant to this technology family and read about the next steps in our legal analysis.

    Next steps in our legal analysis

    So far, TechEthos has analysed the obligations of States (under international law) and/or Member States (under EU law) and the rights of private individuals under those laws.

    The obligations of private individuals and entities will be the focus of a report on the legal frameworks at the national level (forthcoming Winter 2022).

    These two reports, and the gaps and challenges in existing legal frameworks that they identify, will form the basis for the TechEthos project’s legal and policy recommendations in the coming months (forthcoming Spring 2023).

    Societal analysis

    This type of analysis is helping us bring on board the concerns of different groups of actors and look at technologies from different perspectives.

    Expert perspectives

    TechEthos asked researchers and innovators, as well as experts in technology, ethics, law and economics, to consider several future scenarios for our selected technologies and to provide feedback on attitudes, proposals and solutions.

    Read the policy note
    Societal perspectives

    A series of events such as science cafés and workshops with local research and technology players will ask the public about the attitudes, values and concerns triggered by future scenarios for our selected technologies.

    Events kicked off in summer 2022; discover the ‘science café’ format in our news article.

    Read the article
    Media discourse

    Media discourse on technologies both reflects and shapes public perceptions. As such, it is a powerful indicator of societal awareness and acceptance of these technologies. TechEthos carried out an analysis of the news stories published in 2020 and 2021 on our three technology families in 13 EU and non-EU countries (Austria, Czech Republic, France, Germany, Ireland, Italy, Netherlands, Romania, Serbia, Spain, Sweden, UK, and USA). This used state-of-the-art computational tools to collect, clean and analyse the data.

    A noteworthy finding related to Digital Extended Reality is that this family of technologies is primarily discussed with reference to virtual reality. Indeed, the term is mentioned in almost 42% of the stories collected for this family of technologies. By contrast, natural language processing (NLP) is rarely mentioned. This suggests that the general public may be more aware of virtual reality than of NLP techniques. The finding is also of interest to TechEthos public engagement activities, stressing the need for greater effort to raise public awareness of NLP. Keywords related to Ethical, Legal and Social Issues (ELSI) were mentioned in 35% of the news stories collected for Digital Extended Reality, with the terms ‘society’, ‘security’ and ‘privacy’ being the most frequently mentioned ELSI topics.
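
    The kind of keyword-based measure reported above can be illustrated with a short sketch. The code below is not the actual TechEthos pipeline, and the stories and keyword lists are hypothetical placeholders.

```python
# Illustrative sketch: share of news stories mentioning at least one keyword
# from each topic list. Corpus and keyword lists are invented examples.
stories = [
    "Virtual reality headsets dominated the consumer electronics fair.",
    "Privacy advocates warn about data collected on immersive platforms.",
    "A startup uses natural language processing to screen job applicants.",
]

topics = {
    "virtual reality": ["virtual reality", "vr headset"],
    "NLP": ["natural language processing", "nlp"],
    "ELSI terms": ["society", "security", "privacy"],
}


def mention_share(corpus: list[str], terms: list[str]) -> float:
    """Fraction of stories that mention at least one of the given terms."""
    hits = sum(any(term in story.lower() for term in terms) for story in corpus)
    return hits / len(corpus)


for topic, terms in topics.items():
    print(f"{topic}: mentioned in {mention_share(stories, terms):.0%} of stories")
```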

    Read the report

    TechEthos has received funding from the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement no. 101006249. This website and its contents reflect only their authors' view. The Research Executive Agency and the European Commission are not responsible for any use that may be made of the information contained herein.