Review

Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing

Instituto de Investigación e Innovación en Bioingeniería, Universitat Politècnica de València, 46022 València, Spain
* Author to whom correspondence should be addressed.
Sensors 2020, 20(18), 5163; https://doi.org/10.3390/s20185163
Submission received: 23 July 2020 / Revised: 7 September 2020 / Accepted: 8 September 2020 / Published: 10 September 2020

Abstract
Emotions play a critical role in our daily lives, so the understanding and recognition of emotional responses are crucial for human behaviour research. Affective computing research has mostly used non-immersive two-dimensional (2D) images or videos to elicit emotional states. However, immersive virtual reality, which allows researchers to simulate environments in controlled laboratory conditions with high levels of sense of presence and interactivity, is becoming more popular in emotion research. Moreover, its synergy with implicit measurements and machine-learning techniques has the potential for transversal impact across many research areas, opening new opportunities for the scientific community. This paper presents a systematic review of the emotion recognition research undertaken with physiological and behavioural measures using head-mounted displays as elicitation devices. The results highlight the evolution of the field, give a clear perspective using aggregated analysis, reveal the current open issues and provide guidelines for future research.

1. Introduction

Emotions play an essential role in rational decision-making, perception, learning and a variety of other functions that affect both human physiological and psychological status [1]. Therefore, understanding and recognising emotions are very important aspects of human behaviour research. To study human emotions, affective states need to be evoked in laboratory environments, using elicitation methods such as images, audio, videos and, recently, virtual reality (VR). VR has experienced an increase in popularity in recent years in scientific and commercial contexts [2]. Its general applications include gaming, training, education, health and marketing. This increase is based on the development of a new generation of low-cost headsets which has democratised global purchases of head-mounted displays (HMDs) [3]. Nonetheless, VR has been used in research since the 1990s [4]. The scientific interest in VR is due to the fact that it provides simulated experiences that create the sensation of being in the real world [5]. In particular, environmental simulations are representations of physical environments that allow researchers to analyse reactions to common concepts [6]. They are especially important when what they depict cannot be physically represented. VR makes it possible to study these scenarios under controlled laboratory conditions [7]. Moreover, VR allows the time- and cost-effective isolation and modification of variables, unfeasible in real space [8].

1.1. Virtual Reality Set-Ups

The set-ups that display VR simulations have been progressively integrated into studies as the relevant technologies have evolved. They consist of a combination of three objective features: formats, display devices and user interfaces.
The format describes the structure of the information displayed. The most common are two-dimensional (2D) multimedia and three-dimensional (3D) environments, and the main difference between them is their level of interactivity [9]. 2D multimedia, including 360° panoramic images and videos, provide non-interactive visual representations. The validity of this format has been extensively explored [10]. Moreover, the latest advances in computer-generated images simulate light, texture and atmospheric conditions to such a degree of photorealism that it is possible to produce a virtual image that is indistinguishable, to the naked eye, from a photograph of a real-world scene [11]. This format allows scientists to test static computer-generated environments, with many variations, cheaply and quickly in a laboratory. On the other hand, 3D environments generate interactive representations which allow changes in the user’s point of view, navigation and even interaction with objects and people [12]. Developing realistic 3D environments is more time-consuming than developing 360° computer-generated photographs, and their level of realism is limited by the power of the hardware. However, the processing power of GPUs (graphics processing units) is increasing every year, which will enhance the performance of 3D environments. Moreover, the interaction capacity of 3D environments, which facilitates the simulation of real-world tasks, is a key aspect in the application of virtual reality [2].
The display devices are the technological equipment used to visualise the formats. They are classified according to the level of immersion they provide, that is, the sensorimotor contingencies that they support. These are related to the actions that experimental subjects carry out in the perception process, for example, when they bend down and shift the position of their heads, and their gaze direction, to see underneath an object. Therefore, the sensorimotor contingencies supported by a system define a set of valid actions (e.g., turning the head, bending forward) that carry meaning in terms of perception within the virtual environment [13]. Since immersion is objective, one system is more immersive than another if it is superior in at least one characteristic while others remain equal. There are three categories of immersion system: non-immersive, semi-immersive and immersive [2]. Non-immersive systems are simpler devices which use a single screen, such as a desktop PC, to display environments [14]. Semi-immersive systems, such as the cave automatic virtual environment (CAVE), or the powerwall screen, use large projections to display environments on walls, enveloping the viewer [15,16]. These displays typically provide a stereo image of an environment, using a perspective projection linked to the position of the observer’s head. Immersive devices, such as HMDs, are fully-immersive systems that isolate the user from external world stimuli [17]. These provide a complete simulated experience, including a stereoscopic view, which responds to the user’s head movements. During the last two decades, VR has usually been displayed through desktop PCs or semi-immersive systems, such as CAVEs and powerwalls [18]. However, improvements in the performance and availability of the new generation of HMDs are boosting their use in research [19].
The user interfaces, exclusive to 3D environments that support this level of interaction, are the functional connections between the user and the VR environment which allow him or her to interact with objects and navigate [20]. Regarding interaction with objects, manipulation tasks include: selection, that is, acquiring or identifying an object or subset of objects; positioning, that is, changing an object’s 3D position; and rotation, that is, changing an object’s 3D orientation. In terms of the navigation metaphors in 3D environments, virtual locomotion has been thoroughly analysed [21], and can be classified as physical or artificial. Regarding the physical, there are room-scale-based metaphors, such as real-walking, which allow the user to walk freely inside a limited physical space. These are normally used with HMDs, and position and orientation are determined by the position of the user’s head. They are the most naturalistic of the metaphors, but are highly limited by the tracked physical area [22]. In addition, there are motion-based metaphors, such as walking-in-place or redirected walking. Walking-in-place is a pseudo-naturalistic metaphor in which the user performs locomotion-like movements to navigate, for example, by moving his/her hands as if (s)he were walking, or by performing footstep-like movements, while remaining stationary [23]. Redirected walking is a technique in which the user perceives (s)he is walking freely but is, in fact, being unknowingly steered by the virtual display: this allows navigation in an environment larger than the actual tracked area [24]. Regarding the artificial, controller-based metaphors allow users to control their movements directly through joysticks or similar devices, such as keyboards and trackballs [25]. In addition, teleportation-based metaphors allow the user to point where (s)he wants to go and teleport him or her there with an instantaneous “jump” [26], as sketched below. Moreover, recent advancements in the latest generation of HMD devices have increased the performance of navigation metaphors. Point-and-click teleport metaphors have become mainstream technologies implemented in all low-cost devices. However, other techniques have also improved: walking-in-place metaphors have become more user-friendly and robust, room-scale-based metaphors now have larger coverage areas, provided by low-cost tracking methods, and controller-based locomotion now addresses cybersickness through effective, dynamic field-of-view adjustments [27].
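To make the teleportation metaphor concrete, the following is a minimal, engine-agnostic sketch of a point-and-click teleport: a ray cast from the controller is intersected with the floor plane and, if the hit point lies within an allowed radius, the player position jumps there instantaneously. All names and parameters (e.g., raycast_to_floor, max_distance) are illustrative assumptions, not the API of any particular VR framework.

```python
import numpy as np

def raycast_to_floor(origin, direction, floor_y=0.0):
    """Intersect a controller ray with the horizontal floor plane y = floor_y.
    Returns the hit point, or None if the ray does not reach the floor."""
    if abs(direction[1]) < 1e-6:  # ray parallel to the floor plane
        return None
    t = (floor_y - origin[1]) / direction[1]
    return origin + t * direction if t > 0 else None

def teleport(player_position, controller_origin, controller_direction, max_distance=8.0):
    """Point-and-click teleport: jump to the pointed-at floor location,
    provided it lies within the allowed teleport radius."""
    hit = raycast_to_floor(np.asarray(controller_origin, dtype=float),
                           np.asarray(controller_direction, dtype=float))
    if hit is None or np.linalg.norm(hit - player_position) > max_distance:
        return player_position  # invalid target: stay put
    return hit                  # the instantaneous "jump"

# Example: pointing slightly downwards from head height (1.6 m)
new_pos = teleport(np.zeros(3), [0.0, 1.6, 0.0], [0.0, -0.5, 1.0])
print(new_pos)  # [0.  0.  3.2]
```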

1.2. Sense of Presence

In addition to the objective features of the set-up, the experience of users in virtual environments can be measured by the concept of presence, understood as the subjective feeling of “being-there” [28]. A high degree of presence creates in the user the sensation of physical presence and the illusion of interacting and reacting as if (s)he was in the real world [29]. In the 2000s, the strong illusion of being in a place, in spite of the sure knowledge that one is not actually there, was characterised as “place illusion” (PI), to avoid any confusion that might be caused by the multiple meanings of the word “presence”. Moreover, just as PI relates to how the world is perceived, and the correlation of movements and concomitant changes in the images that form perceptions, “plausibility illusion” (PsI) relates to what is perceived, in a correlation of external events not directly caused by the participant [13]. PsI is determined by the extent to which a system produces events that directly relate to the participant, and the overall credibility of the scenario being depicted in comparison with viewer expectations, for example, when an experimental participant is provoked into giving a quick, natural and automatic reply to a question posed by an avatar.
Although presence plays a critical role in VR experiences, there is limited understanding of what factors affect presence in virtual environments. However, there is consensus that exteroceptive and interoceptive factors affect presence. It has been shown that exteroceptive factors, such as higher levels of interactivity and immersion, which are directly related to the experimental set-up, provoke increased presence, especially in virtual environments not designed to induce particular emotions [30,31,32]. As to interoceptive factors, which are defined by the content displayed, participants will perceive higher presence if they feel emotionally affected; for example, previous studies have found a strong correlation between arousal and presence [33]. Recent research has also analysed presence in specific contexts and suggested that, for example, in social environments, it is enhanced when the VR elicits genuine cognitive, emotional and behavioural responses, and when participants create their own narratives about events [34]. On the other hand, presence decreases when users experience physical problems, such as cybersickness [35].

1.3. Virtual Reality in Human Behaviour Research

VR is, thus, proposed as a powerful tool to simulate complex, real situations and environments, offering researchers unprecedented opportunities to investigate human behaviour under closely controlled designs in laboratory conditions [33]. The field now comprises a large body of published studies and a strong, interdisciplinary research community [2].
Education and training is one field where VR has been widely applied. Freina and Ott [36] showed that VR can offer great educational advantages. It can solve time-travel problems, for example, students can experience different historical periods. It can address physical inaccessibility, for example, students can explore the solar system in the first person. It can circumvent ethical problems, for example, students can “perform” serious surgery. Surgical training is now one of the most analysed research topics; before the advent of VR, interventional surgery lacked satisfactory training methods other than learning on real patients [37]. Bhagat, Liou and Chang [38] analysed improvements in military training. These authors suggested that cost-effective 3D VR significantly improved subjects’ learning motivation and outcomes and had a positive impact on their live-firing achievement scores. In addition, besides enhancing cost-effectiveness, VR offers a safe training environment, as evidenced by the extensive research into driving and flight simulators [39,40]. Moreover, de-Juan-Ripoll et al. [41] proposed that VR is an invaluable tool for assessing risk-taking profiles and training related skills, due to its transferability to real-world situations.
Several researchers have also demonstrated the effectiveness of VR in therapeutic applications. It offers some distinct advantages over standard therapies, including precise control over the degree of exposure to the therapeutic scenario, the possibility of tailoring scenarios to individual patients’ needs and even the capacity to provide therapies that might otherwise be impossible [42]. For example, studies using VR have analysed improvements in social skills training for persons with mental and behavioural disorders, such as phobias [43], schizophrenia [44] and autism [45]. Lloréns, Noé, Colomer and Alcañiz [46] showed that VR-based telerehabilitation interventions promoted the reacquisition of locomotor skills associated with balance, in the same way as in-clinic interventions (both complemented with conventional therapy programmes). Moreover, VR has been proposed as a key tool for the diagnosis of neurodevelopmental disorders [47].
In addition, VR has been applied transversally to many fields, such as architecture and marketing. In architecture, VR has been used as a framework within which to test the overall validity of proposed plans and architectural designs, generate alternatives and conceptualise learning, instruction and the design process itself [48]. In marketing, it has been applied in the analysis of consumer behaviour in laboratory-controlled conditions [49] and as a tool to develop emotionally engaging consumer experiences [50].
One of the most important topics in human behaviour research is human emotions, due to the central role that they play in many background processes, such as perception, decision-making, creativity, memory and social interaction [51]. Given the presence that VR provokes in users, it has been suggested as a powerful means of evoking emotions in laboratory environments [8]. In one of the first confirmatory studies into the efficacy of immersive VR as an affective medium, Baños et al. [30] showed that emotion has an impact on presence. Subsequently, many other similar studies showed that VR can evoke emotions, such as anxiety and relaxation [52], positive valence in obese children taking exercise [53], arousal in natural environments, such as parks [54], and different moods in social environments featuring avatars [55].

1.4. The Validity of Virtual Reality

Finally, it is crucial to point out that the usefulness of simulation in human behaviour research has been analysed through the concept of validity, that is, the capacity of a simulated environment to evoke a response from the user similar to one that might be evoked by a physical environment [56]. Thus, there is a need to perform direct comparisons between virtual and real environments. Some comparisons have studied the validity of virtual environments by assessing psychological responses [57] and cognitive performance [58]. However, there have been fewer analyses of physiological and behavioural responses [59,60]. Heydarian et al. analysed user performance in office-related activities, for example, reading texts and identifying objects, and found that the participants performed similarly in an immersive virtual environment setting and in a benchmarked physical environment for all of the measured tasks [61]. Chamilothori, Wienold, and Andersen compared subjective perceptions of daylit spaces, and identified no significant differences between the real and virtual environments studied [62]. Kimura et al. analysed orienteering-task performance; participants in a VR room performed worse, suggesting that caution must be applied when interpreting the nuances of spatial cue use in virtual environments [63]. Higuera-Trujillo, López-Tarruella, and Llinares analysed psycho-physiological responses, through electrodermal activity (EDA), evoked by real-world and VR scenarios with different immersion levels, and demonstrated correlations in the physiological dynamics between real-world and 3D environments [64]. Marín-Morales et al. analysed the emotional responses evoked in subjects in a real and a virtual museum, and found no self-assessment differences, but did find differences in brain dynamics [65]. Therefore, further research is needed to understand the validity of VR in terms of physiological responses and behavioural performance.

1.5. Implicit Measures and the Neuroscience Approach

Traditionally, most theories of human behaviour research have been based on a model of the human mind that assumes that humans can think about and accurately verbalise their attitudes, emotions and behaviours [66]. Therefore, classical psychological evaluations used self-assessment questionnaires and interviews to quantify subjects’ responses. However, these explicit measures have been demonstrated to be subjective, as stereotype-based expectations can lead to systematically biased behaviour, given that most individuals are motivated to be, or appear to be, nonbiased [67]. The terms used in questionnaires can also be differentially interpreted by respondents, and the outcomes depend on the subjects possessing a wide knowledge of their dispositions, which is not always the case [68].
Recent advances in neuroscience show that most of the brain processes that regulate our emotions, attitudes and behaviours are not conscious. In contrast to explicit processes, humans cannot verbalise these implicit processes [69]. In recent years, growing interest has developed in “looking” inside the brain to seek solutions to problems that have not traditionally been addressed by neuroscience. Thus, neuroscience offers techniques that can record implicit measurements not controlled by conscious processes [70]. These developments have led to the emergence, in recent decades, of a new field called neuroeconomics, which blends psychology, neuroscience and economics into models of decision-making, rewards, risks and uncertainties [71]. Neuroeconomics addresses human behaviour research, in particular the brain mechanisms involved in economic decision-making, from the point of view of cognitive neuroscience, using implicit measures.
Several implicit measuring techniques have been proposed in recent years. Some examples of their applications in human behaviour research are: heart rate variability (HRV) has been correlated with arousal changes in vehicle drivers when detecting critical points on a route [72]; electrodermal activity (EDA) has been used to measure stress caused by cognitive load in the workplace [73]; electroencephalogram (EEG) has been used to assess engagement in audio-visual content [74]; functional magnetic resonance imaging (fMRI) has been used to record the brain activity of participants engaged in social vs. mechanical/analytic tasks [75]; functional near-infrared spectroscopy (fNIRS) has been used as a direct measure of brain activity related to decision-making processes in approach-avoidance theories [76]; eye-tracking (ET) has been used to measure subconscious brain processes that show correlations with information processing in risky decisions [77]; facial expression analysis (FEA) has been applied to detect emotional responses in e-learning environments [78]; and speech emotion recognition (SER) has been used to detect depressive disorders [79]. Table 1 gives an overview of the implicit measuring techniques that have been used in human behaviour research.
In addition, recent studies have highlighted the potential of virtual reality environments for enhancing ecological validity in the clinical, affective and social neurosciences, fields whose studies have usually relied on simple, static stimuli that lack many of the potentially important aspects of real-world activities and interactions [90]. Therefore, VR could play an important role in the future of neuroeconomics by providing a more ecological framework within which to develop experimental studies with implicit measures.

1.6. Affective Computing and Emotion Recognition Systems

Affective computing, which analyses human responses using implicit measures, has developed into an important field of study over recent decades. Introduced by Rosalind Picard in 1997, it proposes the automatic quantification and recognition of human emotions as an interdisciplinary endeavour based on psychophysiology, computer science, biomedical engineering and artificial intelligence [1]. The automatic recognition of human emotional states using implicit measures can be applied transversally to all human behaviour topics and can complement classic explicit measures. In particular, it can be applied to neuroeconomic research, as both share the same neuroscientific approach of using implicit measures, and because of the important relationship that has been found between emotions and decision-making [71]. Emotion recognition research can be divided into three components: emotional modelling, emotion classification and emotion elicitation.
The emotional modelling approach can be divided into the discrete and the dimensional. Discrete models characterise the emotion system as a set of basic emotions, which includes anger, disgust, fear, joy, sadness and surprise, and the complex emotions that result from combining them [91]. On the other hand, dimensional models propose that emotional responses can be modelled in a multidimensional space where each dimension represents a fundamental property common to all emotions. The most commonly used theory is the circumplex model of affect (CMA), which proposes a three-dimensional space consisting of: valence, that is, the degree to which an emotion is perceived as positive or negative; arousal, that is, the intensity of the emotion in terms of activation, from low to high; and dominance, which ranges from feelings of total lack of control or influence on events and surroundings to the opposite extreme of feeling influential and in control [92].
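As a small illustration of the dimensional approach, the sketch below represents an affective state as a point in the valence-arousal-dominance space and places a few discrete emotions in it; the coordinates are rough, hypothetical values chosen for demonstration, not taken from any validated mapping.

```python
from dataclasses import dataclass

@dataclass
class AffectiveState:
    """A point in the valence-arousal-dominance space, each axis in [-1, 1]."""
    valence: float    # negative ... positive
    arousal: float    # deactivated ... activated
    dominance: float  # controlled by events ... in control

# Illustrative (hypothetical) placements of discrete emotions in the space.
DISCRETE_TO_DIMENSIONAL = {
    "joy":     AffectiveState(valence=0.8,  arousal=0.5,  dominance=0.4),
    "fear":    AffectiveState(valence=-0.7, arousal=0.8,  dominance=-0.8),
    "sadness": AffectiveState(valence=-0.7, arousal=-0.4, dominance=-0.3),
    "calm":    AffectiveState(valence=0.4,  arousal=-0.7, dominance=0.3),
}
print(DISCRETE_TO_DIMENSIONAL["fear"])
```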
Affective computing uses biometric signals and machine-learning algorithms to classify emotions automatically. Many signals have been used, including voice, facial, neuroimaging and physiological signals [93]. Notably, one of the main emotion classification approaches uses variables associated with central nervous system (CNS) and autonomic nervous system (ANS) dynamics [93]. First, human emotional processing and perception involve cerebral cortex activity, which allows the automatic classification of emotions using the CNS; EEG is one of the techniques most used in this context [94]. Second, many emotion recognition studies have used the ANS to analyse the changes in cardiovascular dynamics provoked by mood changes, where HRV and EDA are the most used techniques [95]. The combination of physiological features and machine-learning algorithms, such as support vector machines, linear discriminant analysis, K-nearest neighbours and neural networks, has achieved high levels of accuracy in inferring subjects’ emotional states [96], as illustrated in the sketch below.
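The sketch below illustrates this pipeline under stated assumptions: a synthetic feature matrix stands in for physiological features (the column meanings in the comment are invented for the example), and an RBF-kernel support vector machine is evaluated with 10-fold cross-validation using scikit-learn.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a physiological feature matrix: one row per trial,
# columns could be e.g. mean HR, RMSSD, LF/HF, SCL, number of SCRs.
X = rng.normal(size=(120, 5))
y = rng.integers(0, 2, size=120)  # binary labels, e.g. low vs. high arousal

# Standardise features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=10)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

With real data, X would be replaced by features extracted from the recorded signals and y by the elicited emotion labels; the pipeline itself is unchanged.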
Finally, emotion elicitation is the ability to reliably and ethically evoke affective states. This elicitation is a critical factor in the development of systems that can detect, interpret and adapt to human affect [97]. The many methods that elicit emotions in laboratories can be divided into two main groups, active and passive. Active methods involve directly influencing subjects, including behavioural manipulation [98], social interaction [99] and dyadic interaction [100]. Passive methods usually present external stimuli, such as images, sound or video. As to the use of images, the International Affective Picture System (IAPS) is among the databases most used as an elicitation tool in emotion recognition methodologies [95]. It includes over a thousand depictions of people, objects and events, standardised on the basis of valence and arousal [97]. As to audio, the International Affective Digitalised Sound System (IADS) database is the most commonly applied in studies which use sound to elicit emotions [101]. However, some studies directly use music or narrative to elicit emotions [102]. With respect to audio-visual stimuli, many studies have used film to induce arousal and valence [103]. These emotion elicitation methods have two important limitations. The set-ups used, mostly screens, are non-immersive devices, which provoke only a low level of presence in subjects [30]; therefore, the stimuli do not evoke in the subjects a feeling of “being there”, which is needed to analyse emotions in simulated real-world situations. In addition, the stimuli are non-interactive, so they do not allow the subjects to intervene in the scene, which would open the possibility of recognising emotional states during interactive tasks. These limitations can be overcome by using immersive VR as a new emotion elicitation method. Since the year 2000, VR has increasingly been used as an affective stimulus; however, the majority of the studies undertaken have applied classic statistical methods, such as hypothesis testing and correlation, to analyse subjects’ physiological responses to different emotions [104]. In recent years, though, some research has started to apply affective computing paradigms with VR as the emotion elicitation method, combining implicit measures with machine-learning methods to develop automatic emotion recognition models [105].
This paper provides a systematic review of the literature on the use of head-mounted displays in implicit measure-based emotion recognition research, and examines the evolution of the research field, the emotions analysed, the implicit techniques, the data analysis, the set-ups and the validations performed.

2. Materials and Methods

Data Collection

We followed an adapted version of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) study selection guidelines [106]. These include steps to identify literature, to screen the identified literature, to check the eligibility of the screened literature and, finally, to synthesise the literature. The screening and eligibility steps were performed simultaneously. The literature search was carried out on 25 March 2020. The Scopus database was queried using the following search string: TITLE-ABS-KEY (“virtual reality” OR “head-mounted display”) AND TITLE-ABS-KEY (“emotion*” OR “affective*”) AND DOCTYPE (ar OR re). The keywords virtual reality OR head-mounted display capture all the studies on VR and, in particular, all that used HMDs, while the keywords emotion* OR affective* capture all the papers related to emotion; their combination retrieves the research involving both virtual reality and emotions. The search was limited to journal articles and reviews (the latter for snowballing). A total of 1424 records were identified, and an additional 14 records were identified from other sources.
The screening and eligibility checks were undertaken as follows: (1) first, by investigating titles and abstracts, 13 duplicates were identified; (2) the manuscripts were superficially screened for a thematic match with virtual reality as emotion elicitation, and a total of 1157 records were excluded for not matching the topic, plus 3 records because they were inaccessible; (3) we investigated the remaining 265 records to exclude those that did not fit, using a specific rejection order: if a record used HMDs, we moved on to the next filter criterion, implicit measures; if it used implicit measures, we moved on to the last criterion, the analysis of an emotion. In total, 132 records were rejected for not using HMDs, 68 for not using implicit measures and 23 for not analysing an emotional dimension. Finally, 42 studies that used virtual reality displayed in an HMD, in combination with an implicit measure to analyse or recognise emotional states, were included in the analysis. The procedure is summarised in Figure 1.
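A minimal sketch of the rejection order described above, assuming each screened record has been annotated with three boolean flags (the field names are illustrative):

```python
def eligibility(record):
    """Apply the rejection order: HMD first, then implicit measures,
    then the analysis of an emotional dimension."""
    if not record["uses_hmd"]:
        return False, "rejected: no HMD"
    if not record["uses_implicit_measures"]:
        return False, "rejected: no implicit measures"
    if not record["analyses_emotion"]:
        return False, "rejected: no emotional dimension"
    return True, "included"

records = [
    {"uses_hmd": True,  "uses_implicit_measures": True,  "analyses_emotion": True},
    {"uses_hmd": False, "uses_implicit_measures": True,  "analyses_emotion": True},
    {"uses_hmd": True,  "uses_implicit_measures": False, "analyses_emotion": True},
]
for r in records:
    print(eligibility(r))
```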

3. Results

3.1. Summary of Previous Research

In recent years, studies have applied implicit measures to analyse emotions using immersive VR with HMDs. Table 2 provides a summary of the studies included in the analysis.

3.2. Evolution of the Research

Figure 2 shows the number of papers published each year on the topics of virtual reality and emotion analysis. This number was calculated from all the papers screened. In the 1990s, the average number of papers published annually was 6.4, the first being published in 1995. In the 2000s, the average increased to 26.3, and from 2010 to 2014 it roughly tripled to 77.4. In the last five years, the curve has grown exponentially, reaching 203 papers in 2019, with 278 predicted for 2020.

3.3. Emotions Analysed

Figure 3 depicts the evolution in the number of papers analysed in the review based on the emotion under analysis. Until 2015, the majority of the papers analysed arousal-related emotions, mostly arousal, anxiety and stress. From that year, some experiments started to analyse valence-related emotions, such as valence, joy, pleasantness and sadness, but the analysis of arousal-related emotions still predominated. Some 50% of the studies used the CMA (arousal 38.1% [54] and valence 11.9% [125]), and the other 50% used basic or complex emotions (stress 23.8% [112], anxiety 16.7% [109], fear 11.9% [43], awe 2.4% [121], calmness 2.4% [135], joy 2.4% [135], pleasantness 2.4% [64] and sadness 2.4% [135]).

3.4. Implicit Techniques, Features Used and Participants

Figure 4 shows the evolution of the number of papers analysed in terms of the implicit measures used. The majority used HRV (73.8%) and EDA (59.5%); therefore, the majority of the studies used the ANS to analyse emotions. However, most of the studies that used HRV extracted very few features, drawn from the time domain, such as HR [115,120]. Very few studies used features from the frequency domain, such as HF, LF or HF/LF [119,126], and two used non-linear features, such as entropy and Poincaré indices [65,105]. Of the studies that used EDA, the majority used total skin conductance (SC) [116], but some used tonic (SCL) [54] or phasic activity (SCR) [124]. In recent years, EEG use has increased, with 6 papers published (14.3%), and the CNS has started to be used, in combination with HMDs, to recognise emotions; the analyses employed are ERP [138], power spectral density [140] and functional connectivity [65]. EMG (11.9%) and RSP (9.5%) were also used, mostly in combination with HRV. Other implicit measures used were eye-tracking, gait patterns, navigation and salivary cortisol responses. The average number of participants depended on the signal: 75.34 (σ = 73.57) for EDA, 68.58 (σ = 68.35) for HRV and 33.67 (σ = 21.80) for EEG.
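For context, the sketch below computes a few of the HRV features named above from a series of RR intervals: mean HR and RMSSD in the time domain and the LF/HF ratio in the frequency domain. The 4 Hz resampling rate and the LF (0.04–0.15 Hz) and HF (0.15–0.4 Hz) band limits follow common HRV conventions; the example data are synthetic.

```python
import numpy as np
from scipy.signal import welch

def hrv_features(rr_ms):
    """Basic HRV features from RR intervals given in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    mean_hr = 60000.0 / rr.mean()               # time domain: beats per minute
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # time domain: ms

    # Resample the irregularly sampled RR series to an even 4 Hz grid.
    t = np.cumsum(rr) / 1000.0                  # beat times in seconds
    fs = 4.0
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = np.interp(t_even, t, rr)

    # Welch power spectral density, then band powers by rectangle summation.
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
    return {"mean_hr": mean_hr, "rmssd": rmssd, "lf_hf": lf / hf}

# Synthetic example: a slightly irregular series around 70 bpm (RR ~ 857 ms).
rng = np.random.default_rng(1)
rr = 857 + 30 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 10, 300)
print(hrv_features(rr))
```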

3.5. Data Analysis

Figure 5 shows the evolution of the number of papers published in terms of the data analysis performed. The vast majority analysed the implicit responses of the subjects in different emotional states using hypothesis testing (83.33%), correlations (14.29%) or linear regression (4.76%). However, in recent years we have seen the introduction of supervised machine-learning algorithms (11.90%), such as SVM [105], Random Forest [139] and kNN [140], to build automatic emotion recognition models. These have been used in combination with EEG [65], HRV [105] and EDA [140].
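As an illustration of the hypothesis-testing approach that dominates the literature, here is a minimal paired t-test comparing, for the same participants, a physiological measure in two VR conditions; all numbers are synthetic, invented for demonstration.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)

# Hypothetical paired design (n = 30): mean skin conductance level per
# participant in a neutral VR scene vs. a high-arousal VR scene.
scl_neutral = rng.normal(4.0, 1.0, size=30)
scl_arousal = scl_neutral + rng.normal(0.6, 0.5, size=30)  # shifted condition

t_stat, p_value = ttest_rel(scl_arousal, scl_neutral)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Unlike the machine-learning approach, this tests for a population-level difference between conditions; it does not, by itself, recognise the emotional state of an individual trial.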

3.6. VR Set-Ups Used: HMDs and Formats

Figure 6 shows the evolution of the number of papers published based on the HMD used. In the early 2010s, eMagin was the most used. In more recent years, advances in HMD technologies have positioned the HTC Vive as the most used (19.05%). In terms of formats, 3D environments are the most used (85.71%) [138], with 360° panoramas far behind (16.67%) [142]. One study used both formats [64].

3.7. Validation of VR

Table 3 shows the percentage of the papers that presented analyses of the validation of VR in emotion research. Some 83.33% of the papers did not present any type of validation. Three papers included direct comparisons of results between VR environments and the physical world [64,65,109], and three compared, in terms of the formats used, the emotional reactions evoked in 3D VR with those evoked by photos [109], 360° panoramas [64] and augmented reality [129]. Finally, one paper analysed the influence of immersion [121], one compared VR results with previous datasets [108] and one compared its results with a previous version of the study performed in the real world [132].

4. Discussion

This work highlights the evolution of the use of immersive VR, in particular using head-mounted displays, in emotion recognition research in combination with implicit measures. It provides a clear perspective based on a systematic review and aggregated analysis, focusing on the role that VR might play as an emotion elicitation tool in the coming years.
The evolution of scientific interest in VR and emotions has grown exponentially, to more than 200 papers per year (Figure 2). In particular, the performance improvements of the latest generation of HMDs, in terms of resolution, field of view and immersion levels, together with the fall in their price, have boosted their use in emotion-related research. This accords with VR’s increased application in recent years in other areas, such as rehabilitation, neurosurgery and therapy [2]. Therefore, the results suggest that the 2010s was the decade of the rapid growth of VR in emotion research using implicit measures, and the 2020s might be the decade when the field matures. Environmental simulations might, in the future, routinely go beyond the paradigm of non-immersive 2D images and videos to immersive VR scenarios, where subjects feel a very strong sense of presence and can interact with the stimuli presented.
In regard to HMDs and implicit measures in emotion analysis, there is no consensus on the use of the CMA [92] or the Ekman theory of basic emotions [91], since each approach is used in 50% of the research (Figure 3). The differences in the frameworks used cause some difficulties in comparing the results of different studies. The majority of the studies (90.5%) included analyses of arousal [54], or high-arousal-related discrete emotions, such as stress [112], anxiety [109] and fear [43]. On the other hand, only 23.9% of the studies analysed valence, or discrete emotions closely related to valence, such as awe [121], calm [135], joy [135], pleasantness [64] and sadness [135]. Therefore, although the whole sub-field of affective computing using HMDs is still in its first growth phase, valence recognition and its physiological dynamics, in particular, are under-researched; research since 2017 has started to address this [65,139]. Dominance, a dimension of the CMA still not addressed in general affective computing research using pictures or videos [143], has also not been analysed in HMD set-up research. However, fear, a basic emotion closely related to the dominance dimension, was analysed in 11.9% of the studies examined in the review. In contrast to the fear felt when someone watches a horror film, which is based on the empathy of the viewer with the protagonist, the level of presence that immersive VR offers allows the analysis of fear directly felt by subjects within the scenarios they are viewing. Therefore, VR can boost the analysis of the dominance dimension in affective computing in the future. In addition, VR allows researchers to analyse emotional reactions to social stimuli, such as avatars [138], which might be the next stage in the application of classic 2D affective computing paradigms to simulated real-world situations, providing new insights with a social dimension.
In terms of the implicit techniques used to recognise emotions evoked through HMDs, ANS measurements are the most used: specifically, HRV (73.8%) and EDA (59.5%), often in combination. However, until 2016, the majority of the papers featured only HR and SC (Table 2), sometimes in combination with EMG and RSP. From 2016, the research started to include HRV frequency domain and non-linear domain analyses [105,119], and EDA analyses, such as CDA, dividing the signals into tonic and phasic components [64]. In terms of the CNS, EEG research has been undertaken since 2016, including ERP [138], power spectral density [140] and functional connectivity analysis [65]. Other implicit measures have been used since 2019, such as eye-tracking [141], gait patterns [135], navigation [133] and salivary cortisol responses [132]. The use of behavioural measures, such as eye-tracking, gait patterns and navigation, might be a very powerful approach through which VR can contribute to affective computing research, as they exploit its high levels of interactivity with the simulated stimuli. This might open a new sub-field where emotional states can be assessed through behavioural measures in interactive, real situations.
However, the current weakest point of HMD-based emotion recognition systems is that only 11.90% of the studies, that is, four, used machine-learning algorithms to classify the emotions analysed. From the early 2000s, when physiological signals were first combined with HMDs to analyse emotions, until 2018, all studies used hypothesis testing and/or correlations to provide insights into the ANS oscillations produced during different affective states, except Reference [125], which used EEG. Although classic statistical techniques have yielded important and useful insights, they have some limitations: (i) hypothesis testing analyses differences between two populations based on means and deviations, but does not provide emotion recognition; (ii) it is difficult to analyse the effect of combinations of several features in datasets with large sets of variables; and (iii) they do not take into account non-linear relationships. These limitations are being overcome with the use of machine-learning algorithms, which can recognise emotions by treating the task as a classification problem, apply automatic feature selection procedures to uncover complex patterns in the data and exploit non-linear kernels [143]. Marín-Morales et al. [105] presented the first emotion recognition system using SVM in combination with a large set of HRV features (time, frequency and non-linear domains) and EEG features (PSD and mean phase coherence) in 360° emotional rooms, achieving a recognition rate of 75% in arousal and 71.21% in valence. Marín-Morales et al. [65] developed an emotion recognition system in a realistic 3D virtual museum, using SVM in combination with HRV and EEG, with recognition rates of 75% and 71.08% in arousal and valence, respectively. Granato et al. [139] presented an arousal-valence emotion recognition model with subjects playing a VR racing game, collecting physiological responses, that is, EDA, HRV, EMG and RSP. Bălan et al. [140] analysed the performance of a set of machine-learning and deep-learning techniques (kNN, SVM, RF, LDA, NN) for fear recognition in a 3D acrophobia game that adapted its stimuli based on the level of fear recognised; the results showed recognition levels ranging from 42.5% to 89.5%. Therefore, the development of emotion recognition models in immersive VR is an open, fast-growing sub-field, which is moving from the classic statistical testing paradigm to supervised machine-learning.
As to the set-ups employed, Figure 6 shows the evolution of the HMDs used in implicit measure-based emotion research. Among the first-generation VR HMDs of the 2000s was the VFX3D, which offered a resolution of 380 × 337 per eye. In the 2010s, the eMagin Z800 improved on the resolution of previous HMDs, offering 800 × 600 and a 40° field of view, followed by the Oculus Rift DK2, which increased the resolution to 1080 × 960 and, in particular, the FOV to 90°. Finally, in the late 2010s, the HTC Vive offered an increase in resolution to 1600 × 1400 per eye, and democratised VR with its competitive price. These increments in HMD performance align with the exponential growth in the number of papers that have used HMDs in emotion recognition research (Figure 2), and future HMDs, which might reach 4K resolution per eye, could further boost the use of VR as a tool to recreate real situations in controlled laboratory environments.
The format most used overall was the 3D environment (85.71%); 360° panoramas were used in 16.67% of cases. This is probably due to the fact that 3D environments present a high level of interactivity, whereas 360° panoramas do not allow changes in point of view. However, both formats can be useful, depending on the aim of the experiment. The 360° panorama set-ups can be very effective for updating classic, closely controlled affective computing methodologies, in particular, when presenting users with a series of non-interactive stimuli, such as IAPS [95] and IADS [144], while increasing the degree of presence through the immersion level [30]. However, there is still a need to develop large datasets of validated immersive stimuli that cover a wide range of emotions, which could be used as general benchmarks to analyse physiological and behavioural dynamics in immersive VR. The 360° approach is well suited to this, since interaction, for example navigation, introduces uncontrolled variation into the emotional experience. The first dataset of such stimuli was published by Marín-Morales et al. [105], and included 4 scenarios that recreated all quadrants of the CMA. On the other hand, the level of interactivity that 3D scenarios offer can be very useful in applied research, since they display more naturalistic and interactive environments, facilitating decision-making research and the analysis of daily situations. For example, Takac et al. [137] analysed the anxiety felt by speakers facing large audiences, Lin et al. [133] analysed the stress felt by individuals in a burning-building scenario and Kisker et al. [130] analysed arousal during exposure to extreme height.
Immersive VR can be a very powerful tool to analyse human behaviour in controlled laboratory conditions, but we do not yet know how valid VR is, in terms of physiological and behavioural responses, for extrapolating the insights gained to the real world. Indeed, 83.33% of the papers did not present any validation, and only 3 provided a direct comparison between the VR scene and the physical environment simulated. Gorini et al. [109] analysed anxiety through HRV and EDA with virtual and real food, Higuera-Trujillo et al. [64] analysed pleasantness through EDA responses in 3D, 360° and real retail stores, and Marín-Morales et al. [65] analysed arousal and valence oscillations with HRV and EEG in a virtual and a physical museum. Other research analysed the influence of immersion [121] and other VR features. Thus, VR validation is still an open topic that needs to be more actively addressed. Understanding and isolating the intrinsic dynamics of VR will be key in future years for the validation of the insights obtained using HMDs.
Finally, the results suggest that VR will play a central role in the affective computing field. Research in the area has grown in complexity and maturity over the last two decades, and this tendency is likely to continue in the coming years. First, future research should extend the analysis of physiological dynamics using VR as the emotion elicitation method, to achieve a level of understanding at least as deep as that available today using 2D pictures as stimulation. Subsequently, VR might open up many research opportunities that would be very difficult to assess with non-immersive stimuli: in particular, the inclusion of the dominance dimension, which is very closely related to the user’s control of the environment and impacts very important features, such as the sense of security. Moreover, the social dimension is a crucial factor in the understanding of the emotional dynamics of human beings. The future inclusion of responsive, realistic avatars will help increase the understanding of emotions evoked during social interactions, and the associated physiological responses, in controlled conditions.

5. Conclusions

This work analysed the current state-of-the-art in implicit measure-based emotion recognition elicited by HMDs, and offered a perspective, based on a systematic and aggregated analysis, that can guide future research. After two decades of sparse research analysing emotions using HMDs in combination with implicit measures, mostly undertaken through the physiological arousal responses of the ANS, an inflexion point has been reached in recent years. The number of papers published is increasing exponentially; more emotions are being analysed, including valence-related states; more complex biomedical signal processing procedures are increasingly being performed, including EEG analyses and other behavioural measures; and machine-learning algorithms are being newly applied to develop automatic emotion recognition systems. The results suggest that VR might revolutionise emotion elicitation methods in laboratory environments in the next decade, and have a transversal impact on affective computing research in many areas, opening new opportunities for the scientific community. However, more research is needed to increase the understanding of emotion dynamics in immersive VR and, in particular, its validity, through direct comparisons between simulated and real environments.

Author Contributions

Conceptualisation, J.M.-M.; methodology, J.M.-M.; formal analysis, J.M.-M.; investigation, J.M.-M.; writing—original draft preparation, J.M.-M.; writing—review and editing, J.M.-M., C.L., J.G. and M.A.; visualisation, J.M.-M.; supervision, C.L., J.G. and M.A.; project administration, J.M.-M.; funding acquisition, J.G. and M.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Commission, grant number H2020-825585 HELIOS.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 1997. [Google Scholar]
  2. Cipresso, P.; Chicchi, I.A.; Alcañiz, M.; Riva, G. The Past, Present, and Future of Virtual and Augmented Reality Research: A network and cluster analysis of the literature. Front. Psychol. 2018, 9, 2086. [Google Scholar] [CrossRef] [Green Version]
  3. Castelvecchi, D. Low-cost headsets boost virtual reality’s lab appeal. Nature 2016, 533, 153–154. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Slater, M.; Usoh, M. Body centred interaction in immersive virtual environments. Artif. Life Virtual Real. 1994, 1, 125–148. [Google Scholar]
  5. Giglioli, I.A.C.; Pravettoni, G.; Martín, D.L.S.; Parra, E.; Alcañiz, M. A novel integrating virtual reality approach for the assessment of the attachment behavioral system. Front. Psychol. 2017, 8, 1–7. [Google Scholar] [CrossRef] [PubMed]
  6. Kwartler, M. Visualization in support of public participation. In Visualization in Landscape and Environmental Planning: Technology and Applications; Bishop, I., Lange, E., Eds.; Taylor & Francis: London, UK, 2005; pp. 251–260. [Google Scholar]
  7. Vince, J. Introduction to Virtual Reality; Media, Springer: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
  8. Alcañiz, M.; Baños, R.; Botella, C.; Rey, B. The EMMA Project: Emotions as a Determinant of Presence. PsychNology J. 2003, 1, 141–150. [Google Scholar]
  9. Mengoni, M.; Germani, M.; Peruzzini, M. Benchmarking of virtual reality performance in mechanics education. Int. J. Interact. Des. Manuf. 2011, 5, 103–117. [Google Scholar] [CrossRef]
  10. Stamps, A.E., III. Use of photographs to simulate environments: A meta-analysis. Percept. Mot. Ski. 1990, 71, 907–913. [Google Scholar] [CrossRef]
  11. Morinaga, A.; Hara, K.; Inoue, K.; Urahama, K. Classification between natural and graphics images based on generalized Gaussian distributions. Inf. Process. Lett. 2018, 138, 31–34. [Google Scholar] [CrossRef]
  12. Siriaraya, P.; Ang, C.S. The Social Interaction Experiences of Older People in a 3D Virtual Environment. In Perspectives on Human-Computer Interaction Research with Older People; Sayago, S., Ed.; Springer: Cham, Switzerland, 2019; pp. 101–117. ISBN 978-3-030-06076-3. [Google Scholar]
  13. Slater, M. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 3549–3557. [Google Scholar] [CrossRef] [Green Version]
  14. Kober, S.E.; Kurzmann, J.; Neuper, C. Cortical correlate of spatial presence in 2D and 3D interactive virtual reality: An EEG study. Int. J. Psychophysiol. 2012, 83, 365–374. [Google Scholar] [CrossRef]
  15. Borrego, A.; Latorre, J.; Llorens, R.; Alcañiz, M.; Noé, E. Feasibility of a walking virtual reality system for rehabilitation: Objective and subjective parameters. J. Neuroeng. Rehabil. 2016, 13, 68. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Clemente, M.; Rodriguez, A.; Rey, B.; Alcañiz, M. Assessment of the influence of navigation control and screen size on the sense of presence in virtual reality using EEG. Expert Syst. Appl. 2014, 41, 1584–1592. [Google Scholar] [CrossRef]
  17. Borrego, A.; Latorre, J.; Alcañiz, M.; Llorens, R. Comparison of Oculus Rift and HTC Vive: Feasibility for Virtual Reality-Based Exploration, Navigation, Exergaming, and Rehabilitation. Games Health J. 2018, 7. [Google Scholar] [CrossRef] [PubMed]
  18. Vecchiato, G.; Jelic, A.; Tieri, G.; Maglione, A.G.; De Matteis, F.; Babiloni, F. Neurophysiological correlates of embodiment and motivational factors during the perception of virtual architectural environments. Cogn. Process. 2015, 16, 425–429. [Google Scholar] [CrossRef] [Green Version]
  19. Jensen, L.; Konradsen, F. A review of the use of virtual reality head-mounted displays in education and training. Educ. Inf. Technol. 2017, 11, 1–15. [Google Scholar] [CrossRef] [Green Version]
  20. Riecke, B.E.; LaViola, J.J., Jr.; Kruijff, E. 3D user interfaces for virtual reality and games: 3D selection, manipulation, and spatial navigation. In Proceedings of the ACM SIGGRAPH 2018 Courses, Vancouver, BC, Canada, 12–16 August 2018; p. 13. [Google Scholar]
  21. Templeman, J.N.; Denbrook, P.S.; Sibert, L.E. Virtual locomotion: Walking in place through virtual environments. Presence 1999, 8, 598–617. [Google Scholar] [CrossRef]
  22. Bozgeyikli, E.; Bozgeyikli, L.; Raij, A.; Katkoori, S.; Alqasemi, R.; Dubey, R. Virtual reality interaction techniques for individuals with autism spectrum disorder: Design considerations and preliminary results. In Proceedings of the International Conference on Human-Computer Interaction, Florence, Italy, 11–15 July 2016; pp. 127–137. [Google Scholar]
  23. Tregillus, S.; Folmer, E. Vr-step: Walking-in-place using inertial sensing for hands free navigation in mobile vr environments. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA; pp. 1250–1255.
  24. Nescher, T.; Huang, Y.-Y.; Kunz, A. Planning redirection techniques for optimal free walking experience using model predictive control. In Proceedings of the 2014 IEEE Symposium on 3D User Interfaces (3DUI), Minneapolis, MN, USA, 29–30 March 2014; pp. 111–118. [Google Scholar]
  25. Nabiyouni, M.; Saktheeswaran, A.; Bowman, D.A.; Karanth, A. Comparing the performance of natural, semi-natural, and non-natural locomotion techniques in virtual reality. In Proceedings of the 2015 IEEE Symposium on 3D User Interfaces (3DUI), Arles, France, 23–24 March 2015; pp. 3–10. [Google Scholar]
  26. Bozgeyikli, E.; Raij, A.; Katkoori, S.; Dubey, R. Locomotion in virtual reality for individuals with autism spectrum disorder. In Proceedings of the 2016 Symposium on Spatial User Interaction, Tokyo, Japan, 15–16 October 2016; pp. 33–42. [Google Scholar]
  27. Boletsis, C. The New Era of Virtual Reality Locomotion: A Systematic Literature Review of Techniques and a Proposed Typology. Multimodal Technol. Interact. 2017, 1, 24. [Google Scholar] [CrossRef] [Green Version]
  28. Slater, M.; Wilbur, S. A Framework for Immersive Virtual Environments (FIVE): Speculations on the Role of Presence in Virtual Environments. Presence Teleoperators Virtual Environ. 1997, 6, 603–616. [Google Scholar] [CrossRef]
  29. Heeter, C. Being There: The Subjective Experience of Presence. Presence Teleoperators Virtual Environ. 1992, 1, 262–271. [Google Scholar] [CrossRef]
  30. Baños, R.M.; Botella, C.; Alcañiz, M.; Liaño, V.; Guerrero, B.; Rey, B. Immersion and Emotion: Their Impact on the Sense of Presence. CyberPsychol. Behav. 2004, 7, 734–741. [Google Scholar] [CrossRef]
  31. Slater, M.; Usoh, M.; Steed, A. Depth of Presence in virtual environments. Presence Teleoperators Virtual Environ. 1994, 3, 130–144. [Google Scholar] [CrossRef]
  32. Usoh, M.; Arthur, K.; Whitton, M.C.; Bastos, R.; Steed, A.; Slater, M.; Brooks, F.P. Walking > walking-in-place > flying, in virtual environments. In Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques-SIGGRAPH ’99; Waggenspack, W., Ed.; ACM Press/Addison-Wesley Publishing: New York, NY, USA, 1999; pp. 359–364. [Google Scholar]
  33. Diemer, J.; Alpers, G.W.; Peperkorn, H.M.; Shiban, Y.; Mühlberger, A. The impact of perception and presence on emotional reactions: A review of research in virtual reality. Front. Psychol. 2015, 6, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Riches, S.; Elghany, S.; Garety, P.; Rus-Calafell, M.; Valmaggia, L. Factors Affecting Sense of Presence in a Virtual Reality Social Environment: A Qualitative Study. Cyberpsychol. Behav. Soc. Netw. 2019, 22, 288–292. [Google Scholar] [CrossRef] [PubMed]
  35. Kiryu, T.; So, R.H.Y. Sensation of presence and cybersickness in applications of virtual reality for advanced rehabilitation. J. NeuroEng. Rehabil. 2007, 4, 34. [Google Scholar] [CrossRef] [Green Version]
  36. Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. In Proceedings of the International Scientific Conference eLearning and Software for Education, Bucharest, Italy, 23–24 April 2015; Volume 1, p. 133. [Google Scholar]
  37. Alaraj, A.; Lemole, M.G.; Finkle, J.H.; Yudkowsky, R.; Wallace, A.; Luciano, C.; Banerjee, P.P.; Rizzi, S.H.; Charbel, F.T. Virtual reality training in neurosurgery: Review of current status and future applications. Surg. Neurol. Int. 2011, 2, 52. [Google Scholar] [CrossRef] [Green Version]
  38. Bhagat, K.K.; Liou, W.-K.; Chang, C.-Y. A cost-effective interactive 3D virtual reality system applied to military live firing training. Virtual Real. 2016, 20, 127–140. [Google Scholar] [CrossRef]
  39. Yavrucuk, I.; Kubali, E.; Tarimci, O. A low cost flight simulator using virtual reality tools. IEEE Aerosp. Electron. Syst. Mag. 2011, 26, 10–14. [Google Scholar] [CrossRef]
  40. Dols, J.F.; Molina, J.; Camacho, F.J.; Marín-Morales, J.; Pérez-Zuriaga, A.M.; Garcia, A. Design and development of driving simulator scenarios for road validation studies. Transp. Res. Procedia 2016, 18, 289–296. [Google Scholar] [CrossRef]
  41. de-Juan-Ripoll, C.; Soler-Domínguez, J.L.; Guixeres, J.; Contero, M.; Gutiérrez, N.Á.; Alcañiz, M. Virtual reality as a new approach for risk taking assessment. Front. Psychol. 2018, 9, 1–8. [Google Scholar] [CrossRef] [Green Version]
  42. Bohil, C.J.; Alicea, B.; Biocca, F.A. Virtual reality in neuroscience research and therapy. Nat. Rev. Neurosci. 2011, 12, 752–762. [Google Scholar] [CrossRef]
  43. Peperkorn, H.M.; Alpers, G.W.; Mühlberger, A. Triggers of fear: Perceptual cues versus conceptual information in spider phobia. J. Clin. Psychol. 2014, 70, 704–714. [Google Scholar] [CrossRef] [PubMed]
  44. Park, K.-M.; Ku, J.; Choi, S.-H.; Jang, H.-J.; Park, J.-Y.; Kim, S.I.; Kim, J.-J. A virtual reality application in role-plays of social skills training for schizophrenia: A randomized, controlled trial. Psychiatry Res. 2011, 189, 166–172. [Google Scholar] [CrossRef] [PubMed]
  45. Didehbani, N.; Allen, T.; Kandalaft, M.; Krawczyk, D.; Chapman, S. Virtual reality social cognition training for children with high functioning autism. Comput. Hum. Behav. 2016, 62, 703–711. [Google Scholar] [CrossRef] [Green Version]
46. Lloréns, R.; Noé, E.; Colomer, C.; Alcañiz, M. Effectiveness, usability, and cost-benefit of a virtual reality-based telerehabilitation program for balance recovery after stroke: A randomized controlled trial. Arch. Phys. Med. Rehabil. 2015, 96, 418–425. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Alcañiz, M.L.; Olmos-Raya, E.; Abad, L. Use of virtual reality for neurodevelopmental disorders. A review of the state of the art and future agenda. Medicina 2019, 79, 77–81. [Google Scholar]
  48. Portman, M.E.; Natapov, A.; Fisher-Gewirtzman, D. To go where no man has gone before: Virtual reality in architecture, landscape architecture and environmental planning. Comput. Environ. Urban Syst. 2015, 54, 376–384. [Google Scholar] [CrossRef]
  49. Bigné, E.; Llinares, C.; Torrecilla, C. Elapsed time on first buying triggers brand choices within a category: A virtual reality-based study. J. Bus. Res. 2015. [Google Scholar] [CrossRef]
  50. Alcañiz, M.; Bigné, E.; Guixeres, J. Virtual Reality in Marketing: A Framework, Review, and Research Agenda. Front. Psychol. 2019, 10, 1–15. [Google Scholar] [CrossRef]
  51. Picard, R.W. Affective Computing: Challenges. Int. J. Hum. Comput. Stud. 2003, 59, 55–64. [Google Scholar] [CrossRef]
  52. Riva, G.; Mantovani, F.; Capideville, C.S.; Preziosa, A.; Morganti, F.; Villani, D.; Gaggioli, A.; Botella, C.; Alcañiz, M. Affective Interactions Using Virtual Reality: The Link between Presence and Emotions. CyberPsychol. Behav. 2007, 10, 45–56. [Google Scholar] [CrossRef]
  53. Guixeres, J.; Saiz, J.; Alcañiz, M.; Cebolla, A.; Escobar, P.; Baños, R.; Botella, C.; Lison, J.F.; Alvarez, J.; Cantero, L.; et al. Effects of virtual reality during exercise in children. J. Univers. Comput. Sci. 2013, 19, 1199–1218. [Google Scholar]
  54. Felnhofer, A.; Kothgassner, O.D.; Schmidt, M.; Heinzle, A.K.; Beutl, L.; Hlavacs, H.; Kryspin-Exner, I. Is virtual reality emotionally arousing? Investigating five emotion inducing virtual park scenarios. Int. J. Hum. Comput. Stud. 2015, 82, 48–56. [Google Scholar] [CrossRef]
  55. Lorenzo, G.; Lledó, A.; Pomares, J.; Roig, R. Design and application of an immersive virtual reality system to enhance emotional skills for children with autism spectrum disorders. Comput. Educ. 2016, 98, 192–205. [Google Scholar] [CrossRef] [Green Version]
  56. Rohrmann, B.; Bishop, I.D. Subjective responses to computer simulations of urban environments. J. Environ. Psychol. 2002, 22, 319–331. [Google Scholar] [CrossRef]
  57. Bishop, I.D.; Rohrmann, B. Subjective responses to simulated and real environments: A comparison. Landsc. Urban Plan. 2003, 65, 261–277. [Google Scholar] [CrossRef]
  58. de Kort, Y.A.W.; Ijsselsteijn, W.A.; Kooijman, J.; Schuurmans, Y. Virtual laboratories: Comparability of real and virtual environments for environmental psychology. Presence Teleoperators Virtual Environ. 2003, 12, 360–373. [Google Scholar] [CrossRef]
59. Yeom, D.; Choi, J.-H.; Zhu, Y. Investigation of the Physiological Differences between Immersive Virtual Environment and Indoor Environment in a Building. Indoor Built Environ. 2017, 1–17. [Google Scholar] [CrossRef]
  60. van der Ham, I.J.; Faber, A.M.; Venselaar, M.; van Kreveld, M.J.; Löffler, M. Ecological validity of virtual environments to assess human navigation ability. Front. Psychol. 2015, 6, 637. [Google Scholar] [CrossRef] [Green Version]
  61. Heydarian, A.; Carneiro, J.P.; Gerber, D.; Becerik-Gerber, B.; Hayes, T.; Wood, W. Immersive virtual environments versus physical built environments: A benchmarking study for building design and user-built environment explorations. Autom. Constr. 2015, 54, 116–126. [Google Scholar] [CrossRef]
  62. Chamilothori, K.; Wienold, J.; Andersen, M. Adequacy of Immersive Virtual Reality for the Perception of Daylit Spaces: Comparison of Real and Virtual Environments. LEUKOS J. Illum. Eng. Soc. N. Am. 2018, 1–24. [Google Scholar] [CrossRef] [Green Version]
  63. Kimura, K.; Reichert, J.F.; Olson, A.; Pouya, O.R.; Wang, X.; Moussavi, Z.; Kelly, D.M. Orientation in Virtual Reality Does Not Fully Measure Up to the Real-World. Sci. Rep. 2017, 7, 6–13. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  64. Higuera-Trujillo, J.L.; López-Tarruella, J.; Llinares, M.C. Psychological and physiological human responses to simulated and real environments: A comparison between Photographs, 360° Panoramas, and Virtual Reality. Appl. Ergon. 2017, 65, 398–409. [Google Scholar] [CrossRef] [PubMed]
  65. Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Gentili, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Real vs. immersive-virtual emotional experience: Analysis of psycho-physiological patterns in a free exploration of an art museum. PLoS ONE 2019, 14, e0223881. [Google Scholar] [CrossRef] [PubMed]
  66. Brief, A.P. Attitudes in and Around Organizations; Sage: Thousand Oaks, CA, USA, 1998; Volume 9. [Google Scholar]
  67. Payne, B.K. Prejudice and perception: The role of automatic and controlled processes in misperceiving a weapon. J. Pers. Soc. Psychol. 2001, 81, 181. [Google Scholar] [CrossRef] [PubMed]
  68. Schmitt, N. Method bias: The importance of theory and measurement. J. Organ. Behav. 1994, 15, 393–398. [Google Scholar] [CrossRef]
  69. Barsade, S.G.; Ramarajan, L.; Westen, D. Implicit affect in organizations. Res. Organ. Behav. 2009, 29, 135–162. [Google Scholar] [CrossRef]
  70. Lieberman, M.D. Social cognitive neuroscience: A review of core processes. Annu. Rev. Psychol. 2007, 58, 259–289. [Google Scholar] [CrossRef] [Green Version]
  71. Camerer, C.; Loewenstein, G.; Prelec, D. Neuroeconomics: How neuroscience can inform economics. J. Econ. Lit. 2005, 43, 9–64. [Google Scholar] [CrossRef] [Green Version]
  72. Riener, A.; Ferscha, A.; Aly, M. Heart on the road: HRV analysis for monitoring a driver’s affective state. In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Essen, Germany, 21–22 September 2009; pp. 99–106. [Google Scholar]
  73. Setz, C.; Arnrich, B.; Schumm, J.; La Marca, R.; Tröster, G.; Ehlert, U. Discriminating stress from cognitive load using a wearable EDA device. IEEE Trans. Inf. Technol. Biomed. 2009, 14, 410–417. [Google Scholar] [CrossRef] [PubMed]
  74. Berka, C.; Levendowski, D.J.; Lumicao, M.N.; Yau, A.; Davis, G.; Zivkovic, V.T.; Olmstead, R.E.; Tremoulet, P.D.; Craven, P.L. EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks. Aviat. Space Environ. Med. 2007, 78, B231–B244. [Google Scholar] [PubMed]
  75. Jack, A.I.; Dawson, A.J.; Begany, K.L.; Leckie, R.L.; Barry, K.P.; Ciccia, A.H.; Snyder, A.Z. fMRI reveals reciprocal inhibition between social and physical cognitive domains. Neuroimage 2013, 66, 385–401. [Google Scholar] [CrossRef] [PubMed] [Green Version]
76. Ernst, L.H.; Plichta, M.M.; Lutz, E.; Zesewitz, A.K.; Tupak, S.V.; Dresler, T.; Ehlis, A.-C.; Fallgatter, A.J. Prefrontal activation patterns of automatic and regulated approach–avoidance reactions – A functional near-infrared spectroscopy (fNIRS) study. Cortex 2013, 49, 131–142. [Google Scholar] [CrossRef]
  77. Glöckner, A.; Herbold, A.-K. An eye-tracking study on information processing in risky decisions: Evidence for compensatory strategies based on automatic processes. J. Behav. Decis. Mak. 2011, 24, 71–98. [Google Scholar] [CrossRef]
  78. Bahreini, K.; Nadolski, R.; Westera, W. Towards multimodal emotion recognition in e-learning environments. Interact. Learn. Environ. 2016, 24, 590–605. [Google Scholar] [CrossRef]
  79. Huang, K.-Y.; Wu, C.-H.; Su, M.-H.; Kuo, Y.-T. Detecting unipolar and bipolar depressive disorders from elicited speech responses using latent affective structure model. IEEE Trans. Affect. Comput. 2018. [Google Scholar] [CrossRef]
80. Prokasy, W. Electrodermal Activity in Psychological Research; Elsevier: Amsterdam, The Netherlands, 2012. [Google Scholar]
  81. Kim, H.-G.; Cheon, E.-J.; Bai, D.-S.; Lee, Y.H.; Koo, B.-H. Stress and heart rate variability: A meta-analysis and review of the literature. Psychiatry Investig. 2018, 15, 235. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  82. Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421. [Google Scholar] [CrossRef]
83. Lotte, F.; Bougrain, L.; Cichocki, A.; Clerc, M.; Congedo, M.; Rakotomamonjy, A.; Yger, F. A review of classification algorithms for EEG-based brain–computer interfaces: A 10 year update. J. Neural Eng. 2018, 15, 31005. [Google Scholar] [CrossRef] [Green Version]
  84. Gruzelier, J.H. EEG-neurofeedback for optimising performance. I: A review of cognitive and affective outcome in healthy participants. Neurosci. Biobehav. Rev. 2014, 44, 124–141. [Google Scholar] [CrossRef]
  85. Thibault, R.T.; MacPherson, A.; Lifshitz, M.; Roth, R.R.; Raz, A. Neurofeedback with fMRI: A critical systematic review. Neuroimage 2018, 172, 786–807. [Google Scholar] [CrossRef] [Green Version]
  86. Naseer, N.; Hong, K.-S. fNIRS-based brain-computer interfaces: A review. Front. Hum. Neurosci. 2015, 9, 3. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  87. Meißner, M.; Oll, J. The promise of eye-tracking methodology in organizational research: A taxonomy, review, and future avenues. Organ. Res. Methods 2019, 22, 590–617. [Google Scholar] [CrossRef]
  88. Calvo, M.G.; Nummenmaa, L. Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cogn. Emot. 2016, 30, 1081–1106. [Google Scholar] [CrossRef] [PubMed]
  89. Schuller, B.W. Speech emotion recognition: Two decades in a nutshell, benchmarks, and ongoing trends. Commun. ACM 2018, 61, 90–99. [Google Scholar] [CrossRef]
  90. Parsons, T.D. Virtual Reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 2015, 9, 660. [Google Scholar] [CrossRef] [Green Version]
  91. Ekman, P. Basic Emotions. Handb. Cogn. Emot. 1999, 45–60. [Google Scholar] [CrossRef]
  92. Russell, J.A.; Mehrabian, A. Evidence for a three-factor theory of emotions. J. Res. Pers. 1977, 11, 273–294. [Google Scholar] [CrossRef]
  93. Calvo, R.A.; D’Mello, S. Affect detection: An interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 2010, 1, 18–37. [Google Scholar] [CrossRef]
  94. Valenza, G.; Greco, A.; Gentili, C.; Lanata, A.; Sebastiani, L.; Menicucci, D.; Gemignani, A.; Scilingo, E.P. Combining electroencephalographic activity and instantaneous heart rate for assessing brain–heart dynamics during visual emotional elicitation in healthy subjects. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2016, 374, 20150176. [Google Scholar] [CrossRef] [Green Version]
  95. Valenza, G.; Lanata, A.; Scilingo, E.P. The role of nonlinear dynamics in affective valence and arousal recognition. IEEE Trans. Affect. Comput. 2012, 3, 237–249. [Google Scholar] [CrossRef]
  96. Zangeneh Soroush, M.; Maghooli, K.; Setarehdan, S.K.; Motie Nasrabadi, A. A Review on EEG Signals Based Emotion Recognition. Int. Clin. Neurosci. J. 2018, 4, 118–129. [Google Scholar] [CrossRef]
97. Kory, J.; D’Mello, S. Affect elicitation for affective computing. In The Oxford Handbook of Affective Computing; Oxford University Press: New York, NY, USA, 2014; pp. 371–383. [Google Scholar]
  98. Ekman, P. The directed facial action task. In Handbook of Emotion Elicitation and Assessment; Oxford University Press: New York, NY, USA, 2007; pp. 47–53. [Google Scholar]
  99. Harmon-Jones, E.; Amodio, D.M.; Zinner, L.R. Social psychological methods of emotion elicitation. Handb. Emot. Elicitation Assess. 2007, 25, 91–105. [Google Scholar]
  100. Roberts, N.A.; Tsai, J.L.; Coan, J.A. Emotion elicitation using dyadic interaction task. Handb. Emot. Elicitation Assess. 2007, 01, 106–123. [Google Scholar]
  101. Nardelli, M.; Valenza, G.; Greco, A.; Lanata, A.; Scilingo, E.P. Recognizing emotions induced by affective sounds through heart rate variability. IEEE Trans. Affect. Comput. 2015, 6, 385–394. [Google Scholar] [CrossRef]
  102. Kim, J. Emotion Recognition Using Speech and Physiological Changes. Robust Speech Recognit. Underst. 2007, 29, 265–280. [Google Scholar]
  103. Soleymani, M.; Pantic, M.; Pun, T. Multimodal emotion recognition in response to videos (Extended abstract). In Proceedings of the ACII 2015: International Conference on Affective Computing and Intelligent Interaction, Xi’an, China, 21–24 September 2015; Volume 3, pp. 491–497. [Google Scholar]
  104. Jang, D.P.; Kim, I.Y.; Nam, S.W.; Wiederhold, B.K.; Wiederhold, M.D.; Kim, S.I. Analysis of physiological response to two virtual environments: Driving and flying simulation. CyberPsychol. Behav. 2002, 5, 11–18. [Google Scholar] [CrossRef] [PubMed]
  105. Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Affective computing in virtual reality: Emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci. Rep. 2018, 8, 13657. [Google Scholar] [CrossRef] [PubMed]
  106. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [Green Version]
  107. Meehan, M.; Razzaque, S.; Insko, B.; Whitton, M.; Brooks, F.P. Review of four studies on the use of physiological reaction as a measure of presence in stressful virtual environments. Appl. Psychophysiol. Biofeedback 2005, 30, 239–258. [Google Scholar] [CrossRef]
  108. Wilhelm, F.H.; Pfaltz, M.C.; Gross, J.J.; Mauss, I.B.; Kim, S.I.; Wiederhold, B.K. Mechanisms of virtual reality exposure therapy: The role of the behavioral activation and behavioral inhibition systems. Appl. Psychophysiol. Biofeedback 2005, 30, 271–284. [Google Scholar] [CrossRef] [Green Version]
  109. Gorini, A.; Griez, E.; Petrova, A.; Riva, G. Assessment of the emotional responses produced by exposure to real food, virtual food and photographs of food in patients affected by eating disorders. Ann. Gen. Psychiatry 2010, 9, 30. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  110. Philipp, M.C.; Storrs, K.R.; Vanman, E.J. Sociality of facial expressions in immersive virtual environments: A facial EMG study. Biol. Psychol. 2012, 91, 17–21. [Google Scholar] [CrossRef] [PubMed]
  111. Parsons, T.D.; Courtney, C.G.; Dawson, M.E. Virtual reality Stroop task for assessment of supervisory attentional processing. J. Clin. Exp. Neuropsychol. 2013, 35, 812–826. [Google Scholar] [CrossRef] [PubMed]
112. Pallavicini, F.; Cipresso, P.; Raspelli, S.; Grassi, A.; Serino, S.; Vigna, C.; Triberti, S.; Villamira, M.; Gaggioli, A.; Riva, G. Is virtual reality always an effective stressor for exposure treatments? Some insights from a controlled trial. BMC Psychiatry 2013, 13, 52. [Google Scholar] [CrossRef] [Green Version]
  113. Felnhofer, A.; Kothgassner, O.D.; Hetterle, T.; Beutl, L.; Hlavacs, H.; Kryspin-Exner, I. Afraid to be there? Evaluating the relation between presence, self-reported anxiety, and heart rate in a virtual public speaking task. Cyberpsychol. Behav. Soc. Netw. 2014, 17, 310–316. [Google Scholar] [CrossRef]
  114. Hartanto, D.; Kampmann, I.L.; Morina, N.; Emmelkamp, P.G.M.; Neerincx, M.A.; Brinkman, W.-P. Controlling social stress in virtual reality environments. PLoS ONE 2014, 9, e92804. [Google Scholar] [CrossRef]
  115. McCall, C.; Hildebrandt, L.K.; Bornemann, B.; Singer, T. Physiophenomenology in retrospect: Memory reliably reflects physiological arousal during a prior threatening experience. Conscious. Cogn. 2015, 38, 60–70. [Google Scholar] [CrossRef]
  116. Notzon, S.; Deppermann, S.; Fallgatter, A.; Diemer, J.; Kroczek, A.; Domschke, K.; Zwanzger, P.; Ehlis, A.C. Psychophysiological effects of an iTBS modulated virtual reality challenge including participants with spider phobia. Biol. Psychol. 2015, 112, 66–76. [Google Scholar] [CrossRef]
  117. Hildebrandt, L.K.; Mccall, C.; Engen, H.G.; Singer, T. Cognitive flexibility, heart rate variability, and resilience predict fine-grained regulation of arousal during prolonged threat. Psychophysiology 2016, 53, 880–890. [Google Scholar] [CrossRef]
118. Higuera-Trujillo, J.L.; Marín-Morales, J.; Rojas, J.C.; López-Tarruella-Maldonado, J. Emotional maps: Neuro architecture and design applications. In Systems & Design: Beyond Processes and Thinking, Proceedings of the 6th International Forum of Design as a Process, 2016; pp. 677–685. [Google Scholar] [CrossRef] [Green Version]
  119. Bian, Y.; Yang, C.; Gao, F.; Li, H.; Zhou, S.; Li, H.; Sun, X.; Meng, X. A framework for physiological indicators of flow in VR games: Construction and preliminary evaluation. Pers. Ubiquitous Comput. 2016, 20, 821–832. [Google Scholar] [CrossRef]
  120. Shiban, Y.; Diemer, J.; Brandl, S.; Zack, R.; Mühlberger, A.; Wüst, S. Trier Social Stress Test in vivo and in virtual reality: Dissociation of response domains. Int. J. Psychophysiol. 2016, 110, 47–55. [Google Scholar] [CrossRef] [PubMed]
  121. Chirico, A.; Cipresso, P.; Yaden, D.B.; Biassoni, F.; Riva, G.; Gaggioli, A. Effectiveness of Immersive Videos in Inducing Awe: An Experimental Study. Sci. Rep. 2017, 7, 1–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  122. Zou, H.; Li, N.; Cao, L. Emotional response—Based approach for assessing the sense of presence of subjects in virtual building evacuation studies. J. Comput. Civ. Eng. 2017, 31, 4017028. [Google Scholar] [CrossRef]
  123. Breuninger, C.; Sláma, D.M.; Krämer, M.; Schmitz, J.; Tuschen-Caffier, B. Psychophysiological reactivity, interoception and emotion regulation in patients with agoraphobia during virtual reality anxiety induction. Cognit. Ther. Res. 2017, 41, 193–205. [Google Scholar] [CrossRef]
  124. van’t Wout, M.; Spofford, C.M.; Unger, W.S.; Sevin, E.B.; Shea, M.T. Skin conductance reactivity to standardized virtual reality combat scenes in veterans with PTSD. Appl. Psychophysiol. Biofeedback 2017, 42, 209–221. [Google Scholar] [CrossRef]
  125. Banaei, M.; Hatami, J.; Yazdanfar, A.; Gramann, K. Walking through Architectural Spaces: The Impact of Interior Forms on Human Brain Dynamics. Front. Hum. Neurosci. 2017, 11, 1–14. [Google Scholar] [CrossRef]
  126. Anderson, A.P.; Mayer, M.D.; Fellows, A.M.; Cowan, D.R.; Hegel, M.T.; Buckey, J.C. Relaxation with immersive natural scenes presented using virtual reality. Aerosp. Med. Hum. Perform. 2017, 88, 520–526. [Google Scholar] [CrossRef]
  127. Chittaro, L.; Sioni, R.; Crescentini, C.; Fabbro, F. Mortality salience in virtual reality experiences and its effects on users’ attitudes towards risk. Int. J. Hum. Comput. Stud. 2017, 101, 10–22. [Google Scholar] [CrossRef]
  128. Biedermann, S.V.; Biedermann, D.G.; Wenzlaff, F.; Kurjak, T.; Nouri, S.; Auer, M.K.; Wiedemann, K.; Briken, P.; Haaker, J.; Lonsdorf, T.B.; et al. An elevated plus-maze in mixed reality for studying human anxiety-related behavior. BMC Biol. 2017, 15, 125. [Google Scholar] [CrossRef]
  129. Tsai, C.-F.; Yeh, S.-C.; Huang, Y.; Wu, Z.; Cui, J.; Zheng, L. The Effect of Augmented Reality and Virtual Reality on Inducing Anxiety for Exposure Therapy: A Comparison Using Heart Rate Variability. J. Healthc. Eng. 2018, 2018, 27–36. [Google Scholar] [CrossRef] [PubMed]
  130. Kisker, J.; Gruber, T.; Schöne, B. Behavioral realism and lifelike psychophysiological responses in virtual reality by the example of a height exposure. Psychol. Res. 2019, 1–14. [Google Scholar] [CrossRef] [PubMed]
  131. Gromer, D.; Reinke, M.; Christner, I.; Pauli, P. Causal Interactive Links Between Presence and Fear in Virtual Reality Height Exposure. Front. Psychol. 2019, 10, 141. [Google Scholar] [CrossRef] [PubMed]
  132. Zimmer, P.; Wu, C. Same same but different? Replicating the real surroundings in a virtual Trier Social Stress Test (TSST-VR) does not enhance presence or the psychophysiological stress response. Physiol. Behav. 2019, 212, 112690. [Google Scholar] [CrossRef]
  133. Lin, J.; Cao, L.; Li, N. Assessing the influence of repeated exposures and mental stress on human wayfinding performance in indoor environments using virtual reality technology. Adv. Eng. Inform. 2019, 39, 53–61. [Google Scholar] [CrossRef]
  134. Schweizer, T.; Renner, F.; Sun, D.; Becker-Asano, C.; Tuschen-Caffier, B. Cognitive processing and regulation modulates analogue trauma symptoms in a Virtual Reality paradigm. Cognit. Ther. Res. 2019, 43, 199–213. [Google Scholar] [CrossRef]
  135. Kim, Y.; Moon, J.; Sung, N.-J.; Hong, M. Correlation between selected gait variables and emotion using virtual reality. J. Ambient Intell. Humaniz. Comput. 2019, 8, 1–8. [Google Scholar] [CrossRef]
  136. Uhm, J.-P.; Lee, H.-W.; Han, J.-W. Creating sense of presence in a virtual reality experience: Impact on neurophysiological arousal and attitude towards a winter sport. Sport Manag. Rev. 2019. [Google Scholar] [CrossRef]
  137. Takac, M.; Collett, J.; Blom, K.J.; Conduit, R.; Rehm, I.; De Foe, A. Public speaking anxiety decreases within repeated virtual reality training sessions. PLoS ONE 2019, 14, e0216288. [Google Scholar] [CrossRef] [Green Version]
  138. Stolz, C.; Endres, D.; Mueller, E.M. Threat-conditioned contexts modulate the late positive potential to faces—A mobile EEG/virtual reality study. Psychophysiology 2019, 56, e13308. [Google Scholar] [CrossRef]
  139. Granato, M.; Gadia, D.; Maggiorini, D.; Ripamonti, L.A. An empirical study of players’ emotions in VR racing games based on a dataset of physiological data. Multimed. Tools Appl. 2020, 1–30. [Google Scholar] [CrossRef]
  140. Bălan, O.; Moise, G.; Moldoveanu, A.; Leordeanu, M.; Moldoveanu, F. An investigation of various machine and deep learning techniques applied in automatic fear level detection and acrophobia virtual therapy. Sensors 2020, 20, 496. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  141. Reichenberger, J.; Pfaller, M.; Mühlberger, A. Gaze Behavior in Social Fear Conditioning: An Eye-Tracking Study in Virtual Reality. Front. Psychol. 2020, 11, 1–12. [Google Scholar] [CrossRef] [PubMed]
  142. Huang, Q.; Yang, M.; Jane, H.-A.; Li, S.; Bauer, N. Trees, grass, or concrete? The effects of different types of environments on stress reduction. Landsc. Urban Plan. 2020, 193, 103654. [Google Scholar] [CrossRef]
  143. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A review of emotion recognition using physiological signals. Sensors 2018, 18, 2074. [Google Scholar] [CrossRef] [Green Version]
  144. Greco, A.; Valenza, G.; Citi, L.; Scilingo, E.P. Arousal and valence recognition of affective sounds based on electrodermal activity. IEEE Sens. J. 2016, 17, 716–725. [Google Scholar] [CrossRef]
Figure 1. Scheme of the PRISMA procedure followed in the review.
Figure 2. Evolution of the number of papers published each year on the topic of virtual reality and emotions. The total number of papers to be published in 2020 has been extrapolated using data up to 25 March 2020.
Figure 3. Evolution of the number of papers published each year based on emotion analysed.
Figure 4. Evolution of the number of papers published each year based on the implicit measure used.
Figure 5. Evolution of the number of papers published each year by data analysis method used.
Figure 6. Evolution of the number of papers published each year based on head-mounted display (HMD) used.
Table 1. Overview of the main implicit techniques used in human behaviour research.
| Implicit Technique | Biometric Signal Measured | Sensor | Features | Psychological or Behavioural Construct Inferred |
|---|---|---|---|---|
| EDA (electrodermal activity) | Changes in skin conductance | Electrodes attached to fingers, palms or soles | Skin conductance response, tonic activity and phasic activity | Attention and arousal [80] |
| HRV (heart rate variability) | Variability in heart contraction intervals | Electrodes attached to chest or limbs, or optical sensor attached to finger, toe or earlobe | Time domain, frequency domain, non-linear domain | Stress, anxiety, arousal and valence [81,82] |
| EEG (electroencephalogram) | Changes in electrical activity of the brain | Electrodes placed on scalp | Frequency band power, functional connectivity, event-related potentials | Attention, mental workload, drowsiness, fatigue, arousal and valence [83,84] |
| fMRI (functional magnetic resonance imaging) | Concentrations of oxygenated vs. deoxygenated haemoglobin in the blood vessels of the brain | Magnetic resonance signal | Blood-oxygen-level-dependent (BOLD) response | Motor execution, attention, memory, pain, anxiety, hunger, fear, arousal and valence [85] |
| fNIRS (functional near-infrared spectroscopy) | Concentrations of oxygenated vs. deoxygenated haemoglobin in the blood | Near-infrared light placed on scalp | Blood-oxygen-level-dependent (BOLD) response | Motor execution, cognitive tasks (mental arithmetic), decision-making and valence [86] |
| ET (eye-tracking) | Corneal reflection and pupil dilation | Infrared cameras pointed towards the eyes | Eye movements (gaze, fixations, saccades), blinks, pupil dilation | Visual attention, engagement, drowsiness and fatigue [87] |
| FEA (facial expression analysis) | Activity of facial muscles | Camera pointed towards the face | Position and orientation of the head; activation of action units | Basic emotions, engagement, arousal and valence [88] |
| SER (speech emotion recognition) | Voice | Microphone | Prosodic and spectral features | Stress, basic emotions, arousal and valence [89] |
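To make the "Features" column of Table 1 concrete, the sketch below computes the standard HRV time-domain measures it names (heart rate, SDNN, RMSSD) from a series of RR intervals. This is a minimal illustration on synthetic data, assuming RR intervals in milliseconds; the function name and the simulated series are illustrative only and are not drawn from any of the reviewed studies.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic HRV time-domain features from RR intervals given in milliseconds."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr_ms)  # successive differences between adjacent beats
    return {
        "HR_bpm": 60000.0 / rr_ms.mean(),        # mean heart rate
        "SDNN_ms": rr_ms.std(ddof=1),            # overall variability
        "RMSSD_ms": np.sqrt(np.mean(diffs**2)),  # beat-to-beat variability
    }

# Synthetic RR series: ~75 bpm (800 ms) with mild beat-to-beat variation.
rng = np.random.default_rng(seed=0)
rr = 800 + rng.normal(0.0, 25.0, size=300)
print(hrv_time_domain(rr))
```

Frequency-domain (LF, HF) and non-linear measures listed in the table follow the same pattern but require resampling the RR series to an evenly spaced signal first.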
Table 2. Summary of previous research.
| No | Author | Emotion | Signals | Features | Data Analysis | Subjects | HMD | VR Stimuli | Stimuli Comparison | Dataset Availability |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Jang et al. (2002) [104] | Arousal | HRV, EDA | HR, HRV frequency domain, SCL, ST | t-test | 11 | VFX3D | 3D flying and driving simulator | No | No |
| 2 | Meehan et al. (2005) [107] | Arousal | HRV, EDA | HR, SC, ST | t-test | 67 | Not reported | 3D training room vs. pit room | No | No |
| 3 | Wilhelm et al. (2005) [108] | Anxiety | HRV, EDA | HR, SC | ANOVA, correlations | 86 | Not reported | 3D height exposure | Partially (with a different real dataset) | No |
| 4 | Gorini et al. (2010) [109] | Anxiety | HRV, EDA | HR, SC | ANOVA | 30 (20 with food disorders) | Not reported | 3D, photo and real food catering | VR vs. photo vs. real | No |
| 5 | Philipp et al. (2012) [110] | Valence | EMG | EMG | ANOVA | 49 | Virtual Research V8 | 3D room with IAPS pictures projected | No | No |
| 6 | Parsons et al. (2013) [111] | Arousal | HRV, EDA | HR, SC | ANOVA | 50 | eMagin Z800 | 3D high-mobility wheeled vehicle with Stroop task | No | No |
| 7 | Pallavicini et al. (2013) [112] | Stress | HRV, EMG, RSP | HR, SC, RR | ANOVA | 39 | Vuzix VR Bundle | 3D classroom | No | No |
| 8 | Peperkorn et al. (2014) [43] | Fear | HRV, EDA | HR, SC | ANOVA | 96 (48 spider-phobic) | eMagin Z800 | 3D virtual lab with time-varying threat (spiders and snakes) | No | No |
| 9 | Felnhofer et al. (2014) [113] | Anxiety | HRV | HR | ANOVA | 75 (30 high anxiety) | eMagin Z800 | 3D lecture hall | No | No |
| 10 | Hartanto et al. (2014) [114] | Stress | HRV | HR | MANOVA | 24 healthy subjects | eMagin Z800 | 3D stressful social environment | No | No |
| 11 | McCall et al. (2015) [115] | Arousal | HRV, EDA | HR, SC | Cross-correlations | 306 | NVIS nVisor SX60 | 3D room with time-varying threat (explosions, spiders, gunshots, etc.) | No | No |
| 12 | Felnhofer et al. (2015) [54] | Arousal | EDA | SCL | ANOVA | 120 | Sony HMZ-T1 3D | 3D park with 5 variations (joy, sadness, boredom, anger and anxiety) | No | No |
| 13 | Notzon et al. (2015) [116] | Anxiety | HRV, EDA | HR, SC | ANOVA | 83 (42 spider-phobic) | eMagin Z800 | 3D virtual lab with spiders | No | No |
| 14 | Hildebrandt et al. (2016) [117] | Arousal | HRV, EDA | RMSSD, SC | Regression | 300 | NVIS nVisor SX60 | 3D room with time-varying threats (explosions, spiders, gunshots, etc.) | No | No |
| 15 | Higuera-Trujillo et al. (2016) [118] | Stress | EDA | SCR | Kruskal–Wallis test and correlations | 12 | Oculus Rift DK2 | 3D rooms (neutral, stress and calm) | No | No |
| 16 | Bian et al. (2016) [119] | Arousal | HRV, EMG, RSP | HR, LF, HF, LF/HF, RR, RS | Regression | 36 | Oculus Rift DK2 | 3D flight simulator | No | No |
| 17 | Shiban et al. (2016) [120] | Stress | HRV, EDA | HR, SC | ANOVA | 45 | NVIS nVisor SX60 | 3D Trier Social Stress Test | No | No |
| 18 | Chirico et al. (2017) [121] | Awe | HRV, EDA, EMG | HF, VLF, SC | ANOVA | 42 | Samsung Gear VR | 360° neutral and awe videos | Immersive vs. non-immersive | No |
| 19 | Zou et al. (2017) [122] | Arousal | HRV, EDA | HRV time domain (AVNN, SDNN…) and frequency domain (LF, HF…), SC, SCL, SCR | t-test | 40 | Oculus Rift DK2 | 3D fire evacuation | No | No |
| 20 | Breuninger et al. (2017) [123] | Arousal | HRV, EDA | HR, HF, SC | t-test | 51 (23 agoraphobics) | TriVisio VR Vision | 3D car accident | No | No |
| 21 | van’t Wout et al. (2017) [124] | Stress | EDA | SCR | MANOVA | 44 veterans (19 with PTSD) | eMagin Z800 | 3D combat-related and classroom-related scenes | No | No |
| 22 | Banaei et al. (2017) [125] | Arousal, Valence | EEG | PSD, ERSPs | MANOVA | 17 | Samsung Gear VR | 3D rooms | No | No |
| 23 | Anderson et al. (2017) [126] | Stress | HRV, EDA | LF, HF, LF/HF, SC | MANOVA | 18 | Oculus Rift DK2 | 360° indoor vs. natural panoramas | No | No |
| 24 | Chittaro et al. (2017) [127] | Arousal | HRV | HR, LF, HF, LF/HF | ANOVA | 108 | Sony HMZ-T1 3D | 3D cemetery and park | No | No |
| 25 | Higuera-Trujillo et al. (2017) [64] | Pleasantness | HRV, EDA | HF, SCR | Mann–Whitney U tests and correlations | 100 | Samsung Gear VR | 3D, 360° and real retail store | Real vs. 3D VR vs. 360° VR | No |
| 26 | Biedermann et al. (2017) [128] | Anxiety | HRV, EDA, RSP | HR, SC, RR | ANOVA | 100 | HTC Vive | Mixed reality (3D VR with real-world elements) | No | Yes |
| 27 | Tsai et al. (2018) [129] | Anxiety | HRV | HRV time domain (HR, RMSSD…) and frequency domain (HF, LF…) | ANOVA | 30 | eMagin Z800 | 3D claustrophobic environments | Augmented reality vs. VR | Upon request |
| 28 | Marín-Morales et al. (2018) [105] | Arousal, Valence | EEG, HRV | PSD and functional connectivity; HRV time (HR, RMSSD…), frequency (HF, LF…) and non-linear (SD1, SD2, entropy…) domains | SVM | 60 | Samsung Gear VR | 360° virtual rooms | No | Upon request |
| 29 | Kisker et al. (2019) [130] | Arousal | HRV | HR | t-test, correlations and regressions | 30 | HTC Vive | 3D exposure to a high height | No | No |
| 30 | Gromer et al. (2019) [131] | Fear | HRV, EDA | HR, SC | ANOVA | 49 (height-fearful) | HTC Vive | 3D forest | No | Yes |
| 31 | Zimmer et al. (2019) [132] | Stress | HRV, salivary markers | HR, salivary cortisol response, salivary alpha-amylase | ANOVA | 50 | Oculus Rift DK2 | 3D Trier Social Stress Test | Replication of a real study | No |
| 32 | Lin et al. (2019) [133] | Stress | EDA, navigation | SC, travel distance, travel time | Mann–Whitney U | 60 | HTC Vive | 3D building on fire | No | No |
| 33 | Schweizer et al. (2019) [134] | Stress | HRV, EDA | HR, SC | t-test and correlations | 80 | TriVisio VR Vision | 3D neutral and trauma-related scenes | No | No |
| 34 | Kim et al. (2019) [135] | Calm, sadness and joy | Gait patterns | Step count, gait speed, foot plantar pressure | ANOVA | 12 | HTC Vive | 360° emotion-related videos | No | No |
| 35 | Uhm et al. (2019) [136] | Arousal | EEG | PSD | MANOVA | 28 | Samsung Gear VR | 360° sport videos | No | No |
| 36 | Takac et al. (2019) [137] | Anxiety | HRV | HR | ANOVA | 19 | Oculus Rift | 3D rooms with public audience | No | No |
| 37 | Marín-Morales et al. (2019) [65] | Arousal, Valence | HRV, EEG | PSD and functional connectivity; HRV time (HR, RMSSD…), frequency (HF, LF…) and non-linear (SD1, SD2, entropy…) domains | SVM | 60 | HTC Vive | 3D art museum | Real museum vs. 3D museum | Upon request |
| 38 | Stolz et al. (2019) [138] | Fear | EEG | ERPs | ANOVA | 29 | Oculus Rift | 3D room with angry avatars | No | No |
| 39 | Granato et al. (2020) [139] | Arousal, Valence | HRV, EDA, EMG, RSP | HR, SC, SCL, SCR, EMG, RR | SVM, RF, gradient boosting, Gaussian process regression | 33 | Oculus Rift DK2 | 3D video games | No | Yes |
| 40 | Bălan et al. (2020) [140] | Fear | HRV, EDA, EEG | HR, SC, PSD | kNN, SVM, RF, LDA, NN | 8 | HTC Vive | 3D acrophobia game | No | No |
| 41 | Reichenberger et al. (2020) [141] | Fear | Eye-tracking | Fixation counts, TTFF | ANOVA, t-test | 53 (26 socially anxious) | HTC Vive | 3D room with angry avatars | No | Upon request |
| 42 | Huang et al. (2020) [142] | Stress | EDA | SCL | MANOVA | 89 | Oculus Rift DK2 | 360° built vs. natural environments | No | Yes |
Signals: electroencephalogram (EEG), heart rate variability (HRV), electrodermal activity (EDA), respiration (RSP) and electromyography (EMG). Features: heart rate (HR), high frequency (HF), low frequency (LF), low/high frequency ratio (LF/HF), very low frequency (VLF), total skin conductance (SC), skin conductance tonic level (SCL), fast-varying phasic activity (SCR), skin temperature (ST), respiratory rate (RR), respiratory depth (RS), power spectral density (PSD), event-related spectral perturbations (ERSPs), event-related potentials (ERPs) and time to first fixation (TTFF). Data analysis: support vector machine (SVM), k-nearest neighbours (kNN), random forest (RF), linear discriminant analysis (LDA) and neural networks (NN).
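To illustrate the "Data Analysis" column, the sketch below reproduces the generic subject-independent set-up reported by the machine-learning studies in Table 2: standardised physiological features, an SVM classifier and leave-one-subject-out validation. All arrays are synthetic placeholders; the shapes, labels and parameters are assumptions for illustration, not values from any cited dataset.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(seed=42)
n_subjects, trials_per_subject, n_features = 20, 8, 12

# Placeholder feature matrix (e.g., HRV, EDA and EEG features per trial),
# binary labels (e.g., high vs. low arousal) and per-trial subject IDs.
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 2, size=len(X))
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

# Standardise features, then classify with an RBF-kernel SVM, validated
# subject-independently: each fold holds out all trials of one subject.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Leave-one-subject-out accuracy: {scores.mean():.2f}")
```

The same scaffold accommodates the other estimators listed in the table (kNN, RF, LDA, NN) by swapping the final pipeline step.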
Table 3. Previous research that included analyses of the validation of virtual reality (VR).
| Type of Validation | % of Papers | Number of Papers |
|---|---|---|
| No validation | 83.33% | 35 |
| Real | 7.14% | 3 |
| Format | 7.14% | 3 |
| Immersivity | 2.38% | 1 |
| Previous datasets | 2.38% | 1 |
| Replication | 2.38% | 1 |
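As a quick consistency check, Table 3's percentages can be re-derived from its counts: each share is computed over the 42 reviewed papers, and since the six categories gather 44 assignments in total, a paper can evidently fall under more than one validation type, so the column need not sum to 100%. A minimal sketch:

```python
# Counts taken directly from Table 3; shares are computed over the 42
# reviewed papers, so the overlapping categories need not sum to 100%.
counts = {
    "No validation": 35,
    "Real": 3,
    "Format": 3,
    "Immersivity": 1,
    "Previous datasets": 1,
    "Replication": 1,
}
n_papers = 42
for label, n in counts.items():
    print(f"{label}: {n / n_papers:.2%} ({n} of {n_papers} papers)")
```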
