Embodied virtual reality for the study of real-world motor learning

Shlomi Haar (1), Guhan Sundar (1), A. Aldo Faisal (1,2,3,4)
doi: https://doi.org/10.1101/2020.03.19.998476
1 Brain and Behaviour Lab: Dept. of Bioengineering, Imperial College London, London, UK
2 Brain and Behaviour Lab: Dept. of Computing, Imperial College London, London, UK
3 Brain and Behaviour Lab: UKRI Centre for Doctoral Training in AI for Healthcare, Imperial College London, London, UK
4 Brain and Behaviour Lab: MRC London Institute of Medical Sciences, Imperial College London, London, UK

Correspondence: s.haar@imperial.ac.uk, aldo.faisal@imperial.ac.uk

Abstract

Background The motor learning literature focuses on relatively simple laboratory tasks because they are highly controlled and allow easy application of different manipulations to induce learning and adaptation. In recent work we introduced a billiards paradigm and demonstrated the feasibility of real-world neuroscience using wearables for naturalistic full-body motion tracking and mobile brain imaging. Here we developed an embodied virtual reality (VR) environment for our real-world billiards paradigm, which allows us to control the visual feedback in this complex real-world task while maintaining the sense of embodiment.

Methods The setup was validated by comparing real-world ball trajectories with the embodied VR trajectories calculated by the physics engine. We then ran our real-world learning protocol in the embodied VR. Ten healthy human subjects played repeated trials of the same billiards shot, holding the physical cue and hitting a physical ball on the table while seeing it all in VR.

Results We found comparable learning trends in the embodied VR to those we previously reported in the real-world task.

Conclusions Embodied VR can be used to study the learning of real-world tasks in a highly controlled VR environment, enabling the visual manipulations common in laboratory tasks and in rehabilitation to be applied to a real-world full-body task. Such a setup can be used for rehabilitation, where VR is gaining popularity but transfer to the real world is currently limited, presumably due to the lack of embodiment. Embodied VR makes it possible to manipulate feedback and apply perturbations that isolate and assess interactions between specific motor learning mechanisms, and thus to address current questions of motor learning in real-world tasks.

Background

Motor skill learning is a key feature of our development and our daily lives, from a baby learning to crawl, to an adult learning crafts or sports, or undergoing rehabilitation after an injury or a stroke. It is a complex process which involves movement in many degrees of freedom (DoF) and multiple learning mechanisms. Yet the majority of the motor learning literature focuses on simple lab-based tasks with limited DoF, such as force-field adaptations [e.g. 1–4], visuomotor perturbations [e.g. 5–9], and sequence learning of finger tapping or pinching tasks [e.g. 10–13]. The real-world neuroscience approach studies neurobehavioral processes in natural behavioral settings [14–17]. We recently presented a naturalistic real-world motor learning paradigm, using wearables for full-body motion tracking and EEG for mobile brain imaging, while people performed an actual real-world task: playing the competitive sport of pool-table billiards [18,19]. We showed that motor learning is a full-body process that involves multiple learning mechanisms, and that different subjects may rely more on one mechanism than another.

Now we want to introduce manipulations of real-world tasks to establish causality. While the study of real-world tasks takes us closer to understanding real-world motor learning, it lacks the key advantage of lab-based toy tasks (which made them so popular): highly controlled manipulations of known variables that isolate specific movement or learning components. Virtual Reality (VR) provides a handy solution to this problem, making it possible to apply controlled manipulations within a real-world task [20]. VR has clear benefits such as ease of controlling repetition, feedback, and motivation, as well as overall advantages in safety, time, space, equipment, cost efficiency, and ease of documentation [21,22]. Thus, it is commonly used in rehabilitation after stroke [23,24] or brain injury [25,26], and for Parkinson’s disease [27,28]. In simple sensorimotor lab-based motor learning paradigms, VR training was shown to yield results equivalent to real training [29–31], though adaptation in VR appears to rely more on explicit/cognitive strategies [31].

While VR is very good for visual immersion, it often lacks the Sense of Embodiment (SoE). SoE refers to the senses associated with being inside, having, and controlling a body [32]. SoE requires a sense of self-location, agency, and body ownership [33–35]. This study aims to set up and validate an Embodied Virtual Reality (EVR) for real-world motor learning, which would allow highly controlled manipulations to be applied in a real-world task. We developed an EVR version of our billiards paradigm [18] by synchronizing the positions of the real-world billiards objects (table, cue stick, balls) into the VR environment using motion capture (MoCap). Thus, the participant plays with a physical cue and a physical ball on the physical pool table while seeing it all in VR (https://youtu.be/m68_UYkMbSk). We ran our real-world billiards experimental protocol in this novel EVR to explore the similarities and differences in learning between the real-world paradigm and its EVR mockup.

Methods

Experimental Setup

Our EVR experimental setup (Figure 1A) was composed of a real-world environment of a physical pool table, a VR (Unity3d, HTC Vive) environment of the game (Figure 1B), and MoCap linking the two environments (Figure 1C). The positions of the virtual billiards table and balls that the subjects saw in VR were matched with their respective real-world positions during calibration, and the cue-stick trajectory was streamed into the VR using MoCap (Optitrack, Motiv). This allowed subjects to engage in the VR task the same way they would in the real world. Game object trajectories were collected directly in Unity3d, and the full-body movement was recorded with a suit of inertial measurement units (IMUs).

Figure 1. Experimental setup and calibration.

(A) 10 right-handed healthy subjects performed 300 repeated trials of billiards shots in Embodied Virtual Reality (EVR). Green arrows mark the MoCap markers used to track and stream the cue-stick movement into the EVR environment. (B) Scene view in the EVR. Subjects were instructed to hit the cue ball (white), which was a physical ball on the table (in A), in an attempt to shoot the virtual target ball (red) towards the far-left corner. (C) For environment calibration, MoCap markers were attached to the HTC Vive controllers, which were placed in the pool table’s pockets, with an additional solo marker at the cue-ball position.

Real-world objects included the same billiards table, cue ball, target ball, and cue stick used in our real-world billiards study [18]. Subjects were unable to see anything in the real-world environment; they could only see a virtual projection of the game objects. They were, however, able to receive tactile feedback from the objects by interacting with them.

Four MoCap cameras (Optitrack, Motiv) with Motiv software were used to stream the position of the real-world cue stick into the VR using four markers on the stick (Figure 1A). The position of each marker was streamed to Unity3d using the NATNET Optitrack Unity3d Client plugin and its associated Optitrack Streaming Client script, edited for this application. The positions were transformed from the Optitrack environment to the Unity3d environment with a transformation matrix derived during calibration. The cue-stick asset was then reconstructed in VR using known geometric quantities of the cue stick and the marker locations (Figure 1B). The placement of the markers on the cue stick, as well as the position and orientation of the cameras, were key to providing consistent marker tracking and accurate control in VR without significantly constraining the subject's movement. The rotation of the cue stick or the position of the subject can interfere with the line of sight between the markers and the cameras. Thus, to prevent errors in cue tracking, if the markers became untracked the cue stick disappeared from the visual scene until proper tracking resumed.
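The reconstruction step can be sketched as follows. This is only an illustrative MATLAB fragment of the geometry (the actual implementation was a C# script inside Unity3d), and the variable names markerPos and tipOffset are ours, not from the original code.

% Illustrative sketch of reconstructing the cue-stick line from its markers.
% markerPos: 4x3 matrix of streamed marker positions (rows assumed ordered from
% the tip end towards the butt); tipOffset: known distance (m) from the front
% marker to the cue tip. Both names are hypothetical.
c = mean(markerPos, 1);                  % centroid of the four markers
[~, ~, V] = svd(markerPos - c, 0);       % principal axis of the marker cloud
stickDir = V(:, 1)';                     % unit vector along the stick
if dot(stickDir, markerPos(1,:) - markerPos(end,:)) < 0
    stickDir = -stickDir;                % orient the axis towards the tip
end
tipPos = markerPos(1, :) + tipOffset * stickDir;   % extrapolated cue-tip position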

The VR billiards environment was built with the Unity3d physics engine. The head-mounted display (HMD) was an HTC Vive Pro, and the frame rate of the VR display was 90 Hz. The Unity3d assets (billiards table, cue stick, balls) were taken from an open-source Unity3d project [36] and scaled to match the dimensions of the real-world objects. Scripts were developed in C# to manage game-object interactions, apply physics, and record data, and custom physics were developed in Unity3d for the game collisions. The cue stick–cue ball collision force in Unity3d is computed from the median velocity and direction of the cue stick over the 10 frames (∼0.11 seconds) before contact. Tactile and auditory feedback for this initial collision comes from the real-world objects. The cue ball–target ball collision is hard-coded as a perfectly inelastic collision, and a billiard-ball sound effect is output to the Vive headphones during this collision. The default Unity3d engine was used for ball dynamics, with mass and friction parameters tuned to match real-world ball behavior as closely as possible. For the game-physics validation, the physical cue ball on the pool table was tracked with a high-speed camera (Dalsa Genie Nano) and its trajectories were compared with those of the VR ball.
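As a concrete illustration of the impact computation described above, the median-velocity estimate can be sketched as follows. This is a MATLAB sketch of the logic only (the actual implementation was a C# Unity3d script), and tipPos is our stand-in for the per-frame cue-tip positions.

% Sketch of the shot-force estimate: median cue-tip velocity and direction over
% the 10 frames (~0.11 s at 90 Hz) preceding cue-ball contact.
% tipPos is assumed to be an Nx3 matrix of cue-tip positions logged every frame.
dt = 1/90;                                  % VR frame period (s)
pre = tipPos(end-10:end, :);                % last 11 samples before contact
v = diff(pre) / dt;                         % 10 per-frame velocity vectors
speed = sqrt(sum(v.^2, 2));                 % per-frame speeds
impactSpeed = median(speed);                % median speed at impact
dirs = v ./ speed;                          % unit direction per frame
impactDir = median(dirs, 1);
impactDir = impactDir / norm(impactDir);    % re-normalise the elementwise median
cueBallVel = impactSpeed * impactDir;       % velocity handed to the physics engine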

For the calibration between the environments, the y-axis was set directly upwards (orthogonal to the ground plane) in both the Unity3d and Optitrack environments during their respective initial calibrations. This means only a 2D (x-z) transformation between environments is required, with a linear ratio to scale the height. The transformation matrix was determined by matching the positions of known coordinates in both the Unity3d and Optitrack environments. We attached markers to the Vive controllers and, during calibration, placed the controllers in the corner pockets of the table and a solo marker at the cue-ball location (Figure 1C), to compute the transformation matrix as well as the position and scale of the real-world table. This transformation matrix was then used to transform points from the Optitrack environment into the Unity3d space.
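The calibration amounts to estimating a planar similarity transform from a handful of matched landmarks. A minimal MATLAB sketch is given below; the actual calibration ran inside Unity3d in C#, and the point-set names are ours.

% Sketch of the 2D (x-z) Optitrack-to-Unity calibration from matched landmarks
% (the controllers in the corner pockets plus the solo cue-ball marker).
% P_opti and P_unity: Nx2 matrices of matched (x,z) positions; names are ours.
function [s, R, t] = calibrateXZ(P_opti, P_unity)
    muO = mean(P_opti, 1);  muU = mean(P_unity, 1);
    A = P_opti - muO;       B = P_unity - muU;          % centred point sets
    s = sqrt(sum(B(:).^2) / sum(A(:).^2));              % uniform scale estimate
    [U, ~, V] = svd(A' * B);                            % 2x2 cross-covariance
    R = V * diag([1, sign(det(V * U'))]) * U';          % reflection-safe rotation
    t = muU' - s * R * muO';                            % translation
end
% A streamed Optitrack point p (1x2, x-z) then maps to Unity as p_unity = (s*R*p' + t)'.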

Experimental Design

Ten right-handed healthy human volunteers with normal or corrected-to-normal visual acuity (4 women and 6 men, aged 24±2) participated in the study, following the experimental protocol of Haar et al [18]. The volunteers, who had little to no previous experience with playing billiards, performed 300 repeated trials in the EVR setup in which the cue ball (white) and the target ball (red) were placed in the same locations and the subject was asked to shoot the target ball towards the pocket of the far-left corner (Figure 1B). VR trials ended when the ball velocities fell below a threshold value, and the next trial began when the subject moved the cue-stick tip a set distance away from the cue-ball start position. The trials were split into 6 sets of 50 trials with short breaks in between. For the data analysis we further split each set into two blocks of 25 trials each, resulting in 12 blocks. During the entire learning process, we recorded the subjects’ full-body movements with a motion tracking ‘suit’ of 17 wireless inertial measurement units (IMUs). The movement of all game objects in Unity3d (most notably the ball and cue-stick trajectories relative to the table) was captured every frame at 90 Hz.

Full-Body Motion Tracking

Kinematic data were recorded at 60 Hz using a wearable motion tracking ‘suit’ of 17 wireless IMUs (Xsens MVN Awinda, Xsens Technologies BV, Enschede, The Netherlands). Data acquisition was done via a graphical interface (MVN Analyze, Xsens Technologies BV, Enschede, The Netherlands). The Xsens joint angles and position data were exported as XML files and analyzed using custom software written in MATLAB (R2017a, The MathWorks, Inc., MA, USA). The Xsens full-body kinematics were extracted as joint angles in 3 degrees of freedom for each joint, following the International Society of Biomechanics (ISB) recommendations for Euler angle extraction: Z (flexion/extension), X (abduction/adduction), Y (internal/external rotation).

Movement Velocity Profile Analysis

From the joint angles we extracted the velocity profiles of all joints in all trials. We defined the peak of the trial as the peak of the average absolute velocity across the DoFs of the right shoulder and the right elbow. We aligned all trials around this peak and cropped a 1 s window around it for the analysis of joint angles and velocity profiles.
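A minimal MATLAB sketch of this alignment step is given below; jointVel and armIdx are our stand-ins for one trial's joint angular velocities and for the indices of the right-shoulder and right-elbow DoFs.

% Align one trial around its movement peak and crop a 1 s analysis window.
% jointVel: T x nDoF matrix of joint angular velocities sampled at 60 Hz.
fs = 60;
armSpeed = mean(abs(jointVel(:, armIdx)), 2);   % mean |velocity| over the right
                                                % shoulder and elbow DoFs
[~, pk] = max(armSpeed);                        % peak of the trial
half = round(fs / 2);                           % half a second on each side
idx = max(1, pk - half) : min(size(jointVel, 1), pk + half);
trialWindow = jointVel(idx, :);                 % ~1 s window used for the profile analyses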

Results

Ball trajectories validation

To validate how well a billiards shot in the EVR resembles the same shot in real life, the cue-ball trajectories of 100 shots in various directions (−50° < ø < 50°, where 0° is straight ahead) were compared between the two environments. The cue-ball angles were near-perfectly correlated (Pearson correlation r = 0.99), with a root mean squared error (RMSE) below 3 degrees (RMSE = 2.85°). Thus, the angle of the virtual ball in the EVR, which defines the performance in this billiards task, was highly consistent with the angle of the real-world ball (Figure 2A). The velocities were also highly correlated between the environments (Pearson correlation r = 0.83), but the ball velocities in the VR were slightly slower than on the real pool table (Figure 2B), leading to an RMSE of 1.03 m/s.
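For reference, the two agreement metrics reduce to a few lines of MATLAB; angleReal and angleVR are our stand-ins for the 100 measured and simulated shot angles.

% Agreement between real-world and VR cue-ball directions for the 100 test shots.
R = corrcoef(angleReal, angleVR);               % Pearson correlation matrix
r = R(1, 2);                                    % reported r = 0.99 for the angles
rmse = sqrt(mean((angleReal - angleVR).^2));    % reported RMSE = 2.85 degrees
% Applying the same two lines to the peak ball speeds gives the velocity comparison.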

Figure 2. Ball trajectories validation.

Cue-ball trajectory comparison between VR and the real world for 100 billiard shots at various directions (−50° < ø < 50°, where 0° is straight ahead). (A) Cue-ball angles; each dot is a trial. (B) Max velocity of the cue ball during each trial. The regression line is in black with its 95% CI in dotted lines. The identity line is in light gray.

Motor Learning Experiment

To compare the learning of the billiards shot in our EVR to learning in real life, we ran the same experimental protocol as in Haar et al [18] and compared mean subject performance. Accordingly, the trials were divided into blocks of 25 trials each (each experimental set of 50 trials was divided into two blocks to increase resolution) to assess performance. Over blocks, there is a gradual decay in the mean absolute directional error (Figure 3A). The error was defined as the absolute angular difference between the direction of the target ball’s movement vector and the direction required to land the target ball in the center of the pocket. The decay of this error over trials is the clearest signature of learning in the task. Accordingly, success rates increase over blocks (Figure 3B). We also see a decay in inter-subject variability over learning, reflected in the decreasing size of the error bars of the directional error over time (Figure 3A). These learning trends in the directional error and success rates are similar to those reported in the real world. Nevertheless, there are clear differences in the learning curve: in the EVR, learning is slower than in the real-world task and subjects’ performance is worse. The most striking difference between the environments is in the intertrial variability (Figure 3C). In the real-world task there was a clear decay in intertrial variability throughout the experiment, whereas in the EVR we see no clear trend. The corrected intertrial variability (Figure 3D), calculated to account for learning within each block [18], also showed no learning trend.
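A minimal MATLAB sketch of these block-wise measures follows; shotDir and targetDir are our stand-ins for the per-trial direction of the target ball and the desired direction into the pocket center, in degrees.

% Directional error and intertrial variability over blocks of 25 trials.
err = abs(mod(shotDir - targetDir + 180, 360) - 180);   % absolute angular error per trial
blocks = reshape(err(1:300), 25, 12);                    % 12 blocks of 25 trials
blockErr = mean(blocks, 1);                              % learning curve (cf. Figure 3A)
blockVar = std(blocks, 0, 1);                            % intertrial variability (cf. Figure 3C)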

Figure 3. Task performance in EVR vs Real-world.

(A) The mean absolute directional error of the target ball, (B) the success rate, (C) the directional variability, and (D) the directional variability corrected for learning (see text), all presented over blocks of 25 trials. Solid and dashed lines are for EVR and real-world, respectively.

The full-body movements were analyzed over the velocity profiles of all joints, as those are less sensitive to potential drifts in the IMUs and are more robust and reproducible in natural behavior across subjects and across trials [18,37].

The velocity profiles of the different joints in the EVR showed that the movement was predominantly in the right arm, as expected. The velocity profiles of the right arm showed the same changes over learning as in the real-world task: the shoulder velocities decreased from the initial trials to the trials of the learning plateau, suggesting less shoulder movement, while the elbow rotation increased in velocity over learning (Figure 4). The covariance matrix over the velocity profiles of the different joints, averaged across blocks of trials of all subjects, emphasizes this trend. In the first block, most of the variance in the movement is in the right shoulder, while in the 9th block (trials 201-225, the beginning of the learning plateau) the covariance matrix has an overall similar structure but with a strong decrease in the shoulder variance and a strong increase in the variance of the right elbow rotation (Figure 5A). This trend is similar to, and even more robust than, the one observed in the real-world task.

Figure 4. Velocity profiles in EVR vs Real-world.

Velocity profiles in 3 degrees of freedom (DoF) for each joint of the right arm. Blue lines are the velocity profiles during the first block (trials 1-25), and red lines are the velocity profiles after learning plateaus, during the ninth block (trials 201-225). Solid and dashed lines are for EVR and real-world, respectively.

Figure 5. Variance and Complexity comparison.

(A) The variance-covariance matrix of the right arm joints’ velocity profiles in EVR, averaged across subjects and trials over the first block, the second block, and the ninth block (after learning plateaus). (B) The trial-by-trial generalized variance (GV), with a double-exponential fit (red curve). (C) The number of principal components (PCs) that explain more than 1% of the variance in the velocity profiles of all joints in a single trial, with an exponential fit (red curve). (D) The manipulative complexity (Belić and Faisal, 2015), with an exponential fit (red curve). (B-D) Averaged across all subjects over all trials. Grey dots are the trial averages for the EVR data. Solid and dashed red lines are fits for EVR and real-world, respectively.

The generalized variance (GV; the determinant of the covariance matrix [38]) over the velocity profiles of all joints was lower in the EVR than in the real world but showed the same trend: a fast increase over the first ∼30 trials followed by a slow decrease (Figure 5B), suggesting active control of the exploration-exploitation trade-off. The covariance matrices (Figure 5A) show that the changes in the GV were driven by an initial increase, followed by a decrease, in the variance of the right shoulder. As in the real world, in the EVR the internal/external rotation of the right elbow showed a continuous increase in its variance, which did not follow the trend of the GV.
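In code, the GV of a single trial is simply the determinant of the joint-velocity covariance; a MATLAB sketch, where velProfiles is our stand-in for one trial's matrix of velocity profiles:

% Generalized variance of one trial (Wilks, 1932): determinant of the covariance
% of the joint velocity profiles. velProfiles: nDoF x nTimepoints.
C = cov(velProfiles');          % nDoF x nDoF covariance across joint DoFs
GV = det(C);                    % trial-by-trial GV plotted in Figure 5B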

Principal component analysis (PCA) across joints of the velocity profiles per trial for each subject showed that in the EVR subjects used more degrees of freedom in their movement than in the real-world task (Figure 5C&D). While in both environments, in all trials, ∼90% of the variance can be explained by the first PC, there is a slow but consistent rise in the number of PCs that explain more than 1% of the variance in the joint velocity profiles (Figure 5C). The manipulative complexity, suggested by Belić and Faisal [39] as a way to quantify complexity for a given number of PCs on a fixed scale (C = 1 implies that all PCs contribute equally, and C = 0 if one PC explains all data variability), showed the same trend (Figure 5D). This suggests that, in both environments, subjects use more degrees of freedom in their movement as trials progress, and that in the EVR they used slightly more DoF than in the real-world task.
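The dimensionality measures can be sketched in MATLAB as follows (velProfiles as above). The exact normalization of the manipulative complexity is given in Belić and Faisal [39]; the expression below is only one normalization consistent with the boundary conditions stated in the text, not necessarily their formula.

% Per-trial dimensionality of the joint velocity profiles.
[~, ~, latent] = pca(velProfiles');            % variance carried by each PC
p = latent / sum(latent);                      % fraction of variance per PC
nPCs = sum(p > 0.01);                          % PCs explaining >1% (Figure 5C)
N = numel(p);
C = 1 - N / (N - 1) * sum((p - 1/N).^2);       % 1 if all PCs contribute equally,
                                               % 0 if a single PC explains everything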

As a measure of task performance in body space we used the Velocity Profile Error (VPE), as in Haar et al [18]. The VPE is defined as the mean correlation distance (one minus the Pearson correlation coefficient) between the velocity profile of each joint in each trial and the velocity profiles of that joint in all successful trials. As in the real world, in the EVR environment we found that the VPE shows a clear pattern of decay over trials in an exponential learning curve for all joints (Figure 6A). Intertrial variability in joint movement was also measured over the VPEs in each block. Unlike in the real-world task, where learning was also evident in the decay of the VPE intertrial variability, in the EVR there was no such decay in most joints (Figure 6B). This is in line with the lack of decay in the intertrial variability of the directional error (Figure 3C&D).
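A MATLAB sketch of the VPE for a single joint DoF and trial; vel and successVel are our stand-ins for that DoF's velocity profile in the current trial and its profiles in all successful trials.

% Velocity Profile Error: mean correlation distance to the successful trials.
% vel: 1 x T velocity profile; successVel: nSuccess x T profiles of the same DoF.
nS = size(successVel, 1);
d = zeros(nS, 1);
for k = 1:nS
    R = corrcoef(vel, successVel(k, :));
    d(k) = 1 - R(1, 2);                 % correlation distance to one successful trial
end
VPE = mean(d);                          % averaged to give the VPE of this DoF and trial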

Figure 6. Learning over Joints.

Velocity Profile Error (VPE) and intertrial variability across all joints in the EVR task. (A) The trial-by-trial VPE for all 3 DoF of all joints, averaged across all subjects, with an exponential fit. The time constants of the fits are reported under each title, color-coded by DoF (blue: flexion/extension; red: abduction/adduction; green: internal/external rotation). (B) VPE intertrial variability over blocks of 25 trials, averaged across all subjects.

Discussion

In this paper, we present a novel embodied VR framework capable of providing controlled manipulations in order to better study naturalistic motor learning in a complex real-world setting. By interfacing real-world objects with the VR environment, we were able to provide the SoE that subjects experience in the real world, which is missing in most VR environments. We demonstrate the similarities and differences between learning in the EVR environment and learning in the real-world environment. This is the first study to directly compare full-body motor learning in a real-world task between VR and the real world.

There is much evidence that humans can learn motor skills in VR and transfer that learning from VR to the real world, but there is also a real need to enhance this transfer to make VR truly useful for rehabilitation applications [for review see 20]. Theory suggests that transfer should be enhanced when VR simulates the real world as closely as possible. The physical interaction with real-world objects in our EVR adds haptic information about interaction forces with virtual objects, which is lacking in most VR setups, and should enhance learning and transfer.

Comparison of the motor learning in the EVR to learning in the real-world task showed many similarities but also intriguing differences. The main trends over learning found in the real-world task were a decrease in directional error, a decrease in directional intertrial variability, a decrease in shoulder movement and increase in elbow rotation, a decrease in joint VPE, and a decrease in joint VPE intertrial variability [18]. In the EVR environment we found the same general trends for all these metrics, except for those of the intertrial variability. We do, however, see a systematic difference in learning rates between VR and the real world when comparing the directional error and VPE trends: across the board we see less learning in VR than in the real world.

Decay in intertrial variability over learning is considered a feature of motor skill learning and specifically of motor acuity [40,41]. The lack of such decay and the overall differences in the learning curve suggest potential differences in the learning mechanisms used by the subjects who learned the task in VR. These differences may be attributed to the fact that all subjects were completely naïve to the EVR environment and had to learn not just the billiards task but also how to operate in the VR.

Limitations

The comparison of the ball trajectories between the EVR and the real-world environments highlights the similarity in the ball directions, which is the main parameter determining task error and success. Nevertheless, there were significant velocity differences between the environments. These differences arise because the game physics were tuned to optimize the subjects’ experience, while deviations in ball physics due to friction, spin, and follow-through were not modeled in the VR. Due to these deviations, in VR the cue ball reaches its maximum velocity almost instantaneously, whereas in the real world there is an acceleration phase. For the current version of the setup we neglected these differences, assuming they would not affect the SoE of very naïve pool players. Future studies testing experts on the setup would require more accurate game physics in the EVR. Another limitation of the current EVR setup is that subjects are unable to see their own limbs in the environment, whereas in the real world the positions of the subject’s own limbs may influence how the task is learned. We can probably neglect this difference given the extensive literature suggesting that learning is optimized by an external focus of attention [for review see 42]; thus, the lack of body vision should not significantly affect learning. Lastly, our setup is limited to visual perturbations and cannot be used to manipulate haptic force feedback.

Conclusions

In this study we developed an embodied VR framework capable of applying visual feedback manipulations for a naturalistic, free-moving, real-world skill task. We demonstrated the similarities in learning a billiards shot between the EVR and the real world and confirmed the finding that motor learning is holistic. By manipulating the visual feedback in the EVR we can now further investigate the relationships between the distinct learning strategies employed by humans for this real-world motor skill. Such a setup can also be highly useful for rehabilitation, as it addresses the lack of embodiment that limits transfer to the real world in regular VR setups. Considering the potential and increasing popularity of VR-based rehabilitation [20], the importance of EVR is clear.

Ethics statement

All experimental procedures were approved by the Imperial College Research Ethics Committee and performed in accordance with the Declaration of Helsinki. All subjects gave informed consent prior to participating in the study.

Availability of data and materials

The datasets used and/or analyzed in the current study are available from the corresponding author on reasonable request.

Competing interests

The authors declare no competing financial interests.

Funding

The study was enabled by financial support from a Royal Society-Kohn International Fellowship (NF170650; SH & AAF).

Authors’ contributions

SH and AAF conceived and designed the study; SH, GS, and AAF developed the experimental setup; GS acquired the data; GS and SH analyzed the data; SH, GS, and AAF interpreted the data; SH drafted the paper; SH and AAF revised the paper.

Acknowledgements

We thank our participants for taking part in the study.

Abbreviations

EVR: embodied virtual reality
DoF: degrees of freedom
MoCap: motion capture
SoE: sense of embodiment
IMU: inertial measurement unit
HMD: head-mounted display
RMSE: root mean squared error
GV: generalized variance
PCA: principal component analysis
VPE: velocity profile error

References

1. Smith MA, Ghazizadeh A, Shadmehr R. Interacting adaptive processes with different timescales underlie short-term motor learning. PLoS Biol. 2006;4:e179.
2. Shadmehr R, Mussa-Ivaldi FA. Adaptive representation of dynamics during learning of a motor task. J Neurosci. 1994;14:3208–24.
3. Diedrichsen J, Hashambhoy Y, Rane T, Shadmehr R. Neural correlates of reach errors. J Neurosci. 2005;25:9919–31.
4. Howard IS, Wolpert DM, Franklin DW. The value of the follow-through derives from motor learning depending on future actions. Curr Biol. 2015;25:397–401.
5. Taylor JA, Krakauer JW, Ivry RB. Explicit and implicit contributions to learning in a sensorimotor adaptation task. J Neurosci. 2014;34:3023–32.
6. Krakauer JW, Pine Z, Ghilardi M, Ghez C. Learning of visuomotor transformations for vectorial planning of reaching trajectories. J Neurosci. 2000;20:8916–24.
7. Mazzoni P, Krakauer J. An implicit plan overrides an explicit strategy during visuomotor adaptation. J Neurosci. 2006;26:3642–5.
8. Haar S, Donchin O, Dinstein I. Dissociating visual and motor directional selectivity using visuomotor adaptation. J Neurosci. 2015;35:6813–21.
9. Bromberg Z, Donchin O, Haar S. Eye movements during visuomotor adaptation represent only part of the explicit learning. eNeuro. 2019;6:1–12.
10. Clerget E, Poncin W, Fadiga L, Olivier E. Role of Broca’s area in implicit motor skill learning: evidence from continuous theta-burst magnetic stimulation. J Cogn Neurosci. 2012;24:80–92.
11. Ma L, Narayana S, Robin DA, Fox PT, Xiong J. Changes occur in resting state network of motor system during 4 weeks of motor skill learning. Neuroimage. 2011;58:226–33.
12. Reis J, Schambra HM, Cohen LG, Buch ER, Fritsch B, Zarahn E, et al. Noninvasive cortical stimulation enhances motor skill acquisition over multiple days through an effect on consolidation. Proc Natl Acad Sci U S A. 2009;106:1590–5.
13. Yokoi A, Arbuckle SA, Diedrichsen J. The role of human primary motor cortex in the production of skilled finger sequences. J Neurosci. 2018;38:1430–42.
14. Faisal A, Stout D, Apel J, Bradley B. The manipulative complexity of Lower Paleolithic stone toolmaking. PLoS One. 2010;5:e13718.
15. Hecht EE, Gutman DA, Khreisheh N, Taylor SV, Kilner J, Faisal AA, et al. Acquisition of Paleolithic toolmaking abilities involves structural remodeling to inferior frontoparietal regions. Brain Struct Funct. 2014;220:2315–31.
16. Xiloyannis M, Gavriel C, Thomik AAC, Faisal AA. Gaussian process autoregression for simultaneous proportional multi-modal prosthetic control with natural hand kinematics. IEEE Trans Neural Syst Rehabil Eng. 2017;25:1785–801.
17. Rito Lima I, Haar S, Di Grassi L, Faisal AA. Neurobehavioural signatures in race car driving. bioRxiv. 2019;860056. Available from: https://www.biorxiv.org/content/10.1101/860056v1
18. Haar S, van Assel CM, Faisal AA. Kinematic signatures of learning that emerge in a real-world motor skill task. bioRxiv. 2019;612218. Available from: https://www.biorxiv.org/content/10.1101/612218v3
19. Haar S, Faisal AA. Neural biomarkers of multiple motor-learning mechanisms in a real-world task. bioRxiv. 2020;2020.03.04.976951. Available from: https://www.biorxiv.org/content/10.1101/2020.03.04.976951v1
20. Levac DE, Huber ME, Sternad D. Learning and transfer of complex motor skills in virtual reality: a perspective review. J Neuroeng Rehabil. 2019;16:121.
21. Holden MK, Todorov E. Use of virtual environments in motor learning and rehabilitation. In: Handbook of Virtual Environments: Design, Implementation, and Applications. 2002. p. 1–35.
22. Rizzo A, Buckwalter JG, van der Zaag C, Neumann U, Thiebaux M, Chua C, et al. Virtual environment applications in clinical neuropsychology. Proc Virtual Reality Annual International Symposium. IEEE; 2000. p. 63–70.
23. Holden MK. Virtual environments for motor rehabilitation: review. Cyberpsychology Behav. 2005. p. 187–211.
24. Jack D, Boian R, Merians AS, Tremaine M, Burdea GC, Adamovich SV, et al. Virtual reality-enhanced stroke rehabilitation. IEEE Trans Neural Syst Rehabil Eng. 2001;9:308–18.
25. Zhang L, Abreu BC, Seale GS, Masel B, Christiansen CH, Ottenbacher KJ. A virtual reality environment for evaluation of a daily living skill in brain injury rehabilitation: reliability and validity. Arch Phys Med Rehabil. 2003;84:1118–24.
26. Levin MF, Weiss PL, Keshner EA. Emergence of virtual reality as a tool for upper limb rehabilitation: incorporation of motor control and motor learning principles. Phys Ther. 2015;95:415–25.
27. Mendes FA dos S, Pompeu JE, Lobo AM, da Silva KG, Oliveira T de P, Zomignani AP, et al. Motor learning, retention and transfer after virtual-reality-based training in Parkinson’s disease – effect of motor and cognitive demands of games: a longitudinal, controlled clinical study. Physiotherapy. 2012;98:217–23.
28. Mirelman A, Maidan I, Herman T, Deutsch JE, Giladi N, Hausdorff JM. Virtual reality for gait training: can it induce motor learning to enhance complex walking and reduce fall risk in patients with Parkinson’s disease? J Gerontol A Biol Sci Med Sci. 2011;66A:234–40.
29. Rose FD, Attree EA, Brooks BM, Parslow DM, Penn PR. Training in virtual environments: transfer to real world tasks and equivalence to real task training. Ergonomics. 2000;43:494–511.
30. Carter AR, Foreman MH, Martin C, Fitterer S, Pioppo A, Connor LT, et al. Inducing visuomotor adaptation using virtual reality gaming with a virtual shift as a treatment for unilateral spatial neglect. J Intellect Disabil Diagn Treat. 2016;4:170–84.
31. Anglin JM, Sugiyama T, Liew SL. Visuomotor adaptation in head-mounted virtual reality versus conventional training. Sci Rep. 2017;7.
32. Lindgren R, Johnson-Glenberg M. Emboldened by embodiment: six precepts for research on embodied learning and mixed reality. Educ Res. 2013;42:445–52.
33. Arzy S, Thut G, Mohr C, Michel CM, Blanke O. Neural basis of embodiment: distinct contributions of temporoparietal junction and extrastriate body area. J Neurosci. 2006;26:8074–81.
34. Longo MR, Schüür F, Kammers MPM, Tsakiris M, Haggard P. What is embodiment? A psychometric approach. Cognition. 2008;107:978–98.
35. Kilteni K, Groten R, Slater M. The sense of embodiment in virtual reality. Presence Teleoperators Virtual Environ. 2012;21:373–87.
36. Rehm F. Unity 3D Pool. GitHub repository. 2015. Available from: https://github.com/fgrehm/pucrs-unity3d-pool
37. Thomik AAC. On the structure of natural human movement. PhD thesis, Imperial College London; 2016. Available from: https://spiral.imperial.ac.uk/handle/10044/1/61827
38. Wilks SS. Certain generalizations in the analysis of variance. Biometrika. 1932;24:471.
39. Belić JJ, Faisal AA. Decoding of human hand actions to handle missing limbs in neuroprosthetics. Front Comput Neurosci. 2015;9:27.
40. Shmuelof L, Krakauer JW, Mazzoni P. How is a motor skill learned? Change and invariance at the levels of task success and trajectory control. J Neurophysiol. 2012;108:578–94.
41. Krakauer JW, Hadjiosif AM, Xu J, Wong AL, Haith AM. Motor learning. Compr Physiol. 2019;9:613–63.
42. Wulf G. Attentional focus and motor learning: a review of 15 years. Int Rev Sport Exerc Psychol. 2013;6:77–104.