Journal of Biomechanics

Volume 116, 12 February 2021, 110229

Estimation of kinematics from inertial measurement units using a combined deep learning and optimization framework

https://doi.org/10.1016/j.jbiomech.2021.110229

Abstract

The difficulty of estimating joint kinematics remains a critical barrier toward widespread use of inertial measurement units in biomechanics. Traditional sensor-fusion filters are largely reliant on magnetometer readings, which may be disturbed in uncontrolled environments. Careful sensor-to-segment alignment and calibration strategies are also necessary, which may burden users and lead to further error in uncontrolled settings. We introduce a new framework that combines deep learning and top-down optimization to accurately predict lower extremity joint angles directly from inertial data, without relying on magnetometer readings. We trained deep neural networks on a large set of synthetic inertial data derived from a clinical marker-based motion-tracking database of hundreds of subjects. We used data augmentation techniques and an automated calibration approach to reduce error due to variability in sensor placement and limb alignment. On left-out subjects, lower extremity kinematics could be predicted with a mean (±STD) root mean squared error of less than 1.27° (±0.38°) in flexion/extension, less than 2.52° (±0.98°) in ab/adduction, and less than 3.34° (±1.02°) in internal/external rotation, across walking and running trials. Errors decreased exponentially with the amount of training data, confirming the need for large datasets when training deep neural networks. While this framework remains to be validated with true inertial measurement unit data, the results presented here are a promising advance toward convenient estimation of gait kinematics in natural environments. Progress in this direction could enable large-scale studies and offer new perspectives on disease progression, patient recovery, and sports biomechanics.

Introduction

The ability to passively estimate movement kinematics in natural environments with inertial measurement units (IMUs) could transform how we monitor, diagnose, and treat mobility limitations. These devices are now unobtrusive, can flex and bend with the skin, and allow for patient monitoring throughout the day (Patel et al., 2012, Shull et al., 2014, Shull and Damian, 2015, Son et al., 2014). Despite recent progress in hardware miniaturization, turning large multimodal data from wearable sensors into meaningful biomechanical outcomes that can be easily interpreted in the context of healthy and pathological movement remains a key challenge toward their widespread use (Picerno, 2017). Biomechanists and rehabilitation specialists have traditionally characterized gait using joint kinematics and kinetics. While segment inertial data generated by IMUs may also lead to important actionable insights, accurate estimation of joint angles is needed to place future findings in the context of past work.

There are currently no streamlined tools to accurately estimate three-dimensional (3-D) joint kinematics from wearable sensors worn in uncontrolled environments. Strap-down integration of inertial data introduces drift, and sensor fusion algorithms that rely on magnetometer data suffer from ferromagnetic disturbances (de Vries et al., 2009). Solutions that incorporate full-body biomechanical models (Robert-Lachaine et al., 2017a, Robert-Lachaine et al., 2017b, Robert-Lachaine et al., 2020) are currently not portable for anytime, anywhere use, and accuracy over long durations remains to be demonstrated. Additionally, the dependence of most algorithms on accurate sensor-to-segment alignment makes translation difficult for multi-day monitoring outside of the laboratory, given human error in sensor placement. Static and dynamic calibrations (Cutti et al., 2010, Favre et al., 2009, Picerno et al., 2008, Roetenberg et al., 2009) may also add to lack of compliance and increased drop-out rates in remote monitoring studies. Further, static poses or functional calibration trials may not be performed as expected in remote scenarios, especially when users suffer from a mobility-limiting condition.
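To make the drift problem concrete, the minimal sketch below (not from the paper; the motion signal and the bias value are hypothetical, and the 200 Hz rate simply mirrors the motion capture sampling used later) integrates a single-axis gyroscope signal and shows how a small constant bias accumulates into unbounded orientation error, which is why strap-down integration alone is insufficient.

```python
import numpy as np

# Minimal sketch (not from the paper): naive strap-down integration of a
# single-axis gyroscope signal. The motion and the 0.5 deg/s bias are
# hypothetical; the 200 Hz rate mirrors the motion capture sampling rate.
fs = 200.0
t = np.arange(0.0, 60.0, 1.0 / fs)                       # one minute of data
true_rate = np.deg2rad(30.0) * np.sin(2.0 * np.pi * t)   # hypothetical segment rotation rate (rad/s)
bias = np.deg2rad(0.5)                                    # assumed constant gyroscope bias (rad/s)

measured_rate = true_rate + bias
true_angle = np.cumsum(true_rate) / fs                    # integrated ground-truth angle (rad)
estimated_angle = np.cumsum(measured_rate) / fs           # integrated sensor estimate (rad)

drift_deg = np.rad2deg(estimated_angle - true_angle)
print(f"orientation error after 60 s: {drift_deg[-1]:.1f} deg")  # ~30 deg for a 0.5 deg/s bias
```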

Deep learning and iterative optimization techniques offer a new opportunity to overcome the limitations of previously proposed approaches for estimating kinematics from IMUs. Deep neural networks are highly efficient in learning non-linear relationships from high-dimensional data, such as dense time series from wearable sensors. A key drawback, however, is that they require large datasets to generate accurate models, and such datasets are scarce in the field of biomechanics. A large dataset that contains both IMU data and ground truth joint kinematics from marker-based motion tracking systems, for example, is currently not available to the research community. In other domains, however, synthetic data have been successfully used to train accurate predictive models (Jaderberg et al., 2014). Additionally, optimization approaches have demonstrated success in improving pose estimation in computer vision applications (von Marcard et al., 2018).
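As an illustration of how synthetic inertial data can be derived from optical motion capture, the sketch below uses an assumed formulation rather than the authors' code: body-frame angular velocity is obtained from the time derivative of the virtual sensor's rotation matrices, and the accelerometer's specific force from the double-differentiated sensor position with gravity removed and rotated into the sensor frame. The function name `synthetic_imu` and its interface are hypothetical.

```python
import numpy as np

def synthetic_imu(R_wb, p_w, fs=200.0):
    """Sketch (assumed formulation, not the authors' code) of deriving
    synthetic gyroscope and accelerometer signals from motion capture.

    R_wb : (N, 3, 3) body-to-world rotation matrices of the virtual sensor frame
    p_w  : (N, 3) world-frame position of the virtual sensor origin (m)
    Returns body-frame angular velocity (rad/s) and specific force (m/s^2).
    """
    dt = 1.0 / fs
    g_w = np.array([0.0, 0.0, -9.81])                     # gravity in the world frame

    # Angular velocity: the skew-symmetric matrix [w]_x = R^T dR/dt.
    dR = np.gradient(R_wb, dt, axis=0)
    omega_skew = np.einsum('nij,njk->nik', R_wb.transpose(0, 2, 1), dR)
    gyro = np.stack([omega_skew[:, 2, 1],
                     omega_skew[:, 0, 2],
                     omega_skew[:, 1, 0]], axis=1)

    # Specific force: linear acceleration minus gravity, rotated into the body frame.
    accel_w = np.gradient(np.gradient(p_w, dt, axis=0), dt, axis=0)
    accel = np.einsum('nij,nj->ni', R_wb.transpose(0, 2, 1), accel_w - g_w)
    return gyro, accel
```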

The goal of this study was to build deep learning models for predicting 3-D lower extremity joint kinematics directly from angular velocity and linear acceleration data, circumventing the limitations of magnetometer-dependent algorithms. To generate sufficient data to train such models, we created synthetic inertial data from a marker-based motion capture database of hundreds of subjects collected at a clinical center (Ferber et al., 2014, Osis et al., 2015). We further incorporated data augmentation strategies (Shorten and Khoshgoftaar, 2019) to increase the effective sensor placement variability represented in the data, allowing the developed models to tolerate placement ambiguity without compromising accuracy. Additionally, we used an iterative top-down optimization approach to improve the predictions of the deep learning models.
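One way such placement augmentation could be implemented (an assumption for illustration, not the authors' exact recipe) is to apply a random, trial-constant rotation of the virtual sensor frame relative to its segment before regenerating the synthetic inertial signals, as sketched below.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def augment_sensor_placement(R_wb, max_angle_deg=15.0, rng=None):
    """Sketch of one possible augmentation (assumed, not the authors' exact
    recipe): simulate sensor placement variability by rotating the virtual
    sensor frame relative to its segment by a random offset that is held
    constant for the whole trial.

    R_wb : (N, 3, 3) body-to-world rotation matrices of the virtual sensor frame
    """
    rng = np.random.default_rng(rng)
    # Random rotation axis and a small random angle, fixed for the trial.
    axis = rng.normal(size=3)
    axis /= np.linalg.norm(axis)
    angle = np.deg2rad(rng.uniform(-max_angle_deg, max_angle_deg))
    R_offset = Rotation.from_rotvec(angle * axis).as_matrix()
    # Re-orient every frame by the same offset; a placement translation would
    # additionally alter the accelerometer signal through the lever arm.
    return np.einsum('nij,jk->nik', R_wb, R_offset)
```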

Section snippets

Data collection and pre-processing

To train the models, we used marker-based motion capture data that were previously collected at the University of Calgary Running Injury Clinic after receiving approval from the University of Calgary’s Conjoint Health Research Ethics Board (Ferber et al., 2016, Jauhiainen et al., 2020, Phinyomark et al., 2018, Pohl et al., 2010). Retro-reflective marker trajectories were collected at 200 Hz using eight high-speed infrared video cameras (Vicon Motion Systems Ltd., Oxford, UK). After a static

Results

Walking kinematics could be predicted with a mean (±STD) RMSE of less than 2.75° (±0.66°), while running kinematics could be predicted with a mean RMSE of less than 3.34° (±1.02°) using the optimized model, along with pseudo-calibration. During walking, flexion/extension was the most accurate degree of freedom, with a mean RMSE of less than 0.97° (±0.38°) across the ankle, knee, and hip joints, followed by ab/adduction with a mean RMSE of less than 2.16° (±0.85°) and internal/external rotation

Discussion

The goal of this study was to introduce a new framework that combines deep neural networks with top-down optimization for prediction of lower extremity kinematics from inertial sensing data. Using this hybrid approach and synthetic inertial data from a large clinical motion capture database, we could predict walking and running kinematics with accuracies that are similar to the reliability of marker-based motion tracking. We also demonstrated that augmentation techniques that increase effective

Acknowledgement

The authors would like to thank Allan Brett for his assistance with the data transfer and for patiently answering questions as the project progressed.

References (36)

  • Robert-Lachaine, X., et al. (2017). Accuracy and repeatability of single-pose calibration of inertial measurement units for whole-body motion analysis. Gait Posture.
  • Robert-Lachaine, X., et al. (2020). Inertial motion capture validation of 3D knee kinematics at various gait speed on the treadmill with a double-pose calibration. Gait Posture.
  • Shull, P.B., et al. (2014). Quantified self and human movement: a review on the clinical impact of wearable sensing and feedback for gait analysis and intervention. Gait Posture.
  • Tsai, T.-Y., et al. (2011). Effects of soft tissue artifacts on the calculated kinematics and kinetics of the knee during stair-ascent. J. Biomech.
  • Wu, G., et al. (1995). ISB recommendations for standardization in the reporting of kinematic data. J. Biomech.
  • Yang, S., et al. (2013). Estimation of spatio-temporal parameters for post-stroke hemiparetic gait using inertial sensors. Gait Posture.
  • Bergstra, J.S., et al. (2011). Algorithms for hyper-parameter optimization. Advances in Neural Information Processing Systems.
  • Cutti, A.G., et al. (2010). 'Outwalk': a protocol for clinical gait analysis based on inertial and magnetic sensors. Med. Biol. Eng. Comput.