TY  - JOUR
T1  - Fusion of Video and Inertial Sensing Data via Dynamic Optimization of a Biomechanical Model
JF  - bioRxiv
DO  - 10.1101/2022.11.15.516673
SP  - 2022.11.15.516673
AU  - Owen Pearl
AU  - Soyong Shin
AU  - Ashwin Godura
AU  - Sarah Bergbreiter
AU  - Eni Halilaj
Y1  - 2022/01/01
UR  - http://biorxiv.org/content/early/2022/11/17/2022.11.15.516673.1.abstract
N2  - Inertial sensing and computer vision are promising alternatives to traditional optical motion tracking, but until now these data sources have been explored either in isolation or fused via unconstrained optimization, which may not take full advantage of their complementary strengths. By adding physiological plausibility and dynamical robustness to a proposed solution, biomechanical modeling may enable better fusion than unconstrained optimization. To test this hypothesis, we fused video and inertial sensing data via dynamic optimization with a nine degree-of-freedom model and investigated when this approach outperforms video-only, inertial-sensing-only, and unconstrained-fusion methods. We used both experimental and synthetic data that mimicked different ranges of video and inertial measurement unit (IMU) data noise. Fusion with a dynamically constrained model improved estimation of lower-extremity kinematics by a mean ± std root-mean-square error of 6.0° ± 1.2° over the video-only approach and estimation of joint centers by 4.5 ± 2.8 cm over the IMU-only approach. It consistently outperformed single-modality approaches across different noise profiles. When the quality of video data was high and that of inertial data was low, dynamically constrained fusion improved joint kinematics by 3.7° ± 1.2° and joint centers by 1.9 ± 0.5 cm over unconstrained fusion, while unconstrained fusion was advantageous by 3.0° ± 1.4° and 1.2 ± 0.7 cm in the opposite scenario. These findings indicate that complementary modalities and techniques can improve motion tracking by clinically meaningful margins and that data quality and computational complexity must be considered when selecting the most appropriate method for a particular application. Competing Interest Statement: The authors have declared no competing interest.
ER  -