PT - JOURNAL ARTICLE
AU - Pierre Karashchuk
AU - Katie L. Rupp
AU - Evyn S. Dickinson
AU - Elischa Sanders
AU - Eiman Azim
AU - Bingni W. Brunton
AU - John C. Tuthill
TI - Anipose: a toolkit for robust markerless 3D pose estimation
AID - 10.1101/2020.05.26.117325
DP - 2020 Jan 01
TA - bioRxiv
PG - 2020.05.26.117325
4099 - http://biorxiv.org/content/early/2020/05/29/2020.05.26.117325.short
4100 - http://biorxiv.org/content/early/2020/05/29/2020.05.26.117325.full
AB - Quantifying movement is critical for understanding animal behavior. Advances in computer vision now enable markerless tracking from 2D video, but most animals live and move in 3D. Here, we introduce Anipose, a Python toolkit for robust markerless 3D pose estimation. Anipose consists of four components: (1) a 3D calibration module, (2) filters to resolve 2D tracking errors, (3) a triangulation module that integrates temporal and spatial constraints, and (4) a pipeline to structure processing of large numbers of videos. We evaluate Anipose on four datasets: a moving calibration board, fruit flies walking on a treadmill, mice reaching for a pellet, and humans performing various actions. Because Anipose is built on popular 2D tracking methods (e.g., DeepLabCut), users can expand their existing experimental setups to incorporate robust 3D tracking. We hope this open-source software and accompanying tutorials (anipose.org) will facilitate the analysis of 3D animal behavior and the biology that underlies it.
COI - The authors have declared no competing interest.