Abstract
Video recordings of animals are used in many areas of research, such as collective movement, animal space use, animal censuses and behavioural neuroscience. They provide behavioural data at scales and resolutions not possible with manual observations. Many automated methods are being developed to extract data from these high-resolution videos. However, detecting and tracking animals in videos taken in natural settings remains challenging due to heterogeneous environments.
We present an open-source end-to-end pipeline called Multi-Object Tracking in Heterogeneous environments (MOTHe), a Python-based application that uses a basic convolutional neural network for object detection. MOTHe allows researchers with minimal coding experience to track multiple animals in their natural habitats. It identifies animals even when individuals are stationary or partially camouflaged.
MOTHe has a command-line interface with one command for each action, for example, finding animals in an image and tracking each individual. Parameters used by the algorithm are well described in a configuration file, along with example values for different types of tracking scenarios. MOTHe doesn’t require any sophisticated infrastructure and can be run on basic desktop computing units.
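To illustrate the kind of configuration file described above, the sketch below shows how tracking parameters might be laid out. The keys and values here are purely illustrative assumptions, not MOTHe's actual parameter names; consult the repository's configuration file for the real ones.

```yaml
# Hypothetical sketch of a MOTHe-style configuration file.
# Key names and values are illustrative, not the actual MOTHe keys.
video_path: ./videos/herd_clip.mp4   # input video to process (hypothetical path)
frame_rate: 30                       # frames per second of the recording
box_size: 40                         # side length (pixels) of the detection window per animal
detection_threshold: 0.9             # minimum CNN confidence to accept a detection
max_individuals: 156                 # upper bound on animals expected in a frame
```

Keeping such parameters in a single editable file, rather than in code, is what lets researchers with minimal coding experience adapt the pipeline to new species and habitats.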
We demonstrate MOTHe on six video clips from two species in their natural habitats: wasp colonies on their nests (up to 12 individuals per colony) and antelope herds in four different types of habitats (up to 156 individuals in a herd). Using MOTHe, we are able to detect and track all individuals in these animal group videos. MOTHe’s computing time on a personal computer with 4 GB RAM and an i5 processor is 5 minutes for a 30-second ultra-HD (4K resolution) video recorded at 30 frames per second.
MOTHe is available as an open-source repository with a detailed user guide and demonstrations on GitHub (https://github.com/tee-lab/MOTHe).
Footnotes
This version includes revised figures, references and written material to improve readability, as well as updated links to the GitHub repository and supplementary videos.