Abstract
Developments in automated animal behavioural analysis software are increasing the efficiency of data collection and improving the standardization of behavioural measurements. There are now several open-source tools for tracking laboratory animals, but these are often accurate only under limited conditions (e.g. uniform lighting and background, uncluttered scenes, unobstructed focal animal). Tracking fish presents a particular challenge for these tools because movement at the water’s surface introduces significant noise. Partial occlusion of the focal animal can also be troublesome, particularly when tracking the whole organism. But identifying the position of an animal is only part of the task – analysing the movement of the animal relative to its environment and experimental context is often what provides information about its behaviour. Therefore, the automated detection of physical objects and boundaries would also be beneficial, but this feature is not commonly incorporated into existing tracking software. Here we describe a video processing method that uses a range of computer vision algorithms (e.g. object detection and tracking, optical flow, parallel plane homology) and computational geometry techniques (e.g. Voronoi tessellation) to analyse the movement behaviour of fish in response to experimental stimuli. A behavioural experiment, which involved tracking a fish’s trajectory through a field of obstacles, motivated our development of a set of tools that: (1) measure an animal’s trajectory, (2) record obstacle position, and (3) detect when the fish passed through ‘virtual gates’ between adjacent obstacles and/or the aquarium wall. We introduce a Detect+Track approach that improves the accuracy and robustness of animal tracking, overcoming some limitations of existing tools and providing a more reliable solution under complex experimental conditions.
Our workflow is divided into several discrete steps and provides a set of modular software building blocks that can be adapted to analyse other experimental designs. A detailed tutorial is provided, together with all the data and code required to reproduce our results.
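To illustrate the computational-geometry step described above, here is a minimal sketch of how ‘virtual gates’ between adjacent obstacles might be derived and crossing events detected. It is not the paper’s implementation: all names and coordinates are illustrative, and it uses the fact that obstacles whose Voronoi cells share an edge are exactly the pairs connected by an edge of the dual Delaunay triangulation (computed here with SciPy), with a gate defined as the segment joining two such neighbouring obstacles.

```python
# Hypothetical sketch (not the authors' code): define "virtual gates" as
# segments between Voronoi-neighbouring obstacles and detect when a
# trajectory step crosses one.
import numpy as np
from scipy.spatial import Delaunay

# Obstacle centres in arena coordinates (illustrative data).
obstacles = np.array([[1.0, 1.0], [3.0, 1.0], [2.0, 3.0], [4.0, 3.0]])

# Obstacles are Voronoi neighbours iff they share a Delaunay edge,
# so enumerate the edges of each Delaunay triangle.
tri = Delaunay(obstacles)
gates = set()
for simplex in tri.simplices:
    for k in range(3):
        a, b = sorted((simplex[k], simplex[(k + 1) % 3]))
        gates.add((a, b))

def _side(u, v, w):
    """Sign of the cross product (v - u) x (w - u): which side of uv is w on."""
    return np.sign((v[0] - u[0]) * (w[1] - u[1]) - (v[1] - u[1]) * (w[0] - u[0]))

def crossed(p, q, a, b):
    """True if trajectory segment p->q properly intersects gate segment a->b."""
    return (_side(p, q, a) != _side(p, q, b)) and (_side(a, b, p) != _side(a, b, q))

# One trajectory step (illustrative): the fish swims between obstacles 0 and 1.
step_start = np.array([2.0, 0.5])
step_end = np.array([2.0, 1.5])
passed = [(i, j) for (i, j) in sorted(gates)
          if crossed(step_start, step_end, obstacles[i], obstacles[j])]
print(passed)  # gate(s) crossed during this step
```

In a full pipeline this test would run for every consecutive pair of tracked positions; gates involving the aquarium wall would need the wall polygon added to the tessellation, which is omitted here for brevity.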
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
This version now compares the accuracy and robustness of our approach with those of two other similar software tools.