Abstract
Objective Closed-loop control of brain and behavior will benefit from real-time detection of behavioral events to enable low-latency communication with peripheral devices. In animal experiments, this is typically achieved using sparsely distributed (embedded) sensors that detect animal presence in select regions of interest. High-speed cameras provide high-density sampling across large arenas, capturing the richness of animal behavior; however, the image-processing bottleneck prohibits real-time feedback in the context of rapidly evolving behaviors.
Approach Here we developed open-source software, named PolyTouch, to continuously track animal behavior in large arenas and to provide rapid closed-loop feedback within ~1 ms, i.e. the flight time including (auditory) stimulus delivery. This stand-alone, cross-platform software is written in Java. The included wrapper for MATLAB provides experimental flexibility for data acquisition, analysis, and visualization.
Main results As a proof-of-principle application, we deployed PolyTouch for place-awareness training. A user-defined portion of the arena served as a virtual target; a visit to (or approach toward) the target triggered auditory feedback. We show that mice develop awareness of virtual spaces: they stay for shorter periods and move faster while in the virtual target zone when their visits are coupled to relatively high stimulus intensities (≥49 dB). Thus, closed-loop presentation of perceived aversive feedback is sufficient to condition mice to avoid virtual targets within the span of a single session (~20 min).
Significance Neuromodulation techniques now allow control of neural activity in a cell-type-specific manner at single-spike resolution. Using animal behavior to drive closed-loop control of neural activity would help to address the neural basis of behavioral-state- and environmental-context-dependent information processing in the brain. Because PolyTouch enables neural feedback faster than sensory signals from the periphery take to reach central circuits, it could also be used to control sensory representations in real time.