Abstract
Current neuroscience research is often limited to testing predetermined hypotheses and post hoc analysis of already collected data. Adaptive experimental designs, in which modeling drives ongoing data collection and selects experimental manipulations, offer a promising alternative. However, tight integration between models and data collection requires coordinating diverse hardware configurations and complex computations under real-time constraints. Here, we introduce improv, a software platform that allows users to fully integrate custom modeling, analysis, and visualization with data collection and experimental control. We demonstrate both in silico and in vivo how improv enables more efficient experimental designs for discovery and validation across various model organisms and data types. Improv can orchestrate custom real-time behavioral analyses, rapid functional typing of neural responses from large populations via calcium imaging, and optimal visual stimulus selection. We incorporate real-time machine learning methods for dimensionality reduction and predictive modeling of latent neural and behavioral features. Finally, we demonstrate how improv can perform model-driven interactive imaging and simultaneous optogenetic photostimulation of visually responsive neurons in the larval zebrafish brain expressing GCaMP6s and the red-shifted opsin rsChRmine. Together, these results demonstrate the power of improv to integrate modeling with data collection and experimental control to achieve next-generation adaptive experiments.
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
† These authors jointly supervised this work.
Additional expanded content and supplementary videos.