PT - JOURNAL ARTICLE
AU - Muttenthaler, Lukas
AU - Hebart, Martin N.
TI - THINGSvision: a Python toolbox for streamlining the extraction of activations from deep neural networks
AID - 10.1101/2021.03.11.434979
DP - 2021 Jan 01
TA - bioRxiv
PG - 2021.03.11.434979
4099 - http://biorxiv.org/content/early/2021/03/13/2021.03.11.434979.short
4100 - http://biorxiv.org/content/early/2021/03/13/2021.03.11.434979.full
AB - Over the past decade, deep neural network (DNN) models have received considerable attention due to their near-human object classification performance and their excellent prediction of signals recorded from biological visual systems. To better understand the function of these networks and to relate them to hypotheses about brain activity and behavior, researchers need to extract the activations elicited by images across different DNN layers. The abundance of different DNN variants, however, can often be unwieldy, and the task of extracting DNN activations from different layers may be non-trivial and error-prone for someone without a strong computational background. Thus, researchers in the fields of cognitive science and computational neuroscience would benefit from a library or package that supports a user in the extraction task. THINGSvision is a new Python module that aims at closing this gap by providing a simple and unified tool for extracting layer activations for a wide range of pretrained and randomly-initialized neural network architectures, even for users with little to no programming experience. We demonstrate the general utility of THINGSvision by relating extracted DNN activations to a number of functional MRI and behavioral datasets using representational similarity analysis, which can be performed as an integral part of the toolbox. Together, THINGSvision enables researchers across diverse fields to extract features in a streamlined manner for their custom image dataset, thereby improving the ease of relating DNNs, brain activity, and behavior, and improving the reproducibility of findings in these research fields.
Competing Interest Statement: The authors have declared no competing interest.
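
The two operations the abstract describes, extracting layer activations from a pretrained DNN and relating them to another representation via representational similarity analysis (RSA), can be illustrated with a short sketch. The code below is not THINGSvision's own interface; it is a minimal example using plain PyTorch forward hooks and SciPy, and the model choice (AlexNet), the layer name ('features.10'), and the random placeholder standing in for fMRI or behavioral data are assumptions made purely for illustration.

# Minimal sketch (not the THINGSvision API): extract activations from one layer
# of a pretrained torchvision model and compare them to a target RDM via RSA.
import numpy as np
import torch
import torchvision.models as models
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

# Load a pretrained network (newer torchvision versions use the `weights=` argument).
model = models.alexnet(pretrained=True)
model.eval()

activations = {}

def hook(module, inputs, output):
    # Flatten each image's activation map into a single feature vector.
    activations["layer"] = output.flatten(start_dim=1).detach().cpu().numpy()

# Attach a forward hook to the layer of interest ('features.10' is an arbitrary choice).
layer = dict(model.named_modules())["features.10"]
handle = layer.register_forward_hook(hook)

# Stand-in for a batch of preprocessed images (n_images x 3 x 224 x 224).
images = torch.rand(8, 3, 224, 224)
with torch.no_grad():
    model(images)
handle.remove()

features = activations["layer"]  # shape: (n_images, n_features)

# RSA: build a representational dissimilarity matrix (RDM) from the DNN features
# and correlate its upper triangle with a target RDM (here a random placeholder,
# in practice one computed from fMRI or behavioral data).
dnn_rdm = squareform(pdist(features, metric="correlation"))
target_rdm = squareform(pdist(np.random.rand(8, 100), metric="correlation"))

triu = np.triu_indices_from(dnn_rdm, k=1)
rho, _ = spearmanr(dnn_rdm[triu], target_rdm[triu])
print(f"RSA (Spearman rho): {rho:.3f}")

As the abstract notes, THINGSvision wraps steps of this kind behind a unified interface, so users do not need to write hooks or RDM code themselves.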