TY - JOUR
T1 - Automated segmentation of insect anatomy from micro-CT images using deep learning
JF - bioRxiv
DO - 10.1101/2021.05.29.446283
SP - 2021.05.29.446283
AU - Evropi Toulkeridou
AU - Carlos Enrique Gutierrez
AU - Daniel Baum
AU - Kenji Doya
AU - Evan P. Economo
Y1 - 2021/01/01
UR - http://biorxiv.org/content/early/2021/05/29/2021.05.29.446283.abstract
N2 - Three-dimensional (3D) imaging, such as micro-computed tomography (micro-CT), is increasingly being used by organismal biologists for precise and comprehensive anatomical characterization. However, the segmentation of anatomical structures remains a bottleneck in research, often requiring tedious manual work. Here, we propose a pipeline for the fully automated segmentation of anatomical structures in micro-CT images utilizing state-of-the-art deep learning methods, selecting the ant brain as a test case. We implemented the U-Net architecture for 2D image segmentation as our convolutional neural network (CNN), combined with pixel-island detection. For training and validation of the network, we assembled a dataset of semi-manually segmented brain images of 94 ant species. The trained network predicted the brain area in ant images quickly and accurately; its performance on validation sets showed good agreement between prediction and target, scoring 80% Intersection over Union (IoU) and a 90% Dice coefficient (F1). While manual segmentation usually takes many hours per brain, the trained network takes only a few minutes. Furthermore, our network generalizes to segmenting the whole neural system in full-body scans, and works in tests on distantly related and morphologically divergent insects (e.g., fruit flies). These results suggest that methods like the one presented here apply generally across diverse taxa.
Our method makes the construction of segmented maps and the morphological quantification of different species more efficient and scalable to large datasets, a step toward a big-data approach to organismal anatomy.
ER -