Abstract
Advancements in volume electron microscopy mean it is now possible to generate thousands of serial images at nanometre resolution overnight, yet the gold-standard approach to data analysis remains manual segmentation by an expert microscopist, creating a critical research bottleneck. Although some machine learning approaches exist in this domain, we remain far from realising the aspiration of a highly accurate yet generic automated analysis approach, with a major obstacle being the lack of sufficient high-quality ground-truth data. To address this, we developed a novel citizen science project, Etch a Cell, enabling volunteers to manually segment the nuclear envelope of HeLa cells imaged with serial block-face scanning electron microscopy (SBF-SEM). We present our approach for aggregating multiple volunteer annotations into a high-quality consensus segmentation, and demonstrate that data produced exclusively by volunteers can be used to train a highly accurate machine learning algorithm for automatic segmentation of the nuclear envelope, which we share here alongside our archived benchmark data.
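The aggregation step mentioned above can be illustrated with a minimal sketch. The snippet below combines several binary volunteer masks by per-pixel majority vote; this is a hypothetical simplification for illustration only, and the function name, threshold choice, and voting scheme are assumptions, not the paper's actual aggregation method.

```python
import numpy as np

def consensus_mask(annotations, min_votes=None):
    """Combine binary volunteer masks into a consensus segmentation
    by per-pixel majority vote (illustrative sketch only; the paper's
    actual aggregation approach may differ)."""
    stack = np.stack(annotations).astype(int)  # (n_volunteers, H, W)
    if min_votes is None:
        min_votes = stack.shape[0] // 2 + 1    # strict majority by default
    votes = stack.sum(axis=0)                  # per-pixel vote count
    return votes >= min_votes

# Three toy 2x2 annotations of the same image region
a = np.array([[1, 0], [1, 1]], dtype=bool)
b = np.array([[1, 0], [0, 1]], dtype=bool)
c = np.array([[0, 0], [1, 1]], dtype=bool)
print(consensus_mask([a, b, c]).astype(int))  # -> [[1 0] [1 1]]
```

Raising `min_votes` trades recall for precision: a higher threshold keeps only pixels that many volunteers agree on.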
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
* This publication has been made possible by the participation of volunteers in the Etch a Cell project. Their contributions are acknowledged at www.zooniverse.org/projects/h-spiers/etch-a-cell/about/results
https://github.com/FrancisCrickInstitute/Etch-a-Cell-Nuclear-Envelope