PT  - JOURNAL ARTICLE
AU  - Oscar Esteban
AU  - Ross W Blair
AU  - Dylan M Nielson
AU  - Jan C Varada
AU  - Sean Marrett
AU  - Adam G Thomas
AU  - Russell A Poldrack
AU  - Krzysztof J Gorgolewski
TI  - Crowdsourced MRI quality metrics and expert quality annotations for training of humans and machines
AID - 10.1101/420984
DP  - 2018 Jan 01
TA  - bioRxiv
PG  - 420984
4099 - http://biorxiv.org/content/early/2018/09/18/420984.short
4100 - http://biorxiv.org/content/early/2018/09/18/420984.full
AB  - The neuroimaging community is steering towards increasingly large sample sizes, which are highly heterogeneous because they can only be acquired by multi-site consortia. The visual assessment of every imaging scan is a necessary quality control step, yet arduous and time-consuming. A sizeable body of evidence shows that images of low quality are a source of variability that may be comparable to the effect size under study. We present the MRIQC WebAPI, an open crowdsourced database that collects image quality metrics extracted from MR images and corresponding manual assessments by experts. The database is rapidly growing, and currently contains over 100,000 records of image quality metrics of functional and anatomical MRIs of the human brain, and over 200 expert ratings. The resource is particularly designed for researchers to share image quality metrics and annotations that can readily be reused in training human experts and machine learning algorithms. The ultimate goal of the MRIQC WebAPI is to allow the development of fully automated quality control tools that outperform expert ratings in identifying subpar images.
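
Note: the abstract above describes the MRIQC WebAPI as an openly queryable store of image quality metrics. The following Python snippet is a minimal illustrative sketch of how such a REST endpoint could be queried; it is not part of the citation record, and the base URL, the "T1w" resource name, the "max_results" parameter, and the "_items" response key are assumptions drawn from common MRIQC/Eve-style API conventions rather than from this abstract.

    # Illustrative sketch only: fetching a page of anatomical (T1w) image
    # quality metric records from an assumed MRIQC WebAPI endpoint.
    import requests

    BASE_URL = "https://mriqc.nimh.nih.gov/api/v1"  # assumed API root

    def fetch_t1w_metrics(max_results=25):
        """Return a list of IQM records from the assumed T1w resource."""
        response = requests.get(f"{BASE_URL}/T1w",
                                params={"max_results": max_results})
        response.raise_for_status()
        payload = response.json()
        # Eve-style APIs typically place records under "_items" (assumption).
        return payload.get("_items", [])

    if __name__ == "__main__":
        records = fetch_t1w_metrics()
        print(f"Fetched {len(records)} image quality metric records")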