TY  - JOUR
T1  - Strengthening the BioCompute Standard by Crowdsourcing on PrecisionFDA
JF  - bioRxiv
DO  - 10.1101/2020.11.02.365528
SP  - 2020.11.02.365528
AU  - Sarah H Stephens
AU  - Charles Hadley King
AU  - Sean Watford
AU  - Janisha Patel
AU  - Dennis A. Dean II
AU  - Soner Koc
AU  - Nan Xiao
AU  - Eric F. Donaldson
AU  - Elaine E. Thompson
AU  - Anjan Purkayastha
AU  - Raja Mazumder
AU  - Elaine Johanson
AU  - Jonathon Keeney
Y1  - 2020/01/01
UR  - http://biorxiv.org/content/early/2020/11/06/2020.11.02.365528.abstract
N2  - Background: The field of bioinformatics has grown at such a rapid pace that a gap in standardization exists when reporting an analysis. In response, the BioCompute project was created to standardize the type and method of information communicated when describing a bioinformatic analysis. Once the project became established, its goals shifted to broadening awareness and usage of BioCompute, and soliciting feedback from a larger audience. To address these goals, the BioCompute project collaborated with precisionFDA on a crowdsourced challenge that ran from May 2019 to October 2019. This challenge had a beginner track where participants submitted BCOs based on a pipeline of their choosing, and an advanced track where participants submitted applications supporting the creation of a BCO and verification of BCO conformance to specifications. Results: In total, there were 28 submissions to the beginner track (including submissions from a bioinformatics master’s class at George Washington University) and three submissions to the advanced track. Three top performers were selected from the beginner track, while a single top performer was selected for the advanced track. In the beginner track, top performers differentiated themselves by submitting BCOs that included more than the minimally compliant content. Advanced track submissions were very impressive. They included a complete web application, a command line tool that produced a static result, and a dockerized container that automatically created the BCO as the tool was run. The ability to harmonize the correct function, a simple user experience, and the aesthetics of the tool interface differentiated the tools. Conclusions: Despite being new to the concept, most beginner track scores were high, indicating that most users understood the fundamental concepts of the BCO specification. Novice bioinformatics students were an ideal cohort for this Challenge because of their lack of familiarity with BioCompute, broad diversity of research interests, and motivation to submit high-quality work. This challenge was successful in introducing the BCO to a wider audience, obtaining feedback from that audience, and resulting in a tool novices may use for BCO creation and conformance. In addition, the BCO specification itself was improved based on feedback, illustrating the utility of a “wisdom of the crowd” approach to standards development. Competing Interest Statement: The authors have declared no competing interest.
ER  -