PT   - JOURNAL ARTICLE
AU   - Brock Laschowski
AU   - William McNally
AU   - Alexander Wong
AU   - John McPhee
TI   - ExoNet Database: Wearable Camera Images of Human Locomotion Environments
AID  - 10.1101/2020.10.23.352054
DP   - 2020 Jan 01
TA   - bioRxiv
PG   - 2020.10.23.352054
4099 - http://biorxiv.org/content/early/2020/10/23/2020.10.23.352054.short
4100 - http://biorxiv.org/content/early/2020/10/23/2020.10.23.352054.full
AB   - Advances in computer vision and artificial intelligence are allowing researchers to develop environment recognition systems for powered lower-limb exoskeletons and prostheses. However, small-scale and private training datasets have impeded the widespread development and dissemination of image classification algorithms for classifying human walking environments. To address these limitations, we developed ExoNet - the first open-source, large-scale hierarchical database of high-resolution wearable camera images of human locomotion environments. Unparalleled in scale and diversity, ExoNet contains over 5.6 million RGB images of different indoor and outdoor real-world walking environments, which were collected using a lightweight wearable camera system throughout the summer, fall, and winter seasons. Approximately 923,000 images in ExoNet were human-annotated using a 12-class hierarchical labelling architecture. Available publicly through IEEE DataPort, ExoNet offers an unprecedented communal platform to train, develop, and compare next-generation image classification algorithms for human locomotion environment recognition. Besides the control of powered lower-limb exoskeletons and prostheses, applications of ExoNet could extend to humanoids and autonomous legged robots.
Competing Interest Statement: The authors have declared no competing interest.