ABSTRACT
Crowd-sourced biodiversity databases provide easy access to data and images for ecological education and research. One concern with using publicly sourced databases, however, is the quality of their images, taxonomic descriptions, and geographical metadata. To address this concern and allow researchers and educators to make informed decisions about using crowd-sourced data, I developed a suite of pipelines to evaluate the taxonomic consistency, geo-tagging accuracy, and image quality of crowd-sourced biodiversity data from iNaturalist, using the order Araneae (spiders) as a case study. The pipeline allows users to analyze multiple iNaturalist images and their associated metadata to determine: the level of taxonomic identification (family, genus, or species) for each occurrence; whether the taxonomic label for an image matches the accepted nesting of families, genera, and species; and whether geo-tags fall within the known distribution of the described taxon, using occurrence data from the Global Biodiversity Information Facility (GBIF) as a reference. Additionally, I assessed image quality with BRISQUE, a no-reference image quality algorithm implemented in MATLAB. At the time of my analyses (July 2021), iNaturalist contained at least one observation for 124 of the 129 families of Araneae, and 115 families had three or more unique observations, with relatively similar metadata and image quality across families. Taxonomic consistency was similar for observations identified at the genus and species levels, but lower for observations identified only to family. Observations identified to species had more precise geo-tags than those identified only to family or genus, as well as the highest image quality according to BRISQUE scores. Overall, these results suggest that iNaturalist can provide large sets of metadata and images for research.
Given the inevitability of some low-quality observations, this pipeline provides a valuable resource for researchers and educators to evaluate the quality of iNaturalist and other crowd-sourced data.
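The geo-tag check described above can be illustrated with a minimal sketch: an observation's coordinates are tested against the range spanned by reference occurrences of the same taxon (e.g. GBIF records), padded by a buffer. The function names, the bounding-box approach, and the buffer value are illustrative assumptions, not the paper's actual pipeline, which may use a more refined distribution model.

```python
# Hypothetical sketch of a geo-tag plausibility check: does an
# observation fall within the (buffered) bounding box of reference
# occurrence points for the same taxon? Names and buffer are
# illustrative, not taken from the paper's pipeline.

def bounding_box(points, buffer_deg=1.0):
    """Axis-aligned bounding box of (lat, lon) points, padded by a buffer (degrees)."""
    lats = [p[0] for p in points]
    lons = [p[1] for p in points]
    return (min(lats) - buffer_deg, max(lats) + buffer_deg,
            min(lons) - buffer_deg, max(lons) + buffer_deg)

def geotag_plausible(obs_lat, obs_lon, reference_points, buffer_deg=1.0):
    """True if an observation lies inside the padded range of reference occurrences."""
    lat_min, lat_max, lon_min, lon_max = bounding_box(reference_points, buffer_deg)
    return lat_min <= obs_lat <= lat_max and lon_min <= obs_lon <= lon_max

# Example with made-up reference occurrences in the southeastern US:
gbif_points = [(33.7, -84.4), (35.2, -80.8), (30.3, -81.7)]
print(geotag_plausible(34.0, -83.0, gbif_points))  # inside the reference range
print(geotag_plausible(51.5, -0.1, gbif_points))   # far outside the range
```

A bounding box is a deliberately coarse criterion: it flags clearly implausible geo-tags cheaply while tolerating records near the edge of a species' known range.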
Competing Interest Statement
The author has declared no competing interest.