TY - JOUR
T1 - Plant detection and counting from high-resolution RGB images acquired from UAVs: comparison between deep-learning and handcrafted methods with application to maize, sugar beet, and sunflower
JF - bioRxiv
DO - 10.1101/2021.04.27.441631
SP - 2021.04.27.441631
AU - Etienne David
AU - Gaëtan Daubige
AU - François Joudelat
AU - Philippe Burger
AU - Alexis Comar
AU - Benoit de Solan
AU - Frédéric Baret
Y1 - 2022/01/01
UR - http://biorxiv.org/content/early/2022/03/25/2021.04.27.441631.abstract
N2 - Progress in agronomy relies on accurate measurement of the experiments conducted to improve yield components. Measurement of plant density is required for a number of applications since it drives part of the crop's fate. The standard manual measurements in the field could be efficiently replaced by high-throughput techniques based on high-spatial-resolution images taken from UAVs. This study compares several automated methods for detecting individual plants in images, from which plant density can be estimated. It is based on a large dataset of high-resolution Red/Green/Blue (RGB) images acquired from Unmanned Aerial Vehicles (UAVs) over several years and experiments on maize, sugar beet, and sunflower crops at early stages. A total of 16,247 plants were labelled interactively on the images. The performance of a handcrafted (HC) method was compared to that of deep learning (DL). The HC method consists of segmenting the image into green and background pixels, identifying rows, and then identifying objects corresponding to plants using knowledge of the sowing pattern as prior information. The DL method is based on the Faster Region-based Convolutional Neural Network (Faster R-CNN) model trained on 2/3 of the images, selected to represent a good balance between plant development stages and sessions. One model was trained for each crop. Results show that simple DL methods generally outperform simple HC methods, particularly for maize and sunflower crops. Significant variability in plant detection performance was observed between experiments. This was explained by variability in image acquisition conditions, including illumination, plant development stage, background complexity, and weed infestation. Image quality determines part of the performance of HC methods, as poor quality makes the segmentation step more difficult. The performance of DL methods is limited mainly by the presence of weeds. A hybrid method (HY) was proposed to eliminate weeds between the rows using the rules developed for the HC method. HY slightly improves DL performance in cases of high weed infestation. When a few images corresponding to the conditions of the testing dataset were added to the DL training dataset, a drastic increase in performance was observed for all crops, with relative RMSE below 5% for the estimation of plant density.
ER -