Detection of artificial seeds from UAV imagery using deep learning models
Plant seeds are among the most striking examples of morphological computation in the natural world, with optimized features such as passive take-off, flight, landing, and soil drilling, to name a few.
Taking inspiration from plant seeds’ morphology and dispersal capabilities, the EU H2020 I-Seed project aims to develop biodegradable, self-deployable, soft miniaturized artificial seeds with fluorescence sensors (termed ‘I-Seeds’) for monitoring environmental parameters such as temperature, humidity, carbon dioxide, and mercury in topsoil and the air above the soil. Within the project, winged seeds such as the samara (length 4–5 cm, width 1–2 cm) are among the species selected for biomimicking. The artificial seeds will initially be dispersed using UAVs and will spread over the target area along natural-like pathways. They then need to be detected and localized in real time for the subsequent read-out of their sensors. For remote read-out of the in situ sensor information, an active laser-induced fluorescence observation system onboard a UAV will excite the I-Seeds, producing a fluorescence emission that is a function of the measured environmental parameters. One potential way to geo-localize these artificial seeds is through UAV RGB imagery.
In a recent paper, Yanuar Bomantara, Hasib Mustafa, Harm Bartholomeus, and Lammert Kooistra from the Laboratory of Geo-information and Remote Sensing have shown that these artificial seeds can be effectively detected from UAV RGB imagery, and they evaluated the effect of different environmental conditions during image acquisition on small object detection. Yanuar and his team spread artificial, coloured samara seeds over an experimental field with varying land cover, including concrete, soil, and grass, and acquired UAV images in the winter of 2021 on different days under sunny and cloudy skies at flying altitudes ranging from 4 m to 10 m. This experimental design allowed a quantitative evaluation of how environmental parameters during data acquisition affect the detection of small seed-like objects from an aerial perspective. With the aim of performing real-time object detection and geo-localization onboard the UAV, Yanuar implemented and evaluated a modified one-stage detector (YOLOv5nano + SAHI) to reduce processing time and computational load on full-sized images. The developed algorithm demonstrated good generalizability with no overfitting to the training set.
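The core idea behind combining a small detector with SAHI is to cut a full-resolution frame into overlapping tiles, so that tiny objects like seeds occupy proportionally more pixels in each detector input. The sketch below illustrates only that tiling step, not the SAHI library's actual API; the tile size and overlap ratio are illustrative values, not the settings used in the paper.

```python
def slice_image(width, height, tile=640, overlap=0.2):
    """Return (x0, y0, x1, y1) tiles of fixed size covering a width x height image.

    Adjacent tiles overlap by `overlap` so objects on a tile border are fully
    contained in at least one tile; edge tiles are shifted inward to keep the
    detector input size constant.
    """
    step = int(tile * (1 - overlap))  # stride between tile origins
    boxes = []
    for y0 in range(0, max(height - tile, 0) + step, step):
        for x0 in range(0, max(width - tile, 0) + step, step):
            x1 = min(x0 + tile, width)
            y1 = min(y0 + tile, height)
            # shift tiles that would run past the image edge back inside
            boxes.append((x1 - tile if x1 == width else x0,
                          y1 - tile if y1 == height else y0,
                          x1, y1))
    return boxes
```

Each tile would then be passed to the detector, and the per-tile detections merged back into full-image coordinates (with non-maximum suppression across tile overlaps), which is the remaining part of the sliced-inference pipeline.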
It was found that higher detection accuracy was attained under overcast conditions due to the clarity of the images. Flight altitude was a major determinant of detection performance, with a clear inverse relationship between altitude and model accuracy. Moreover, the amount of ground objects in the background largely determines model performance when detecting I-Seeds, followed by the contrast between the seed and the background. In essence, this work provides a baseline for seed-like object detection from aerial imagery using a one-stage detector, such that it can be implemented for the real-time detection of I-Seeds in the field during UAV flight. In the near future, the developed model will be implemented on a small form-factor computational unit onboard a UAV to assess real-time detection performance in comparison with offline processing.
In the last two decades, unmanned aerial vehicle (UAV) technology has been widely utilized as an aerial survey method. Recently, a unique system of self-deployable and biodegradable microrobots akin to winged achene seeds was introduced to monitor environmental parameters in the air above the soil interface, which requires geo-localization. This research focuses on detecting these artificial seed-like objects from UAV RGB images in real-time scenarios, employing the object detection algorithm YOLO (You Only Look Once). Three environmental parameters, namely daylight condition, background type, and flying altitude, were investigated to encompass varying data acquisition situations and their influence on detection accuracy. Artificial seeds were detected using four variants of the YOLO version 5 (YOLOv5) algorithm, which were compared in terms of accuracy and speed. The most accurate model variant was used in combination with Slicing Aided Hyper Inference (SAHI) on full-resolution images to evaluate the model’s performance. It was found that the YOLOv5n variant had the highest accuracy and fastest inference speed. After model training, the best conditions for detecting artificial seed-like objects were found at a flight altitude of 4 m, on an overcast day, and against a concrete background, obtaining accuracies of 0.91, 0.90, and 0.99, respectively. YOLOv5n outperformed the other models by achieving a mAP0.5 score of 84.6% on the validation set and 83.2% on the test set. This study can be used as a baseline for detecting seed-like objects under the tested conditions in future studies.
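For context on the mAP0.5 metric quoted above: a predicted box counts as a true positive only when its intersection-over-union (IoU) with a ground-truth box is at least 0.5. A minimal sketch of that matching criterion follows; the box coordinates are illustrative, not taken from the paper's dataset.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x0, y0, x1, y1)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

# A detection covering only half of a ground-truth seed box:
# iou((0, 0, 10, 10), (5, 0, 15, 10)) == 1/3, which is below the 0.5
# threshold, so it would count as a false positive under mAP0.5.
```

mAP0.5 then averages the precision over recall levels (and over classes) using this 0.5-IoU matching rule.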