Detection Of Artificial Seed (I-Seed) From Aerial Imagery Using Deep Learning Models

Organised by Laboratory of Geo-information Science and Remote Sensing

Tue 26 April 2022 09:00 to 09:30

Venue Gaia, building number 101
Room 1

By Yanuar Adrian Bomantara

In the last two decades, Unmanned Aerial Vehicle (UAV) technology has been widely adopted as a fast, low-cost aerial survey method. It has been used in various domains, including livestock monitoring, bird control, and object detection with deep learning. Recently, a unique system of self-deployable, biodegradable microrobots, termed I-Seeds, was introduced, focusing mainly on sensing air and topsoil environmental parameters. These robots resemble a samara (winged achene) and are only 4 cm long and 1 cm wide. The study of seeds is highly beneficial because plants survive and propagate through the natural, morphological dispersal abilities of seeds, which require no additional internal energy source. Hence, biomimicry of seeds could prove crucial for the development of environmental sensing in the future. This research focuses on detecting I-Seeds in UAV images in real-time scenarios. In order to read the environmental parameters from I-Seeds, it is essential to geolocalize them. For this task, UAV RGB imagery and deep learning have been investigated. Since numerous models have been applied to small-object detection over the years, the primary challenge is to select a model that can detect I-Seed objects efficiently and accurately. Recently, a popular object detection model, YOLO (You Only Look Once), has been applied in many studies owing to its fast and accurate performance. In this research, I-Seeds are detected using four variants of the YOLO version 5 (YOLOv5) model: nano (YOLOv5n), small (YOLOv5s), medium (YOLOv5m), and large (YOLOv5l). The main difference between these variants is the number of layers: the larger variants contain more layers, which increases the number of parameters and, hence, slows inference.
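Whether a predicted bounding box counts as a correct detection is conventionally decided by its intersection over union (IoU) with a ground-truth box; the 0.5 in mAP(0.5) is exactly this threshold. The following is a minimal illustrative sketch, not code from the thesis; the function name and the (x1, y1, x2, y2) box format are assumptions.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Corners of the overlapping rectangle (empty if boxes are disjoint).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction is a true positive at the mAP(0.5) threshold when it
# overlaps an unmatched ground-truth box with IoU >= 0.5.
print(iou((0, 0, 10, 10), (0, 0, 10, 5)))  # 0.5
```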
Afterward, experiments are performed on full-resolution images using the most accurate model variant together with Slicing Aided Hyper Inference (SAHI), evaluated with standard metrics such as inference speed and mean average precision (mAP). The results show that the YOLOv5n variant has the highest mAP(0.5) value and is far superior to all other YOLOv5 variants in terms of inference speed. The YOLOv5n model is also assessed under several conditions, such as natural illumination, flight altitude, background type, and parameter tuning. After model training, it was determined that the best conditions for detecting I-Seed Blue were a flight altitude of 4 m, an overcast day, and a concrete background, which yielded accuracies of 0.91, 0.90, and 0.99, respectively. YOLOv5n outperformed the other models, achieving a mAP(0.5) score of 84.6% on the validation set and 83.2% on the test set. When this model was tested on different backgrounds, it performed worst on grass, with a mAP(0.5) score of only 0.29; on soil it achieved 0.86, and it performed best on concrete, with 0.96. Backgrounds containing items such as plants or litter can prevent an I-Seed from lying flat and can obscure it in the aerial view, both of which reduce detection performance. This is a limitation of using UAV RGB imagery for I-Seed detection. This study can serve as a baseline for detecting I-Seeds under the tested conditions in further studies.
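The reported mAP(0.5) scores rest on the average-precision computation: detections are ranked by confidence, marked as true or false positives against the ground truth, and precision is integrated over recall. As a minimal illustrative sketch (all names are assumptions, not code or data from the thesis), an all-point-interpolated AP for a single class can be computed as follows:

```python
def average_precision(detections, num_gt):
    """All-point-interpolated AP for one class.

    detections: list of (confidence, is_true_positive) pairs.
    num_gt:     number of ground-truth objects (e.g. I-Seeds in the scene).
    """
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    tp = fp = 0
    points = []  # (recall, precision) as detections accumulate
    for _, is_tp in detections:
        tp += is_tp
        fp += not is_tp
        points.append((tp / num_gt, tp / (tp + fp)))
    # Make the precision envelope non-increasing from right to left.
    for i in range(len(points) - 2, -1, -1):
        points[i] = (points[i][0], max(points[i][1], points[i + 1][1]))
    # Sum precision times each recall step (area under the curve).
    ap, prev_recall = 0.0, 0.0
    for recall, precision in points:
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

# Hypothetical example: three detections, two ground-truth objects.
ap = average_precision([(0.9, True), (0.8, False), (0.7, True)], num_gt=2)
print(round(ap, 3))  # 0.833
```

The mAP is then the mean of such per-class AP values; with a single I-Seed class, mAP(0.5) reduces to the AP at the 0.5 IoU threshold.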