News

Drone monitoring of insect traps

April 20, 2020

In a collaboration with the Zurich University of Applied Sciences and the University of Aberdeen, Peter Roosjen and his colleagues at the Unmanned Aerial Remote Sensing Facility (UARSF) of WUR studied the potential of UAVs for monitoring insect traps.

By combining deep learning and object detection, we were able to detect and count fruit flies (specifically Drosophila suzukii) in images of sticky traps collected by a UAV. Besides detecting D. suzukii flies among the other insects that were present in the traps, the detection model was also able to distinguish the sex of the trapped flies: males, with their distinctive black dots on the wings, were separated relatively well from females, which lack these dots. The results of this study are encouraging for the future development of autonomous UAV-based monitoring systems, not just for D. suzukii but also for other insects. The insect counts can be integrated in decision support systems and can form a vital component of integrated pest management strategies.
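As an illustration of what such a detection-and-counting pipeline can look like (a sketch only, not the code used in the study), the example below builds a Faster R-CNN detector with a ResNet-18 backbone in PyTorch/torchvision (version 0.13 or later assumed) and counts the detections per sex class in a single trap image. The class indices, score threshold and helper names are placeholders chosen for the example.

import torch
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# Hypothetical class mapping for this sketch; index 0 is reserved for background.
CLASSES = {1: "SWD male", 2: "SWD female"}

def build_detector() -> FasterRCNN:
    # ResNet-18 feature extractor with an FPN, plugged into torchvision's Faster R-CNN.
    backbone = resnet_fpn_backbone(backbone_name="resnet18", weights=None)
    return FasterRCNN(backbone, num_classes=len(CLASSES) + 1)  # +1 for background

@torch.no_grad()
def count_flies(model: FasterRCNN, image: torch.Tensor, score_thr: float = 0.5) -> dict:
    # Run the detector on one trap photo (CHW float tensor in [0, 1]) and
    # count the detections per class that score above the threshold.
    model.eval()
    prediction = model([image])[0]            # dict with boxes, labels, scores
    keep = prediction["scores"] >= score_thr
    counts = {name: 0 for name in CLASSES.values()}
    for label in prediction["labels"][keep].tolist():
        counts[CLASSES[label]] += 1
    return counts

if __name__ == "__main__":
    model = build_detector()                  # untrained weights, shape check only
    dummy_trap_photo = torch.rand(3, 800, 800)
    print(count_flies(model, dummy_trap_photo))

The study itself used a ResNet-18-based network (see the abstract below); any detector that returns per-class boxes with confidence scores would slot into the same counting step.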
The results of the study are published in a paper entitled ‘Deep learning for automated detection of Drosophila suzukii–Potential for UAV‐based monitoring’ in the journal Pest Management Science.

doi:10.1002/ps.5845

Abstract:
The fruit fly Drosophila suzukii, or spotted wing Drosophila (SWD), has become a serious pest world‐wide attacking many soft‐skinned fruits. An efficient monitoring system that identifies and counts SWD in crops and their surroundings is therefore essential for integrated pest management (IPM) strategies. Existing methods, such as catching them in liquid bait traps and manually counting them, are costly, time‐consuming, and labour‐intensive. To overcome these limitations, we studied insect trap monitoring through image‐based object detection with deep learning. Based on an image database with 4753 annotated SWD flies, we trained a ResNet‐18‐based deep convolutional neural network to detect and count them, including sex prediction and discrimination. The results show that SWD can be detected with an area under the precision recall curve (AUC) of 0.506 (female) and 0.603 (male), in digital images taken from a static position. For images collected by a flying unmanned aerial vehicle (UAV), the algorithm detected SWD individuals with an AUC of 0.086 (female) and 0.284 (male). The lower AUC for the aerial imagery was caused by lower image quality as a result of the stabilization manoeuvres of the UAV during image collection. Our results indicate that it is possible to monitor SWD with deep learning and object detection. Moreover, they demonstrate the potential for UAVs to monitor insect traps which could be valuable for the development of autonomous insect monitoring systems and IPM.
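For readers less familiar with the reported metric, the short sketch below shows how an area under the precision-recall curve can be computed from per-detection confidence scores and their match labels using scikit-learn. The numbers are made-up illustration data, not results or code from the paper.

import numpy as np
from sklearn.metrics import auc, precision_recall_curve

# 1 = the detection matches an annotated fly of the class, 0 = false positive.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
scores = np.array([0.92, 0.80, 0.77, 0.71, 0.65, 0.55, 0.40, 0.31, 0.22, 0.10])

precision, recall, _ = precision_recall_curve(y_true, scores)
pr_auc = auc(recall, precision)               # trapezoidal area under the PR curve
print(f"area under the precision-recall curve: {pr_auc:.3f}")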