Colloquium

Interactive Ground Truthing for Detecting Coconut Trees Based on Very High Resolution Imagery: A case study on Tongatapu island in the Kingdom of Tonga

Organised by Laboratory of Geo-information Science and Remote Sensing
Date

Mon 25 June 2018 09:00 to 09:30

Venue Gaia, building number 101
Droevendaalsesteeg 3
6708 PB Wageningen
+31 (0) 317 - 48 17 00
Room 1

By Ping Zhou (China)

Summary
Coconut trees are a primary source of food and income in many South Pacific countries that are frequently struck by typhoons. To help local communities recover from such disasters in a timely manner, post-disaster assessment compares the number of coconut trees before and after a typhoon, so being able to quantify the number of coconut trees is valuable. This has usually been done manually, which is inefficient. My research proposes an automatic way of detecting coconut trees in very high-resolution aerial imagery using machine learning. Experiments were carried out on optical imagery acquired over Tongatapu island in the Kingdom of Tonga.

The work is distinguished by three key contributions. Firstly, several popular image feature descriptors and classifiers were studied and tested for coconut tree detection: dense SIFT (Scale-Invariant Feature Transform) bag-of-visual-words features performed best, and the linear-SVM binary classifier obtained the highest accuracy while requiring very little training time. Secondly, the traditional annotation method was compared with an interactive annotation strategy; the results showed that the interactive approach helped the user annotate more efficiently and achieved better detection accuracy with fewer training samples. Thirdly, I developed a QGIS plug-in that lets the user interactively annotate objects, train a model, and then detect coconut trees in an aerial image. Finally, I tested the interactive annotation method on an independent aerial image with 308 coconut-tree and 307 non-coconut annotations, where the proposed system obtained a recall of 0.71, a precision of 0.74 and an F-score of 0.72.
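
The code below is a minimal sketch, not the thesis implementation, of the dense-SIFT bag-of-visual-words features and linear-SVM classifier described above. It assumes OpenCV's SIFT and scikit-learn's MiniBatchKMeans and LinearSVC; the sampling step, vocabulary size, patch handling and function names are illustrative assumptions rather than the parameters used in the study.

    # Sketch of a dense-SIFT bag-of-visual-words + linear-SVM detector
    # (illustrative only; STEP and VOCAB_SIZE are assumed values)
    import cv2
    import numpy as np
    from sklearn.cluster import MiniBatchKMeans
    from sklearn.svm import LinearSVC
    from sklearn.metrics import precision_recall_fscore_support

    STEP, VOCAB_SIZE = 8, 200          # dense sampling step (pixels) and codebook size
    sift = cv2.SIFT_create()

    def dense_sift(patch):
        """Compute SIFT descriptors on a regular grid over a grayscale patch."""
        h, w = patch.shape
        keypoints = [cv2.KeyPoint(float(x), float(y), float(STEP))
                     for y in range(STEP // 2, h, STEP)
                     for x in range(STEP // 2, w, STEP)]
        _, desc = sift.compute(patch, keypoints)
        return desc

    def bovw_histogram(desc, codebook):
        """Quantise descriptors against the codebook; return a normalised histogram."""
        words = codebook.predict(desc)
        hist, _ = np.histogram(words, bins=np.arange(VOCAB_SIZE + 1))
        return hist / max(hist.sum(), 1)

    def train(patches, labels):
        """patches: list of grayscale image patches; labels: 1 = coconut tree, 0 = other."""
        descs = [dense_sift(p) for p in patches]
        codebook = MiniBatchKMeans(n_clusters=VOCAB_SIZE, random_state=0)
        codebook.fit(np.vstack(descs))
        X = np.array([bovw_histogram(d, codebook) for d in descs])
        clf = LinearSVC(C=1.0).fit(X, labels)
        return codebook, clf

    def evaluate(codebook, clf, patches, labels):
        """Report precision, recall and F-score on an independent annotated set."""
        X = np.array([bovw_histogram(dense_sift(p), codebook) for p in patches])
        p, r, f, _ = precision_recall_fscore_support(labels, clf.predict(X),
                                                     average='binary')
        return p, r, f

In this sketch, candidate patches would be cut from the aerial image around each annotation, the codebook and SVM trained on them, and the evaluate function would yield the precision, recall and F-score figures reported above for an independent test image.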