Colloquium

Remote sensing applications of stereoscopic visual SLAM in an orchard

Organised by Laboratory of Geo-information Science and Remote Sensing
Date

Tue 26 April 2022 08:30 to 09:00

Venue Gaia, building number 101
Droevendaalsesteeg 3
6708 PB Wageningen
+31 (0) 317 - 48 17 00
Room 1

By Jurrian Doornbos

Abstract
This report compares 3D point clouds acquired from visual Simultaneous Localization and Mapping (SLAM) using a handheld stereoscopic camera with 3D point clouds acquired from a drone using Structure from Motion (SfM). Point clouds can be used to acquire phenotype information in an agricultural context; phenotype information is useful for disease detection and for comparison between experimental crop stands. An orchard at Wageningen University and Research was used as the basis of the comparison, which covered set-up times, data acquisition methods, resulting location accuracy, and an assessment of the point clouds themselves.

Regarding set-up and methodology, visual SLAM runs in real time and produces point clouds instantly, whereas SfM requires multiple hours of post-processing. From start to finish, SfM is easier to use, while visual SLAM requires more effort to set up beforehand. The differences between the point clouds are twofold. First, SfM produces many more points, which helps to show detail. However, SfM captured few branches, whereas visual SLAM produced more points within the branches. Second, the overall lack of points in the visual SLAM output makes its point clouds less useful for deeper phenotyping analysis.

Visual SLAM does not produce point clouds dense enough for phenotyping applications. However, the real-time nature of SLAM makes it well suited for close-range drone systems. Such systems can acquire high-resolution imagery from a short distance, which in turn yields better imagery for use in SfM.
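
For illustration, a minimal sketch of such a point cloud assessment is given below, assuming both reconstructions are exported as PLY files in the same metric coordinate frame. The file names are placeholders and Open3D is used here as one possible library; this is not the toolchain used in the work itself.

```python
# Minimal sketch (not the thesis code) of a basic point cloud assessment:
# compare point counts, rough density, and cloud-to-cloud distance between a
# visual SLAM and an SfM reconstruction. File names are illustrative.
import numpy as np
import open3d as o3d

slam_cloud = o3d.io.read_point_cloud("orchard_slam.ply")  # hypothetical path
sfm_cloud = o3d.io.read_point_cloud("orchard_sfm.ply")    # hypothetical path

for name, cloud in [("visual SLAM", slam_cloud), ("SfM", sfm_cloud)]:
    extent = cloud.get_axis_aligned_bounding_box().get_extent()
    n_points = len(cloud.points)
    density = n_points / np.prod(extent)  # points per cubic metre of bounding box
    print(f"{name}: {n_points} points, {density:.1f} points/m^3")

# Cloud-to-cloud distances show where one reconstruction lacks coverage,
# e.g. branches present in the SLAM cloud but missing in the SfM cloud.
distances = np.asarray(slam_cloud.compute_point_cloud_distance(sfm_cloud))
print(f"mean SLAM-to-SfM distance: {distances.mean():.3f} m")
```

Such a comparison only makes sense if the two clouds are co-registered and expressed in the same units; in practice the clouds would first be aligned (for example with a coarse manual transform followed by ICP) before distances are interpreted.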