ROW-SLAM: Under-Canopy Cornfield Semantic SLAM [preprint]

Preprint date

September 15, 2021

Authors

Jiacheng Yuan, Jungseok Hong (Ph.D. student), Junaed Sattar (assistant professor), Volkan Isler (professor)

Abstract

We study a semantic SLAM problem faced by a robot tasked with autonomous weeding under the corn canopy. The goal is to detect corn stalks and localize them in a global coordinate frame. This is a challenging setup for existing algorithms because there is very little space between the camera and the plants, and the camera motion is primarily restricted to movement along the row. To overcome these challenges, we present a multi-camera system in which a side camera (facing the plants) is used for detection, while the front and back cameras are used for motion estimation. Next, we show how semantic features in the environment (corn stalks, ground, and crop planes) can be used to develop a robust semantic SLAM solution, and we present results from field trials performed throughout the growing season across various cornfields.
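The abstract describes the camera roles only at a high level. The Python sketch below illustrates one way such a layout could be wired together: the front/back cameras supply a pose estimate, and detections from the plant-facing side camera are placed into the global frame through that pose and the side camera's extrinsic calibration. This is a minimal sketch under assumed conventions, not the authors' implementation; all identifiers (CameraRole, StalkDetection, to_global_frame, T_world_base, T_base_side) are hypothetical.

```python
# Illustrative sketch only: not the ROW-SLAM implementation.
from dataclasses import dataclass
from enum import Enum
import numpy as np


class CameraRole(Enum):
    FRONT = "front"   # used for motion estimation (hypothetical role label)
    BACK = "back"     # used for motion estimation (hypothetical role label)
    SIDE = "side"     # plant-facing camera used for corn stalk detection


@dataclass
class StalkDetection:
    point_side_cam: np.ndarray  # 3D stalk point expressed in the side-camera frame


def to_global_frame(det: StalkDetection,
                    T_world_base: np.ndarray,
                    T_base_side: np.ndarray) -> np.ndarray:
    """Map a side-camera stalk detection into the global frame.

    T_world_base: 4x4 robot pose, e.g. from front/back-camera motion estimation.
    T_base_side:  4x4 extrinsic calibration of the side camera on the robot.
    """
    p = np.append(det.point_side_cam, 1.0)       # homogeneous coordinates
    return (T_world_base @ T_base_side @ p)[:3]  # stalk position in the world frame


if __name__ == "__main__":
    # Example: robot 2 m along the row, side camera offset 0.3 m from the base.
    T_world_base = np.eye(4); T_world_base[0, 3] = 2.0
    T_base_side = np.eye(4);  T_base_side[1, 3] = 0.3
    det = StalkDetection(point_side_cam=np.array([0.0, 0.0, 0.5]))
    print(to_global_frame(det, T_world_base, T_base_side))  # -> [2.0, 0.3, 0.5]
```

In a full pipeline, each localized stalk would additionally be associated with previous detections and refined jointly with the robot trajectory; the sketch stops at the coordinate-frame bookkeeping the abstract implies.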

Link to full paper

ROW-SLAM: Under-Canopy Cornfield Semantic SLAM

Keywords

robotics, agriculture
