Mapping, navigation, and learning for off‐road traversal
Abstract
The challenge in the DARPA Learning Applied to Ground Robots (LAGR) project is to autonomously navigate a small robot using stereo vision as the main sensor. During this project, we demonstrated a complete autonomous system for off‐road navigation in unstructured environments. The system is very robust: we can typically give it a goal position several hundred meters away and expect it to get there. In this paper we describe the main components that comprise the system, including stereo processing, obstacle and free space interpretation, long‐range perception, online terrain traversability learning, visual odometry, map registration, planning, and control. At the end of 3 years, the system we developed outperformed all nine other teams in final blind tests over previously unseen terrain. © 2008 Wiley Periodicals, Inc.