Paper | 23 May 2011
Fusion of visual odometry and inertial data for enhanced real-time egomotion estimation
V. E. Perlin, D. B. Johnson, M. M. Rohde, R. E. Karlsen
Abstract
This paper presents a real-time motion estimation module for ground vehicles based on the fusion of monocular visual odometry and low-cost inertial measurement unit data. The system features a novel algorithmic scheme enabling accurate and robust scale estimation and odometry at high speeds. Results of multiple performance characterization experiments (on rough terrain at speeds up to 20 mph and on smooth roadways at speeds up to 75 mph) are presented. The prototype system demonstrates high precision (relative distance error below 1% overall, below 0.5% on paved roads, and a yaw drift rate of roughly 2 degrees per km) across multiple configurations, including various optics and vehicles. Performance limitations, including those specific to monocular vision, are analyzed, and directions for further improvement are outlined.
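The core difficulty the abstract alludes to is that monocular visual odometry recovers translation only up to an unknown scale, while an IMU provides metric motion information. One common way to fuse the two (a minimal sketch only; the paper's actual algorithm is not described in the abstract, and the function and variable names below are hypothetical) is to solve a least-squares fit between the up-to-scale visual translation magnitudes and the metric displacements integrated from inertial data:

```python
import numpy as np

def estimate_scale(vo_translations, imu_speeds, dt):
    """Estimate the global metric scale of monocular visual odometry.

    vo_translations: per-frame translation vectors from VO (unknown scale).
    imu_speeds: per-frame speed estimates integrated from IMU data (m/s).
    dt: frame interval in seconds.

    Returns the scale s minimizing || s * |t_vo| - d_imu ||^2.
    """
    vo_mag = np.array([np.linalg.norm(t) for t in vo_translations])
    imu_disp = np.asarray(imu_speeds) * dt  # metric per-frame displacement
    # Closed-form least-squares solution for a single scalar scale factor.
    return float(vo_mag @ imu_disp / (vo_mag @ vo_mag))

# Example: VO reports translations that are 1/10 of the true metric motion,
# so the recovered scale should be 10.
vo = [np.array([0.1, 0.0, 0.0]),
      np.array([0.0, 0.2, 0.0]),
      np.array([0.15, 0.0, 0.0])]
speeds = [1.0, 2.0, 1.5]  # m/s, one frame per second
scale = estimate_scale(vo, speeds, dt=1.0)  # -> 10.0
```

A real system would estimate the scale recursively (e.g. in a filter) and gate the update on vehicle excitation, since the scale is unobservable when the vehicle is stationary; this batch form only illustrates the fusion principle.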
© (2011) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
V. E. Perlin, D. B. Johnson, M. M. Rohde, and R. E. Karlsen "Fusion of visual odometry and inertial data for enhanced real-time egomotion estimation", Proc. SPIE 8045, Unmanned Systems Technology XIII, 80450K (23 May 2011); https://doi.org/10.1117/12.883499
CITATIONS: Cited by 3 scholarly publications.
KEYWORDS
Visualization
Cameras
Error analysis
Global Positioning System
Data fusion
Motion estimation
Roads