1-Point RANSAC for extended Kalman filtering: Application to real-time structure from motion and visual odometry
Article first published online: 24 MAY 2010
Copyright © 2010 Wiley Periodicals, Inc.
Journal of Field Robotics
Special Issue: Visual Mapping and Navigation Outdoors
Volume 27, Issue 5, pages 609–631, September/October 2010
How to Cite
Civera, J., Grasa, O. G., Davison, A. J. and Montiel, J. M. M. (2010), 1-Point RANSAC for extended Kalman filtering: Application to real-time structure from motion and visual odometry. J. Field Robotics, 27: 609–631. doi: 10.1002/rob.20345
- Issue published online: 3 AUG 2010
- Manuscript Accepted: 24 APR 2010
- Manuscript Received: 26 OCT 2009
Random sample consensus (RANSAC) has become one of the most successful techniques for robust estimation from a data set that may contain outliers. It works by constructing model hypotheses from random minimal data subsets and evaluating their validity from the support of the whole data set. In this paper we present a novel combination of RANSAC and the extended Kalman filter (EKF) that uses the prior probabilistic information available from the EKF in the RANSAC model hypothesis stage. This allows the minimal sample size to be reduced to one, yielding large computational savings without loss of discriminative power. 1-Point RANSAC is shown to outperform the joint compatibility branch and bound (JCBB) algorithm, a gold-standard technique for spurious-match rejection within the EKF framework, in both accuracy and computational cost. Two visual estimation scenarios are used in the experiments: first, six-degree-of-freedom (DOF) motion estimation from a monocular sequence (structure from motion). Here, a new method for benchmarking six-DOF visual estimation algorithms based on the use of high-resolution images is presented, validated, and used to show the superiority of 1-Point RANSAC. Second, we demonstrate long-term robot trajectory estimation combining monocular vision and wheel odometry (visual odometry). Here, a comparison against the Global Positioning System (GPS) shows accuracy comparable to state-of-the-art visual odometry methods.
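The core idea in the abstract — hypothesize from a single datum using the EKF prior, then score the hypothesis by its support over all data — can be illustrated with a deliberately minimal toy sketch. This is not the paper's full algorithm (which operates on a 6-DOF camera state and also rescues high-innovation inliers after a partial update); it uses a scalar state with the trivial measurement model z = x + noise, and all function and variable names here are hypothetical.

```python
import random

def ekf_update_1d(x, P, z, R):
    """Scalar EKF update for the toy measurement model z = x + noise (H = 1)."""
    K = P / (P + R)              # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

def one_point_ransac(x_prior, P_prior, zs, R, thresh, n_iters=20, rng=None):
    """1-point RANSAC sketch: each hypothesis is an EKF update driven by a
    single randomly chosen measurement; support is counted over all data."""
    rng = rng or random.Random(0)
    best_inliers = []
    for _ in range(n_iters):
        z = rng.choice(zs)                          # 1-point minimal sample
        x_hyp, _ = ekf_update_1d(x_prior, P_prior, z, R)
        inliers = [zi for zi in zs if abs(zi - x_hyp) < thresh]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Final EKF update using only the supporting (low-innovation) measurements.
    x, P = x_prior, P_prior
    for z in best_inliers:
        x, P = ekf_update_1d(x, P, z, R)
    return x, P, best_inliers

# Toy data: true state 5.0, twenty inliers with small noise, three gross outliers.
rng = random.Random(42)
zs = [5.0 + rng.gauss(0, 0.1) for _ in range(20)] + [50.0, -30.0, 17.0]
x_est, P_est, inliers = one_point_ransac(x_prior=4.0, P_prior=1.0,
                                         zs=zs, R=0.01, thresh=0.5, rng=rng)
```

Because the EKF prior already constrains the state, a single measurement suffices to instantiate a hypothesis, which is what lets the minimal sample size drop to one; a classical prior-free RANSAC would need as many data as the model has degrees of freedom.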