This is the first work in the literature to split tracking and mapping into two separate tasks.
Abstract
This paper presents a method of estimating camera pose in an unknown scene. While this has previously been attempted by adapting SLAM algorithms developed for robotic exploration, we propose a system specifically designed to track a hand-held camera in a small AR workspace. We propose to split tracking and mapping into two separate tasks, processed in parallel threads on a dual-core computer: one thread deals with the task of robustly tracking erratic hand-held motion, while the other produces a 3D map of point features from previously observed video frames. This allows the use of computationally expensive batch optimization techniques not usually associated with real-time operation. The result is a system that produces detailed maps with thousands of landmarks which can be tracked at frame-rate, with an accuracy and robustness rivaling that of state-of-the-art model-based systems.
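The two-thread split described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the `Map` class, the keyframe-selection heuristic, and the stubbed-out pose update and batch optimization are all assumptions standing in for PTAM's actual components (FAST features, five-point initialization, bundle adjustment).

```python
import threading
import queue
import time

class Map:
    """Shared 3D map of point features (minimal stand-in)."""
    def __init__(self):
        self.lock = threading.Lock()
        self.points = []     # landmark positions
        self.keyframes = []  # frames kept for batch optimization

def mapping_thread(world, keyframe_queue, stop):
    """Background thread: integrates keyframes into the map.
    In PTAM this is where expensive bundle adjustment runs,
    safely off the tracking thread; here it is a stub."""
    while not stop.is_set():
        try:
            kf = keyframe_queue.get(timeout=0.05)
        except queue.Empty:
            continue
        with world.lock:
            world.keyframes.append(kf)
            world.points.extend(kf["features"])  # pretend triangulation

def track_frames(frames, world, keyframe_queue):
    """Foreground loop: tracks every frame against the current map,
    promoting an occasional frame to a keyframe."""
    for i, frame in enumerate(frames):
        with world.lock:
            _ = len(world.points)  # read-only use of the shared map
        # Real pose estimation against the map would happen here (stubbed).
        if i % 5 == 0:  # simple keyframe heuristic (assumption, not PTAM's)
            keyframe_queue.put(frame)

# Wire the two threads together, mirroring the paper's split.
world = Map()
kf_queue = queue.Queue()
stop = threading.Event()
mapper = threading.Thread(target=mapping_thread, args=(world, kf_queue, stop))
mapper.start()
frames = [{"id": i, "features": [(i, j) for j in range(3)]} for i in range(20)]
track_frames(frames, world, kf_queue)
time.sleep(0.3)  # let the mapper drain the queue
stop.set()
mapper.join()
```

The point of the design is visible even in this toy: the tracker only takes the lock briefly to read the map, so it can keep up with frame-rate input, while the mapper is free to spend arbitrary time on batch optimization between keyframes.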
Reference
Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2007).
Active Vision Laboratory, Department of Engineering Science, University of Oxford
Scaramuzza, Davide, and Friedrich Fraundorfer. "Visual odometry: Part I: The first 30 years and fundamentals." IEEE Robotics & Automation Magazine 18.4 (2011): 80-92.
Fraundorfer, Friedrich, and Davide Scaramuzza. "Visual odometry: Part II: Matching, robustness, optimization, and applications." IEEE Robotics & Automation Magazine 19.2 (2012): 78-90.
[Introduction to SLAM]
Durrant-Whyte, Hugh, and Tim Bailey. "Simultaneous localization and mapping: Part I." IEEE Robotics & Automation Magazine 13.2 (2006): 99-110.
Bailey, Tim, and Hugh Durrant-Whyte. "Simultaneous localization and mapping (SLAM): Part II." IEEE Robotics & Automation Magazine 13.3 (2006): 108-117.
2016-02-16