This is the first work in the literature to split tracking and mapping into two separate tasks.
This paper presents a method of estimating camera pose in an unknown scene. While this has previously been attempted by adapting SLAM algorithms developed for robotic exploration, we propose a system specifically designed to track a hand-held camera in a small AR workspace. We propose to split tracking and mapping into two separate tasks, processed in parallel threads on a dual-core computer: one thread deals with the task of robustly tracking erratic hand-held motion, while the other produces a 3D map of point features from previously observed video frames. This allows the use of computationally expensive batch optimization techniques not usually associated with real-time operation: The result is a system that produces detailed maps with thousands of landmarks which can be tracked at frame-rate, with an accuracy and robustness rivaling that of state-of-the-art model-based systems.
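The core architectural idea, tracking and mapping running in parallel threads over a shared map of point features, can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: `SharedMap`, the keyframe queue, and the stubbed pose/triangulation steps are all hypothetical stand-ins for the real components (FAST feature tracking and bundle adjustment in PTAM).

```python
import threading
import queue

class SharedMap:
    """3D map of point features shared between the two threads (hypothetical)."""
    def __init__(self):
        self._lock = threading.Lock()
        self._points = []                 # stand-in for triangulated landmarks

    def add_points(self, pts):
        with self._lock:
            self._points.extend(pts)

    def snapshot(self):
        with self._lock:
            return list(self._points)

def run(n_frames=20, keyframe_stride=5):
    world_map = SharedMap()
    keyframes = queue.Queue()
    poses = []

    def track():
        # Fast per-frame loop: estimate camera pose against the current
        # map (stubbed here), occasionally handing a keyframe to the mapper.
        for frame in range(n_frames):
            visible = world_map.snapshot()
            poses.append((frame, len(visible)))   # stand-in for a pose estimate
            if frame % keyframe_stride == 0:
                keyframes.put(frame)
        keyframes.put(None)                       # tell the mapper to stop

    def build_map():
        # Slow loop: expensive batch optimization over keyframes,
        # stubbed as triangulating a few new points per keyframe.
        while (kf := keyframes.get()) is not None:
            world_map.add_points([(kf, i) for i in range(3)])

    t = threading.Thread(target=track)
    m = threading.Thread(target=build_map)
    t.start(); m.start()
    t.join(); m.join()
    return poses, world_map.snapshot()
```

The point of the split is visible in the stubs: the tracker never blocks on mapping, it only takes a cheap locked snapshot of whatever map exists, while the mapper is free to spend arbitrary time on batch optimization between keyframes.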
Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2007).
Active Vision Laboratory, Department of Engineering Science, University of Oxford