Abstract. Many common infrared outside-in 6-DOF pose tracking configurations use cameras mounted rigidly to the environment. In such a setup, tracking is inherently impossible for IR targets inside, below, or behind other opaque objects (the occlusion problem). We present a solution that integrates an additional, mobile IR tracking system to overcome this problem. The solution consists of an indirect tracking setup in which the stationary cameras track the mobile cameras, which in turn track the target. Accuracy problems inherent to such an indirect tracking setup are tackled by an error correction mechanism based on reference points in the scene that are known to both tracking systems. An evaluation demonstrates that, in naive indirect tracking without error correction, the major source of error is incorrect estimation of the orientation of the mobile system, and that this error source can be practically eliminated by our error correction mechanism.

Keywords: Augmented Reality, indirect tracking, sensor fusion, absolute orientation problem.
Motivation

One of the most common tracking setups for AR and VR applications today is an outside-in configuration with a number of infrared cameras mounted rigidly to the environment, observing a fixed volume in their midst. The camera arrangement imposes restrictions on tracking movable objects inside, below, or behind other opaque objects in the scene. We call this the occlusion problem. It is not generally solvable by adding further cameras to the classical outside-in setup: first, occlusions generated by scene objects cannot always be known in advance, and second, the scene may offer only small and varying viewing angles to the outside, which cannot simply be covered by a few more cameras. This is especially true for trackable objects surrounded by other objects, e.g. a tool inside a car body.

Our indirect tracking approach adds an additional, mobile IR tracking system which can be placed in the scene on the fly such that it can see trackable objects that are hidden from the stationary cameras. The mobile setup itself is equipped with a marker so that its pose can be tracked by the stationary setup (see Figure 1(a)). Figure 1(b) shows what the proposed solution looks like in AR stud welding, one of our industrial scenarios, where the stud-welding gun has to be tracked inside the car body so that navigational information about the next welding position can be shown on a display attached to the welding gun. This application is already in productive use, but until now it has suffered from the restrictions of classical outside-in tracking described above.
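The core of the indirect setup is a chain of two rigid-body transforms: the stationary system reports the pose of the mobile rig's marker in world coordinates, the mobile system reports the pose of the hidden target in its own coordinates, and composing the two yields the target pose in world coordinates. The following sketch (all pose values, names, and the 1-degree perturbation are illustrative assumptions, not values from the paper) also shows why a small orientation error of the mobile rig is the dominant error source: it is amplified by the lever arm between the mobile rig and the target.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    """Rotation about the z-axis by `deg` degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Pose of the mobile rig's marker as seen by the stationary cameras
# (illustrative values).
T_stat_mobile = pose(rot_z(90), [1.0, 0.0, 0.0])

# Pose of the hidden target as seen by the mobile cameras (illustrative values).
T_mobile_target = pose(rot_z(-30), [0.0, 0.5, 0.2])

# Chained (indirect) estimate: target pose in stationary-system coordinates.
T_stat_target = T_stat_mobile @ T_mobile_target

# A 1-degree error in the mobile rig's estimated orientation...
T_stat_mobile_err = pose(rot_z(91), [1.0, 0.0, 0.0])
T_stat_target_err = T_stat_mobile_err @ T_mobile_target

# ...is amplified into a position error at the target proportional to the
# lever arm (here ~0.54 m), roughly lever_arm * sin(1 deg) ~ 9 mm.
offset = np.linalg.norm(T_stat_target_err[:3, 3] - T_stat_target[:3, 3])
```

The error correction via shared reference points attacks exactly this term: by re-estimating the mobile rig's orientation against points known to both systems, the amplified offset largely vanishes.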