Advanced tracking through efficient image processing and visual-inertial sensor fusion
Gabriele Bleser, Didier Stricker
Article

Abstract:
This article presents a new visual-inertial tracking device for augmented and virtual reality applications and addresses two fundamental issues of such systems. The first concerns the definition and modelling of the sensor fusion problem. Much work has been conducted in this area, and several models for exploiting gyroscopes and linear accelerometers have been proposed. However, the respective advantages of each model, and in particular the benefits of integrating accelerometer data into the filter, are still unclear. A comparison of different models is therefore provided, with a special investigation of the effect of accelerometer use on tracking performance. The second contribution is an image processing approach that does not require special landmarks but uses natural features. The solution relies on a 3D model of the scene, which is used to predict the appearance of the features by rendering the model based on data from the sensor fusion algorithm. Feature localisation is robust and accurate, mainly because the local lighting is also estimated. The final system is evaluated with the help of ground truth and real data. High stability and accuracy are demonstrated, even for large environments.
Keywords:
Feature tracking (KLT), Kalman filtering, inertial sensor
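
The specific fusion models compared in the article are not reproduced here, but the following sketch illustrates the kind of state prediction such a visual-inertial filter performs: a strapdown step that integrates gyroscope and accelerometer readings into an orientation, velocity and position estimate between camera measurements. All function names, the state layout and the gravity-compensation convention are assumptions chosen for illustration, not the article's implementation.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2), assumed z-up


def quat_multiply(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0 * w1 - x0 * x1 - y0 * y1 - z0 * z1,
        w0 * x1 + x0 * w1 + y0 * z1 - z0 * y1,
        w0 * y1 - x0 * z1 + y0 * w1 + z0 * x1,
        w0 * z1 + x0 * y1 - y0 * x1 + z0 * w1,
    ])


def quat_to_rotmat(q):
    """Rotation matrix (body -> world) for a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])


def predict_state(q, v, p, gyro, accel, dt):
    """One strapdown prediction step (illustrative, not the article's model).

    q      -- orientation quaternion (body -> world), [w, x, y, z]
    v, p   -- velocity and position in the world frame
    gyro   -- angular rate in the body frame (rad/s)
    accel  -- specific force measured by the accelerometer, body frame (m/s^2)
    dt     -- time step (s)
    """
    # First-order integration of the angular rate into the orientation.
    dq = np.concatenate(([1.0], 0.5 * gyro * dt))
    q_new = quat_multiply(q, dq)
    q_new /= np.linalg.norm(q_new)

    # Rotate the specific force to the world frame and add gravity to obtain
    # linear acceleration, then integrate it into velocity and position.
    a_world = quat_to_rotmat(q) @ accel + GRAVITY
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt ** 2
    return q_new, v_new, p_new


# Example: a stationary sensor aligned with the world frame; the accelerometer
# reads the reaction to gravity, so the predicted velocity and position stay zero.
q0 = np.array([1.0, 0.0, 0.0, 0.0])
v0 = np.zeros(3)
p0 = np.zeros(3)
print(predict_state(q0, v0, p0, np.zeros(3), np.array([0.0, 0.0, 9.81]), dt=0.01))
```

In a complete filter, a prediction of this kind would be followed by a measurement update from the image features; the pose predicted here is also what drives the rendering of the 3D scene model used to anticipate feature appearance.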