Using the Marginalised Particle Filter for Real-Time Visual-Inertial Sensor Fusion

Gabriele Bleser, Didier Stricker
Proceedings of the 7th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR-08), September 15-18, Cambridge, United Kingdom

Abstract:
The use of a particle filter (PF) for camera pose estimation is an ongoing topic in the robotics and computer vision community, especially since the FastSLAM algorithm has been utilised for simultaneous localisation and mapping (SLAM) applications with a single camera. The major problem in this context lies in the poor proposal distribution of the camera pose particles obtained from the weak motion model of a camera moved freely in 3D space. While the FastSLAM 2.0 extension is one possibility to improve the proposal distribution, this paper addresses the question of how to use measurements from low-cost inertial sensors (gyroscopes and accelerometers) to compensate for the missing control information. However, the integration of inertial data requires the additional estimation of sensor biases, velocities and potentially accelerations, resulting in a state dimension that is not manageable by a standard PF. The contribution of this paper is therefore a real-time capable sensor fusion strategy based upon the marginalised particle filter (MPF) framework. The performance of the proposed strategy is evaluated in combination with a marker-based tracking system, and results from a comparison with previous visual-inertial fusion strategies based upon the extended Kalman filter (EKF), the standard PF and the MPF are presented.
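The core idea the abstract refers to is that the MPF (also known as the Rao-Blackwellised PF) keeps only the nonlinear part of the state in the particles and runs one Kalman filter per particle over the conditionally linear part, so the particle dimension stays small. The following is a minimal illustrative sketch of that structure on a 1D toy model, not the paper's actual filter: position plays the role of the nonlinear substate (observed through a nonlinear measurement), velocity the role of the linear substate (biases and accelerations would be handled analogously). All variable names, the model, and the noise values are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (all parameters illustrative):
#   p_{k+1} = p_k + dt * v_k + w_n      (nonlinear substate: position)
#   v_{k+1} = v_k + w_l                 (linear substate: velocity)
#   y_k     = sqrt(1 + p_k^2) + e       (measurement, nonlinear in p only)
dt, Qn, Ql, R = 0.1, 1e-4, 1e-4, 1e-2
N, T = 500, 200
true_v = 0.5

# Simulate ground truth and measurements.
p_true, ys = 0.0, []
for _ in range(T):
    p_true += dt * true_v + rng.normal(0.0, np.sqrt(Qn))
    ys.append(np.sqrt(1 + p_true**2) + rng.normal(0.0, np.sqrt(R)))

# Particles carry the nonlinear state; each has a KF (mean m, variance P)
# over the linear state, so the PF itself stays one-dimensional here.
p = rng.normal(0.0, 0.1, N)   # particle positions
m = np.zeros(N)               # per-particle KF means of v
P = np.ones(N)                # per-particle KF variances of v

for y in ys:
    # 1) PF measurement update: weight particles by p(y | p_i).
    w = np.exp(-0.5 * (y - np.sqrt(1 + p**2))**2 / R) + 1e-300
    w /= w.sum()
    # 2) Resample.
    idx = rng.choice(N, size=N, p=w)
    p, m, P = p[idx], m[idx], P[idx]
    # 3) PF time update: the proposal variance folds in the KF
    #    uncertainty of the marginalised velocity.
    p_new = rng.normal(p + dt * m, np.sqrt(dt**2 * P + Qn))
    # 4) KF measurement update: the sampled position increment acts as
    #    an observation z = dt * v + w_n of the linear state.
    z = p_new - p
    S = dt**2 * P + Qn
    K = P * dt / S
    m = m + K * (z - dt * m)
    P = P - K * dt * P
    # 5) KF time update for the linear state.
    P = P + Ql
    p = p_new

v_est = m.mean()   # velocity estimate from the marginalised KFs
```

The payoff mirrors the paper's motivation: adding velocity (and, in the real system, biases and accelerations) does not enlarge the particle dimension, because those states are handled analytically inside the per-particle Kalman filters.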
Keywords:
real-time, sensor fusion, inertial sensors, nonlinear filtering, (marginalised) particle filter, (extended) Kalman filter, FastSLAM