EvLiDAR-Flow: Attention-Guided Fusion between Point Clouds and Events for Scene Flow Estimation

Ankit Sonthalia, Ramy Battrawy, René Schuster, Didier Stricker
In: Proceedings of the International Conference on Pattern Recognition Applications and Methods (ICPRAM 2023), February 22-24, Lisbon, Portugal, SciTePress, 2023.

Abstract:
In this paper, we propose the fusion of event streams and point clouds for scene flow estimation. Bio-inspired event cameras offer significantly lower latency and a higher dynamic range than regular RGB cameras, and are therefore well suited to recording high-speed motion. However, events do not provide depth information, which makes them unsuitable on their own for scene flow (3D) estimation. On the other hand, LiDAR-based approaches are well suited to scene flow estimation due to the high precision of LiDAR measurements in outdoor scenes (e.g. autonomous vehicle applications), but they fail in the presence of unstructured regions (e.g. ground surfaces, grass, walls, etc.). We propose EvLiDAR-Flow, a neural network architecture equipped with an attention module for bi-directional feature fusion between an event (2D) branch and a point cloud (3D) branch. This kind of fusion helps to overcome the lack of depth information in events while enabling the LiDAR-based scene flow branch to benefit from the rich motion information encoded by events. We validate the proposed EvLiDAR-Flow by showing that it performs significantly better than a state-of-the-art LiDAR-only scene flow estimation method and is robust to the presence of ground points.
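The core idea, bi-directional attention between a 2D event branch and a 3D point-cloud branch, can be sketched in a simplified form. The snippet below is an illustrative NumPy mock-up, not the paper's actual module: the function names and the residual-addition fusion are assumptions, and the real architecture presumably uses learned query/key/value projections and operates inside a full scene flow network.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    # Scaled dot-product attention: each query row aggregates
    # the keys_values rows it attends to (projections omitted).
    d = queries.shape[1]
    scores = queries @ keys_values.T / np.sqrt(d)   # (Nq, Nk)
    return softmax(scores, axis=-1) @ keys_values   # (Nq, d)

def bidirectional_fusion(point_feats, event_feats):
    # 3D branch attends to event features (2D -> 3D motion cues),
    # event branch attends to point features (3D -> 2D depth cues);
    # residual addition keeps each branch's own features.
    p_fused = point_feats + cross_attention(point_feats, event_feats)
    e_fused = event_feats + cross_attention(event_feats, point_feats)
    return p_fused, e_fused

# Toy example: 5 point features and 3 event features, dimension 4.
p, e = bidirectional_fusion(np.random.rand(5, 4), np.random.rand(3, 4))
print(p.shape, e.shape)  # (5, 4) (3, 4)
```

Note that each branch keeps its own set of tokens (point count and event count differ), so the fusion enriches features without forcing a one-to-one correspondence between modalities.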