DEIO: Deep Event Inertial Odometry

1The University of Hong Kong
*Equal Contribution

Abstract

Event cameras are bio-inspired, motion-activated sensors that show great potential for handling challenging situations such as fast motion and high dynamic range. Despite this promise, existing event-based simultaneous localization and mapping (SLAM) approaches still exhibit limited performance in real-world applications. On the other hand, state-of-the-art SLAM approaches that incorporate deep neural networks show impressive robustness and applicability. However, research on fusing learning-based event SLAM methods with an IMU is still lacking, even though such fusion could be indispensable for pushing event-based SLAM to large-scale, low-texture, or complex scenarios. In this paper, we propose DEIO, the first monocular deep event-inertial odometry framework, which combines a learning-based method with traditional nonlinear graph-based optimization. Specifically, we tightly integrate a trainable event-based differentiable bundle adjustment (e-DBA) with IMU pre-integration in a patch-based co-visibility factor graph that employs keyframe-based sliding-window optimization. Experiments on ten challenging public datasets demonstrate that our method achieves superior performance compared with image-based and event-based benchmarks.
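As background for the IMU pre-integration term mentioned above, the following is a minimal sketch of standard IMU pre-integration between two keyframes, accumulating bias-corrected gyroscope and accelerometer samples into relative rotation, velocity, and position deltas. The function names and the simple Euler integration (without noise covariance propagation) are illustrative assumptions and do not reproduce DEIO's actual implementation.

import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def so3_exp(phi):
    """Exponential map from a rotation vector to a rotation matrix (Rodrigues formula)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-8:
        return np.eye(3) + skew(phi)
    a = phi / theta
    K = skew(a)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dts, bg, ba):
    """Accumulate relative rotation, velocity, and position between two keyframes.

    gyro, accel: (N, 3) raw IMU samples; dts: (N,) sample periods;
    bg, ba: current gyroscope / accelerometer bias estimates.
    Returns the pre-integrated deltas (dR, dv, dp), expressed in the frame of the
    first keyframe, independent of the global pose and of gravity compensation.
    """
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a, dt in zip(gyro, accel, dts):
        a_unb = a - ba
        dp += dv * dt + 0.5 * (dR @ a_unb) * dt * dt
        dv += (dR @ a_unb) * dt
        dR = dR @ so3_exp((w - bg) * dt)
    return dR, dv, dp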
Cover Figure

Contributions

We design a learning-optimization-combined framework that tightly integrates a trainable event-based differentiable bundle adjustment (e-DBA) with IMU pre-integration in a patch-based co-visibility factor graph, optimized over a keyframe-based sliding window (a simplified sketch of this coupling is given after this list).
The framework is also designed to be plug-and-play, with DEIO for event-IMU input and DVIO for image-IMU input.
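As a rough illustration of how factors from two sources can be optimized jointly in such a sliding window, the sketch below stacks residuals from a visual (e-DBA-like) term and an inertial term into a single damped Gauss-Newton step. The toy state, the stand-in residual functions, and the numerical Jacobians are assumptions for illustration only; a real solver would operate on keyframe poses, velocities, biases, and patch depths with analytic or learned Jacobians.

import numpy as np

def gauss_newton_step(x, residual_fns, weights, eps=1e-6):
    """One damped Gauss-Newton step on a stacked nonlinear least-squares problem.

    x:            flattened window state (e.g. keyframe poses, velocities, biases, patch depths)
    residual_fns: callables r_i(x) -> residual vector, e.g. one for the e-DBA
                  patch reprojection factors and one for the IMU pre-integration factors
    weights:      per-factor information weights (square roots of the inverse covariances)
    """
    rs, Js = [], []
    for r_fn, w in zip(residual_fns, weights):
        r = r_fn(x)
        # Numerical Jacobian for the sketch; analytic or learned Jacobians would be used in practice.
        J = np.zeros((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (r_fn(x + dx) - r) / eps
        rs.append(w * r)
        Js.append(w * J)
    r = np.concatenate(rs)
    J = np.vstack(Js)
    H = J.T @ J + 1e-6 * np.eye(x.size)   # damped normal equations
    delta = np.linalg.solve(H, -J.T @ r)
    return x + delta

# Toy usage: two factor types pulling a 2-D state toward different measurements,
# standing in for e-DBA reprojection factors and IMU pre-integration factors.
x0 = np.zeros(2)
visual_factor = lambda x: x - np.array([1.0, 0.5])     # stand-in for e-DBA residuals
inertial_factor = lambda x: x - np.array([0.9, 0.6])   # stand-in for IMU residuals
x1 = gauss_newton_step(x0, [visual_factor, inertial_factor], [1.0, 2.0])
print(x1)  # weighted compromise between the two factor types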


Evaluation in Dark Flight and Driving Scenarios

We evaluate DEIO in night driving scenarios and in indoor low-light quadrotor flights. Please note that the images are for illustration only; DEIO takes only event data and IMU data as input. The estimated trajectories are aligned with the ground-truth poses (GNSS-INS-RTK for driving and VICON for flight).
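For context on how the estimated and ground-truth trajectories are compared, the sketch below computes the commonly reported absolute trajectory error (ATE RMSE) after a Umeyama rigid alignment of time-associated positions. This is a generic evaluation sketch under that assumption, not necessarily the exact protocol or script used in the paper.

import numpy as np

def umeyama_align(est, gt, with_scale=False):
    """Least-squares rigid (or similarity) alignment of estimated positions to ground truth.

    est, gt: (N, 3) arrays of time-associated trajectory positions.
    Returns scale s, rotation R, translation t such that gt ≈ s * R @ est + t.
    """
    mu_e, mu_g = est.mean(0), gt.mean(0)
    E, G = est - mu_e, gt - mu_g
    U, D, Vt = np.linalg.svd(G.T @ E / est.shape[0])
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                      # handle reflection case
    R = U @ S @ Vt
    s = (D * S.diagonal()).sum() / E.var(0).sum() if with_scale else 1.0
    t = mu_g - s * R @ mu_e
    return s, R, t

def ate_rmse(est, gt, with_scale=False):
    """Root-mean-square absolute trajectory error after alignment."""
    s, R, t = umeyama_align(est, gt, with_scale)
    err = gt - (s * (R @ est.T).T + t)
    return np.sqrt((err ** 2).sum(axis=1).mean())

# Toy usage with synthetic trajectories; in practice est would be the odometry output
# and gt the time-associated GNSS-INS-RTK or VICON positions.
gt = np.cumsum(np.random.randn(200, 3) * 0.1, axis=0)
est = gt + np.random.randn(200, 3) * 0.02   # noisy copy standing in for an estimated trajectory
print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")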

Quantitative Comparison with SOTA Benchmark

BibTeX


      @article{GWPHKU:DEIO,
        title={DEIO: Deep Event Inertial Odometry},
        author={Guan, Weipeng and Lin, Fuling and Chen, Peiyu and Lu, Peng},
        journal={arXiv preprint arXiv:2411.03928},
        year={2024}
      }