Event-based Vision for 6-DOF Pose Tracking and 3D Mapping

The University of Hong Kong

Abstract

      Simultaneous Localization and Mapping (SLAM) serves as a foundational technology for emerging applications such as robotics, autonomous driving, embodied intelligence, and augmented/virtual reality. However, traditional image-based SLAM systems still struggle with reliable pose estimation and 3D reconstruction under challenging conditions involving high-speed motion and extreme illumination variations. Event cameras, also known as dynamic vision sensors, have recently emerged as a promising alternative to standard cameras for visual perception. Instead of capturing intensity images at a fixed frame rate, event cameras asynchronously measure per-pixel brightness changes, producing a stream of events that encode the time, pixel location, and sign of each brightness change. They offer attractive advantages, including high temporal resolution (MHz-level), high dynamic range (HDR, 140 dB), microsecond latency, freedom from motion blur, and low power consumption. However, integrating event cameras into SLAM systems is challenging because asynchronous event streams differ fundamentally from conventional intensity images, requiring a shift to new algorithmic paradigms.
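      To make this output model concrete, the following Python sketch (using NumPy; the record layout and helper name are our own illustrative choices, not from the thesis) stores each event as a (timestamp, pixel location, polarity) record and sums polarities over a time slice into a simple event frame that downstream modules can treat like an image:

      import numpy as np

      # One event per record: the time, pixel location, and sign (polarity)
      # of a log-brightness change, as described above. The dtype layout is
      # illustrative and not tied to any specific camera driver.
      event_dtype = np.dtype([("t", np.float64),   # timestamp in seconds
                              ("x", np.uint16),    # pixel column
                              ("y", np.uint16),    # pixel row
                              ("p", np.int8)])     # polarity: +1 or -1

      def accumulate_event_frame(events, height, width):
          # Sum event polarities per pixel over a time slice, yielding a
          # fixed-duration "event frame".
          frame = np.zeros((height, width), dtype=np.int32)
          np.add.at(frame, (events["y"], events["x"]), events["p"])
          return frame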
      This dissertation presents innovative solutions and advancements for event-based SLAM. It begins with Mono-EIO, a monocular event-inertial odometry framework that tightly integrates event-corner features with IMU preintegration. These event-corner features are temporally and spatially associated through novel event representations built on spatio-temporal and exponential-decay kernels, and are then incorporated into a keyframe-based sliding-window optimization. Mono-EIO achieves high-accuracy, real-time 6-DoF ego-motion estimation even under aggressive motion and HDR conditions. Building upon this foundation, the thesis introduces PL-EVIO, an event-based visual-inertial odometry framework that combines event cameras with standard cameras to enhance robustness. PL-EVIO utilizes line-based event features to provide additional structural constraints in human-made environments, while point-based event and image features are managed to complement each other. This framework has been successfully applied to onboard pose-feedback control of a quadrotor, enabling complex maneuvers such as flipping and flight in low-light conditions. Additionally, the thesis includes ESVIO, the first stereo event-based visual-inertial odometry framework.
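      As a deliberately simplified illustration of the event representation behind this feature association, the sketch below builds a time surface with an exponential decay kernel, a standard construction in event-based vision; the decay constant tau and the function name are assumptions for illustration, not parameters taken from Mono-EIO. Each pixel holds the decayed age of its most recent event, so recently active pixels stand out and corner detection and tracking can operate on the surface as on an ordinary image:

      import numpy as np

      def time_surface(events, t_ref, height, width, tau=0.03):
          # Exponential-decay time surface: each pixel stores
          # exp(-(t_ref - t_last) / tau), where t_last is the timestamp of
          # the most recent event at that pixel. Recently active pixels
          # approach 1, while stale pixels decay toward 0.
          t_last = np.full((height, width), -np.inf)
          # Events are assumed time-ordered, so later events overwrite
          # earlier timestamps at the same pixel.
          t_last[events["y"], events["x"]] = events["t"]
          return np.exp(-(t_ref - t_last) / tau)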
      The thesis also presents DEIO, a learning-optimization-combined framework that tightly fuses learning-based event data association with IMU measurements within a graph-based optimization. To the best of our knowledge, DEIO is the first learning-based event-inertial odometry, outperforming over 20 vision-based methods across 10 challenging real-world benchmarks. Finally, the thesis proposes EVI-SAM, a full SLAM system that tackles both 6-DoF pose tracking and 3D dense mapping using a monocular event camera. Its tracking module is the first hybrid approach to integrate both direct-based and feature-based methods within an event-based framework, while its mapping module is the first to achieve event-based dense and textured 3D reconstruction with a non-learning approach and without GPU acceleration. This method not only recovers 3D scene structure under aggressive motion but also demonstrates superior performance compared to image-based NeRF methods and RGB-D cameras. Through these contributions, this dissertation significantly advances SLAM, offering robust solutions and paving the way for future research and applications of event cameras.
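      The tight coupling shared by these graph-based back-ends can be summarized, under our own simplifying assumptions, as minimizing one joint cost over a sliding window of states: a sum of event data-association residuals (handcrafted features in Mono-EIO/PL-EVIO/ESVIO, learned associations in DEIO) and IMU-preintegration residuals. The Python sketch below is purely schematic; the factor objects, their residual functions, and the information matrices are hypothetical placeholders rather than the thesis implementation:

      def sliding_window_cost(states, event_factors, imu_factors):
          # Schematic tightly-coupled objective: a weighted sum of squared
          # event residuals (e.g. reprojection errors of associated event
          # features) and IMU-preintegration residuals between keyframes.
          # Residuals are NumPy vectors; each factor's information matrix
          # is the inverse covariance of its residual.
          cost = 0.0
          for f in event_factors:
              r = f.residual(states[f.i])               # data-association error
              cost += r @ f.information @ r             # Mahalanobis weighting
          for f in imu_factors:
              r = f.residual(states[f.i], states[f.j])  # preintegrated motion error
              cost += r @ f.information @ r
          return cost

      A real system would minimize this cost with Gauss-Newton or Levenberg-Marquardt, marginalizing old states as the window slides.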
Demo for Monocular Event-inertial Odometry
Uniform Event-corner Feature Detection
Mono-EIO in Challenging Situations
Demo for Pose Feedback Control using our Event-based VIO
Quadrotor Flip Using Our PL-EVIO
Quadrotor Flight Using Our ESVIO
Demo for Event-based Hybrid Pose Tracking
Our Event-based Hybrid Pose Tracking in HDR and Aggressive Motion

Demo for Learning-based Event-inertial Odometry
Evaluating DEIO in Drone Flying
Evaluating DEIO in Aggressive Motion
Demo for Event-based Dense Mapping
Real-time Event-based Dense Mapping
Event-based Dense Mapping under Fast Motion

Publication List

Keywords: Event-based Vision, SLAM, Robotics, Event Camera, 3D Dense Mapping, 6-DoF Pose Tracking, Deep Learning, Event-Inertial Odometry, Event-Visual-Inertial Odometry, Visual-Inertial Odometry, Monocular, Stereo, Drone, Autonomous Driving, Feature-based Methods, Direct-based Methods, LiDAR, Image Sensor, 3D Gaussian Splatting, Sensor Fusion, Perception, Computer Vision

Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization (2022)

Weipeng Guan, Peng Lu

Monocular, EIO, Feature-based Methods, Drone, 6-DoF Pose Tracking

PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features (2023)

Weipeng Guan, Peiyu Chen, Yuhan Xie, Peng Lu

Monocular, EIO, EVIO, Feature-based Methods, Drone, 6-DoF Pose Tracking

ESVIO: Event-based Stereo Visual Inertial Odometry (2023)

Peiyu Chen, Weipeng Guan, Peng Lu

Stereo, EIO, EVIO, Feature-based Methods, 6-DoF Pose Tracking, Drone

ECMD: An Event-Centric Multisensory Driving Dataset for SLAM (2023)

Peiyu Chen, Weipeng Guan, Feng Huang, Yihan Zhong, Weisong Wen, Li-Ta Hsu, Peng Lu

Event-based Vision, Autonomous Driving

EVI-SAM: Robust, Real-time, Tightly-coupled Event-Visual-Inertial State Estimation and 3D Dense Mapping (2024)

Weipeng Guan, Peiyu Chen, Huibin Zhao, Yu Wang, Peng Lu

Event-based Vision, Monocular, EVIO, Dense Mapping, Feature-based Methods, Direct-based Methods, 6-DoF Pose Tracking

LVI-GS: Tightly-coupled LiDAR-Visual-Inertial SLAM using 3D Gaussian Splatting (2025)

Huibin Zhao, Weipeng Guan, Peng Lu

LiDAR, 3DGS, Dense Mapping

DEIO: Deep Event Inertial Odometry (2025)

Weipeng Guan, Fuling Lin, Peiyu Chen, Peng Lu

Event-based Vision, Deep Learning, Monocular, EIO

SuperEIO: Self-Supervised Event Feature Learning for Event Inertial Odometry (2025)

Peiyu Chen, Fuling Lin, Weipeng Guan, Yi Luo, Peng Lu

Event-based Vision, Deep Learning, EIO

BibTeX


      @phdthesis{GuanPhDThesis,
        title={Event-based Vision for 6-DOF Pose Tracking and 3D Mapping},
        author={Guan, Weipeng},
        school={The University of Hong Kong},
        year={2025},
      }