Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization

The University of Hong Kong

Abstract

Event cameras are biologically inspired vision sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate. They offer many advantages over standard cameras, such as high dynamic range, high temporal resolution (low latency), and no motion blur. Developing state estimation algorithms based on event cameras therefore offers exciting opportunities for autonomous systems and robots. In this paper, we propose a monocular visual-inertial odometry system for event cameras based on event-corner feature detection and matching with well-designed feature management. More specifically, two different event representations based on the Time Surface are designed to realize event-corner feature tracking (for front-end incremental estimation) and matching (for loop closure detection). Furthermore, the proposed event representations are used to set a mask for detecting event-corner features on the raw event stream, which ensures that the event-corner features are uniformly distributed and spatially consistent. Finally, a tightly coupled, graph-based optimization framework is designed to obtain highly accurate state estimation by fusing pre-integrated IMU measurements and event-corner observations. We quantitatively validate the performance of our system on event cameras of different resolutions: DAVIS240C (240×180, public dataset, achieving state-of-the-art accuracy), DAVIS346 (346×240, real-world tests), and DVXplorer (640×480, real-world tests). Furthermore, we qualitatively demonstrate the accuracy, robustness, loop closure, and re-localization performance of our framework on different large-scale datasets, as well as an autonomous quadrotor flight using our Event Visual-Inertial Odometry (EVIO) framework. Videos of all the evaluations are presented on the project website.

Video Demo


IROS2022 Presentation


Event-Corner Feature Detection

We do not rely on image-based corner detection; instead, we design an asynchronous, uniformly distributed event-corner detector that operates on events-only data. To enforce the uniform distribution, a minimum distance (10-20 pixels, depending on the event camera resolution) is enforced between two neighboring event-corner features. Meanwhile, we retain only the event corners at pixels where the Time Surface (TS) with polarity differs from the neutral value 128.0, so that the detected features lie on strong edges rather than accumulating as noisy features in low-texture areas.
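The masking and minimum-distance rules above can be sketched as follows (a minimal illustration with hypothetical function and variable names, not the paper's actual implementation):

```python
import numpy as np

def detect_event_corners(ts_polarity, candidate_corners, min_dist=12):
    """Keep only event corners on strong edges, uniformly distributed.

    ts_polarity: HxW Time Surface with polarity, neutral background 128.0
    candidate_corners: list of (x, y) corner candidates from the event stream
    min_dist: minimum pixel distance between two accepted corners
              (10-20 px depending on sensor resolution, per the paper)
    """
    h, w = ts_polarity.shape
    # Mask out low-texture regions: pixels still at the neutral value 128.0
    # carry no recent edge information, so corners there are likely noise.
    edge_mask = ts_polarity != 128.0

    occupied = np.zeros((h, w), dtype=bool)  # suppression map for min distance
    accepted = []
    for x, y in candidate_corners:
        if not edge_mask[y, x] or occupied[y, x]:
            continue
        accepted.append((x, y))
        # Block a (2*min_dist+1)^2 neighborhood around the accepted corner
        # so that the next accepted corner keeps the required spacing.
        y0, y1 = max(0, y - min_dist), min(h, y + min_dist + 1)
        x0, x1 = max(0, x - min_dist), min(w, x + min_dist + 1)
        occupied[y0:y1, x0:x1] = True
    return accepted
```

This mirrors the masking used by detectors such as OpenCV's `goodFeaturesToTrack` (its `mask` and `minDistance` parameters), but applied to the event-based Time Surface instead of an intensity image.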

Raw Event Stream

TS with Polarity



Event-Corner Feature Tracking and Matching

tracking-matching.

(a) The raw event stream (left), the Time Surface with polarity (middle), and the normalized Time Surface without polarity (right); (b) Event-corner tracking on Time Surface with polarity; (c) Loop detection using the event-corner features and the normalized Time Surface without polarity.
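As an illustration of the representation in panel (a), a Time Surface with polarity can be rendered by exponentially decaying the time elapsed since the most recent event at each pixel around a neutral value of 128 (a hedged sketch with assumed names and decay constant; the paper's exact parameters may differ):

```python
import numpy as np

def time_surface_with_polarity(events, shape, t_ref, tau=0.03):
    """Render a Time Surface with polarity at reference time t_ref.

    events: iterable of (x, y, timestamp, polarity), polarity in {+1, -1}
    shape: (height, width) of the sensor
    tau: decay constant in seconds (assumed value for illustration)
    """
    last_t = np.full(shape, -np.inf)  # timestamp of most recent event
    last_p = np.zeros(shape)          # polarity of most recent event
    for x, y, t, p in events:
        if t <= t_ref and t > last_t[y, x]:
            last_t[y, x], last_p[y, x] = t, p

    # Fresh events decay toward 1, stale pixels toward 0 (exp(-inf) = 0),
    # so positive events brighten and negative events darken around 128.
    decay = np.exp(-(t_ref - last_t) / tau)
    return 128.0 + 127.0 * last_p * decay
```

The normalized Time Surface without polarity used for loop detection could be obtained analogously by dropping the polarity term and rescaling the decayed values to the full [0, 255] range.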


Framework Overview

framework image.

The framework overview of our Mono-EIO.

Evaluation on DAVIS240C


Evaluation on DAVIS346


Evaluation on DVXplorer


Outdoor Evaluation


BibTeX


      @inproceedings{GWPHKU:Mono-EIO,
        title={Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization},
        author={Guan, Weipeng and Lu, Peng},
        booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
        pages={2438-2445},
        year={2022},
        organization={IEEE}
      }