[PDF] Low-latency event-based visual odometry | Semantic Scholar (2024)

Figures from this paper

    Figures 1–7 and 9.
Topics

Event-based Visual Odometry, Dynamic Vision Sensor, Visual Odometry, Low Latency, Luminance Change, Asynchronous Events, Latency, Luminance, Translation, Event-based

102 Citations

Low-latency visual odometry using event-based feature tracks
    Beat Kueng, Elias Mueggler, Guillermo Gallego, D. Scaramuzza

    Computer Science, Engineering

    2016 IEEE/RSJ International Conference on…

  • 2016

This paper presents a low-latency visual odometry algorithm for the DAVIS sensor using event-based feature tracks that tightly interleaves robust pose optimization and probabilistic mapping and shows that the method successfully tracks the 6-DOF motion of the sensor in natural scenes.

  • 156
  • PDF
Event-Based Visual Inertial Odometry Using Smart Features
    Zachary P. Friedel

    Computer Science, Engineering

  • 2020

The implementation of a CNN trained to detect and describe features within an image, as well as the implementation of an event-based visual-inertial odometry (EVIO) pipeline, which estimates a vehicle's 6-degrees-of-freedom (DOF) pose using an affixed event-based camera with an integrated inertial measurement unit (IMU).

  • 1
Continuous-Time Visual-Inertial Odometry for Event Cameras
    Elias Mueggler, Guillermo Gallego, Henri Rebecq, D. Scaramuzza

    Engineering, Computer Science

    IEEE Transactions on Robotics

  • 2018

This paper is the first work on visual-inertial fusion with event cameras using a continuous-time framework and shows that the method provides improved accuracy over a state-of-the-art visual odometry method for event cameras.

Event-Based Visual-Inertial Odometry on a Fixed-Wing Unmanned Aerial Vehicle
    K. J. Nelson

    Engineering, Environmental Science

  • 2019

This document presents the research and implementation of an event-based visual-inertial odometry (EVIO) pipeline, which estimates a vehicle's 6-degrees-of-freedom (DOF) motion and pose utilizing an affixed event-based camera with an integrated Micro-Electro-Mechanical Systems (MEMS) inertial measurement unit (IMU).

  • 3
Event-Based Stereo Visual Odometry
    Yi Zhou, Guillermo Gallego, S. Shen

    Engineering, Computer Science

    IEEE Transactions on Robotics

  • 2021

The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low-light and high dynamic range, while running in real-time on a standard CPU.

The Event-Camera Dataset: Event-based Data for Pose Estimation, Visual Odometry, and SLAM
    Elias Mueggler, Henri Rebecq, Guillermo Gallego, T. Delbruck, D. Scaramuzza

    Computer Science, Engineering

  • 2016

This paper presents and releases a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which it hopes will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications.

  • 3
  • PDF
Lifetime estimation of events from Dynamic Vision Sensors
    Elias Mueggler, Christian Forster, Nathan Baumli, Guillermo Gallego, D. Scaramuzza

    Computer Science, Engineering

    2015 IEEE International Conference on Robotics…

  • 2015

An algorithm is developed that augments each event with its lifetime, computed from the event's velocity on the image plane. This gives a continuous representation of events in time, enabling the design of new algorithms that outperform those based on the accumulation of events over fixed, artificially chosen time intervals.

  • 105
  • PDF
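The lifetime idea summarized above — each event stays valid for roughly the time it takes to travel one pixel at its current image-plane velocity — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Event` type, function names, and the fact that the velocity is supplied as an input (the paper estimates it from local plane fitting) are all assumptions of this sketch.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int      # pixel column
    y: int      # pixel row
    t: float    # timestamp in seconds

def lifetime(velocity_px_per_s: float) -> float:
    """Time for an event to move one pixel on the image plane."""
    speed = abs(velocity_px_per_s)
    if speed == 0.0:
        return float("inf")  # a static edge: the event never expires
    return 1.0 / speed

def is_alive(ev: Event, now: float, velocity_px_per_s: float) -> bool:
    """An event contributes to the current view as long as it is alive."""
    return now - ev.t <= lifetime(velocity_px_per_s)

# Example: an event moving at 200 px/s lives for 5 ms.
ev = Event(x=10, y=20, t=0.000)
print(lifetime(200.0))             # 0.005
print(is_alive(ev, 0.004, 200.0))  # True
print(is_alive(ev, 0.006, 200.0))  # False
```

Rendering only the currently alive events yields the sharp, continuous-time edge images the paper uses in place of fixed accumulation windows.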
Continuous-Time Trajectory Estimation for Event-based Vision Sensors
    Elias Mueggler, Guillermo Gallego, D. Scaramuzza

    Computer Science, Engineering

    Robotics: Science and Systems

  • 2015

This paper addresses ego-motion estimation for an event-based vision sensor using a continuous-time framework to directly integrate the information conveyed by the sensor.

  • 49
  • PDF
Event-based, 6-DOF pose tracking for high-speed maneuvers
    Elias Mueggler, B. Huber, D. Scaramuzza

    Computer Science, Engineering

    2014 IEEE/RSJ International Conference on…

  • 2014

This paper presents the first onboard perception system for 6-DOF localization during high-speed maneuvers using a Dynamic Vision Sensor (DVS), and provides a versatile method to capture ground-truth data using a DVS.

  • 209
  • PDF
PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features
    W. Guan, Pei-Ying Chen, Yuhan Xie, P. Lu

    Computer Science, Engineering

    IEEE Transactions on Automation Science and…

  • 2023

This paper proposes a robust, highly accurate, and real-time optimization-based monocular event-based visual-inertial odometry (VIO) method with event-corner features, line-based event features, and point-based image features that achieves superior performance compared with state-of-the-art image-based or event-based VIO.


24 References

Low-latency localization by active LED markers tracking using a dynamic vision sensor
    A. Censi, Jonas Strubel, Christian Brandli, T. Delbrück, D. Scaramuzza

    Computer Science, Engineering

    2013 IEEE/RSJ International Conference on…

  • 2013

A method for low-latency pose tracking using a DVS and active LED markers, which are LEDs blinking at high frequency (>1 kHz), which is compared to traditional pose tracking based on a CMOS camera.

  • 89
  • PDF
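Detecting such a marker in the event stream amounts to recognizing its blink period in a pixel's inter-event intervals. A toy classifier along those lines might look like this (the function name, target frequency, tolerance, and vote threshold are all invented for illustration, not taken from the paper):

```python
def looks_like_marker(timestamps, target_hz=2000.0, tol=0.2):
    """Return True if a pixel's event timestamps resemble an LED
    blinking near `target_hz`.

    Each ON and OFF transition of the LED triggers an event, so the
    expected inter-event interval is half the blink period.
    """
    if len(timestamps) < 3:
        return False
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    expected = 1.0 / (2.0 * target_hz)  # two transitions per cycle
    hits = sum(1 for dt in intervals if abs(dt - expected) <= tol * expected)
    return hits >= 0.8 * len(intervals)

# A 2 kHz LED produces events every 0.25 ms at the pixel it illuminates.
ts = [i * 0.00025 for i in range(10)]
print(looks_like_marker(ts))  # True
```

Because the decision uses only timestamps at single pixels, it runs per event with essentially no buffering, which is where the low latency comes from.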
Asynchronous Event-Based Binocular Stereo Matching
    P. Rogister, R. Benosman, S. Ieng, P. Lichtsteiner, T. Delbrück

    Computer Science, Engineering

    IEEE Transactions on Neural Networks and Learning…

  • 2012

It is shown that matching visual events on their timing, combined with geometric constraints based on the distance to the epipolar lines, provides a new solution to the real-time computation of 3-D objects.

  • 144
  • PDF
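A toy version of that matching criterion — pair left- and right-camera events whose timestamps nearly coincide and whose locations respect the epipolar constraint — might look like the sketch below. It assumes rectified cameras, so each epipolar line is a horizontal row; the thresholds, cost weighting, and names are invented for illustration and are not the paper's actual formulation.

```python
def match_events(left, right, dt_max=1e-3, epi_max=1.5):
    """Greedy event matching on timing plus epipolar distance.

    `left` / `right` are lists of (x, y, t) tuples from each camera.
    For rectified cameras the epipolar line of a left event is the row
    y = y_left, so the geometric residual is simply |y_r - y_l|.
    """
    matches = []
    used = set()
    for xl, yl, tl in left:
        best, best_cost = None, float("inf")
        for j, (xr, yr, tr) in enumerate(right):
            if j in used:
                continue
            if abs(tr - tl) > dt_max:  # events must be near-simultaneous
                continue
            epi = abs(yr - yl)         # distance to the epipolar line
            if epi > epi_max:
                continue
            cost = abs(tr - tl) + 1e-3 * epi  # timing dominates the cost
            if cost < best_cost:
                best, best_cost = j, cost
        if best is not None:
            used.add(best)
            matches.append(((xl, yl, tl), right[best]))
    return matches

left = [(10, 5, 0.0100), (40, 7, 0.0200)]
right = [(8, 5, 0.0101), (37, 7, 0.0199), (90, 30, 0.0100)]
print(match_events(left, right))
```

In the example, the third right-camera event is simultaneous with the first left event but lies far from its epipolar row, so only the two geometrically consistent pairs survive.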
Simultaneous Localization and Mapping for Event-Based Vision Systems
    David Weikersdorfer, Raoul Hoffmann, J. Conradt

    Computer Science, Engineering

    ICVS

  • 2013

This work proposes a novel method for vision-based simultaneous localization and mapping (vSLAM) using a biologically inspired vision sensor that mimics the human retina. The method operates on individual pixel events and generates high-quality 2D environmental maps with precise robot localization.

  • 109
  • PDF
Event-Based Visual Flow
    R. Benosman, Charles Clercq, Xavier Lagorce, S. Ieng, C. Bartolozzi

    Computer Science, Physics

    IEEE Transactions on Neural Networks and Learning…

  • 2014

This paper introduces a framework to estimate visual flow from the local properties of events' spatiotemporal space and shows that precise visual flow orientation and amplitude can be estimated using a local differential approach on the surface defined by coactive events.

  • 310
  • PDF
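The local differential approach described above fits a plane to the spatiotemporal surface traced by recent events around a pixel; the plane's slopes in x and y give the inverse flow components. A rough NumPy sketch of that idea follows (the neighborhood is passed in ready-made, and the function name and degeneracy guard are assumptions of this sketch):

```python
import numpy as np

def flow_from_events(events):
    """Least-squares fit of t = a*x + b*y + c to a patch of (x, y, t) events.

    The slopes (a, b) approximate dt/dx and dt/dy on the surface of
    active events; since an edge takes `a` seconds to cross one pixel
    along x, the flow component is 1/a (and 1/b along y).
    """
    xs = np.array([e[0] for e in events], dtype=float)
    ys = np.array([e[1] for e in events], dtype=float)
    ts = np.array([e[2] for e in events], dtype=float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, c), *_ = np.linalg.lstsq(A, ts, rcond=None)
    # A near-zero slope means no motion along that axis.
    vx = 1.0 / a if abs(a) > 1e-12 else 0.0
    vy = 1.0 / b if abs(b) > 1e-12 else 0.0
    return vx, vy

# A vertical edge sweeping right at 100 px/s: each pixel column fires
# 10 ms after its left neighbor, independent of y.
events = [(x, y, 0.01 * x) for x in range(5) for y in range(3)]
print(flow_from_events(events))
```

The appeal of the method is that the fit uses only event timestamps in a small neighborhood, so flow is updated asynchronously as events arrive rather than from frame differences.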
Asynchronous frameless event-based optical flow
    R. Benosman, S. Ieng, Charles Clercq, C. Bartolozzi, M. Srinivasan

    Computer Science, Engineering

    Neural Networks

  • 2012
  • 188
Fast sensory motor control based on event-based hybrid neuromorphic-procedural system
    T. Delbrück, P. Lichtsteiner

    Computer Science, Engineering

    2007 IEEE International Symposium on Circuits and…

  • 2007

A hybrid neuromorphic-procedural system consisting of an address-event silicon retina, a computer, and a servo motor can be used to implement a fast sensory-motor reactive controller to track and block balls shot at a goal.

  • 126
  • PDF
A pencil balancing robot using a pair of AER dynamic vision sensors
    J. Conradt, Matthew Cook, R. Berner, P. Lichtsteiner, R. Douglas, T. Delbrück

    Computer Science, Engineering

    2009 IEEE International Symposium on Circuits and…

  • 2009

This demonstration shows how a pair of spike-based silicon retina dynamic vision sensors (DVS) is used to provide fast visual feedback for controlling an actuated table to balance an ordinary pencil.

  • 130
  • PDF
Event-based particle filtering for robot self-localization
    David Weikersdorfer, J. Conradt

    Computer Science, Engineering

    2012 IEEE International Conference on Robotics…

  • 2012

A novel algorithm for robot self-localization using an embedded event-based sensor which only represents pixel-level illumination changes in a scene, in contrast to classical image sensors, which wastefully transmit redundant information at a much lower frame rate.

  • 67
  • PDF
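The per-event update in such a particle filter is tiny compared with a frame-based one: each incoming event reweights the particles against the stored map, and resampling happens only when the weights degenerate. A stripped-down 1-D illustration (the map, Gaussian noise model, resampling threshold, and all names are assumptions of this sketch, not the paper's algorithm):

```python
import math
import random

def step(particles, weights, event_obs, world_map, sigma=1.0):
    """One per-event particle-filter update (1-D toy version).

    particles : candidate robot positions.
    weights   : current normalized particle weights.
    event_obs : measurement implied by a single incoming event.
    world_map : maps a position to its expected measurement
                (stands in for the stored environment map).
    """
    # Reweight: particles whose predicted measurement agrees with the
    # event gain weight under a Gaussian likelihood.
    new_w = [w * math.exp(-((event_obs - world_map(p)) ** 2) / (2 * sigma ** 2))
             for p, w in zip(particles, weights)]
    total = sum(new_w) or 1.0
    new_w = [w / total for w in new_w]
    # Resample only when the effective sample size collapses, so most
    # events cost just one multiply-and-normalize pass.
    ess = 1.0 / sum(w * w for w in new_w)
    if ess < len(particles) / 2:
        particles = random.choices(particles, weights=new_w, k=len(particles))
        new_w = [1.0 / len(particles)] * len(particles)
    return particles, new_w

# Three position hypotheses; an event consistent with position 5.0
# shifts weight toward the middle particle.
ps, ws = step([4.9, 5.0, 5.1], [1/3, 1/3, 1/3], 5.0, world_map=lambda p: p)
print(ws)
```

Because each event carries only one bit of scene change, the likelihood evaluation is a single residual per particle, which is what makes per-event filtering tractable.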
A 128×128 120 dB 15 µs Latency Asynchronous Temporal Contrast Vision Sensor
    P. Lichtsteiner, C. Posch, T. Delbrück

    Computer Science, Engineering

    IEEE Journal of Solid-State Circuits

  • 2008

This silicon retina provides an attractive combination of characteristics for low-latency dynamic vision under uncontrolled illumination with low post-processing requirements by providing high pixel bandwidth, wide dynamic range, and precisely timed sparse digital output.

  • 1,124
  • PDF
A 240×180 120 dB 10 mW 12 µs latency sparse-output vision sensor for mobile applications
    R. Berner, Christian Brandli, Minhao Yang, Shih-Chii Liu, T. Delbruck

    Engineering, Computer Science

  • 2013

A CMOS vision sensor is proposed that combines event-driven asynchronous readout of temporal contrast with synchronous frame-based active pixel sensor (APS) readout of intensity, allowing low latency at a low data rate and low system-level power consumption.

  • 18
