Figures from this paper: Figures 1–7 and 9.
Topics
Event-based Visual Odometry, Dynamic Vision Sensor, Visual Odometry, Low Latency, Luminance Change, Asynchronous Events, Latency, Luminance, Translation, Event-based
102 Citations
- Beat Kueng, Elias Mueggler, Guillermo Gallego, D. Scaramuzza
- 2016
Computer Science, Engineering
2016 IEEE/RSJ International Conference on…
This paper presents a low-latency visual odometry algorithm for the DAVIS sensor using event-based feature tracks that tightly interleaves robust pose optimization and probabilistic mapping and shows that the method successfully tracks the 6-DOF motion of the sensor in natural scenes.
- 156
- Zachary P. Friedel
- 2020
Computer Science, Engineering
The implementation of a CNN trained to detect and describe features within an image, as well as the implementation of an event-based visual-inertial odometry (EVIO) pipeline, which estimates a vehicle’s 6-degrees-of-freedom (DOF) pose using an affixed event-based camera with an integrated inertial measurement unit (IMU).
- 1
- Elias Mueggler, Guillermo Gallego, Henri Rebecq, D. Scaramuzza
- 2018
Engineering, Computer Science
IEEE Transactions on Robotics
This paper is the first work on visual-inertial fusion with event cameras using a continuous-time framework and shows that the method provides improved accuracy over the result of a state-of-the-art visual odometry method for event cameras.
- 141
- K. J. Nelson
- 2019
Engineering, Environmental Science
This document presents the research and implementation of an event-based visual-inertial odometry (EVIO) pipeline, which estimates a vehicle’s 6-degrees-of-freedom (DOF) motion and pose using an affixed event-based camera with an integrated Micro-Electro-Mechanical Systems (MEMS) inertial measurement unit (IMU).
- 3
- Yi Zhou, Guillermo Gallego, S. Shen
- 2021
Engineering, Computer Science
IEEE Transactions on Robotics
The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low-light and high dynamic range, while running in real-time on a standard CPU.
- 123
- Elias Mueggler, Henri Rebecq, Guillermo Gallego, T. Delbruck, D. Scaramuzza
- 2016
Computer Science, Engineering
This paper presents and releases a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which it hopes will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer vision applications.
- 3
- Elias Mueggler, Christian Forster, Nathan Baumli, Guillermo Gallego, D. Scaramuzza
- 2015
Computer Science, Engineering
2015 IEEE International Conference on Robotics…
An algorithm is developed that augments each event with its lifetime, computed from the event's velocity on the image plane. This yields a continuous representation of events in time, enabling the design of new algorithms that outperform those based on accumulating events over fixed, artificially chosen time intervals.
- 105
- Elias Mueggler, Guillermo Gallego, D. Scaramuzza
- 2015
Computer Science, Engineering
Robotics: Science and Systems
This paper addresses ego-motion estimation for an event-based vision sensor using a continuous-time framework to directly integrate the information conveyed by the sensor.
- 49
- Elias Mueggler, B. Huber, D. Scaramuzza
- 2014
Computer Science, Engineering
2014 IEEE/RSJ International Conference on…
This paper presents the first onboard perception system for 6-DOF localization during high-speed maneuvers using a Dynamic Vision Sensor (DVS), and provides a versatile method to capture ground-truth data using a DVS.
- 209
- W. Guan, Pei-Ying Chen, Yuhan Xie, P. Lu
- 2023
Computer Science, Engineering
IEEE Transactions on Automation Science and…
This paper proposes a robust, highly accurate, real-time optimization-based monocular event-based visual-inertial odometry (VIO) method with event-corner features, line-based event features, and point-based image features, which achieves superior performance compared with state-of-the-art image-based or event-based VIO.
- 15
...
...
24 References
- A. Censi, Jonas Strubel, Christian Brandli, T. Delbrück, D. Scaramuzza
- 2013
Computer Science, Engineering
2013 IEEE/RSJ International Conference on…
A method for low-latency pose tracking using a DVS and Active LED Markers, which are LEDs blinking at high frequency (>1 kHz), compared to traditional pose tracking based on a CMOS camera.
- 89
- P. Rogister, R. Benosman, S. Ieng, P. Lichtsteiner, T. Delbrück
- 2012
Computer Science, Engineering
IEEE Transactions on Neural Networks and Learning…
It is shown that matching on the timing of the visual events provides a new solution to the real-time computation of 3-D objects when combined with geometric constraints using the distance to the epipolar lines.
- 144
- David Weikersdorfer, Raoul Hoffmann, J. Conradt
- 2013
Computer Science, Engineering
ICVS
This work proposes a novel method for vision-based simultaneous localization and mapping (vSLAM) using a biologically inspired vision sensor that mimics the human retina; the method operates on individual pixel events and generates high-quality 2D environmental maps with precise robot localization.
- 109
- R. Benosman, Charles Clercq, Xavier Lagorce, S. Ieng, C. Bartolozzi
- 2014
Computer Science, Physics
IEEE Transactions on Neural Networks and Learning…
This paper introduces a framework to estimate visual flow from the local properties of events' spatiotemporal space and shows that precise visual flow orientation and amplitude can be estimated using a local differential approach on the surface defined by coactive events.
- 310
- R. Benosman, S. Ieng, Charles Clercq, C. Bartolozzi, M. Srinivasan
- 2012
Computer Science, Engineering
Neural Networks
- 188
- T. Delbrück, P. Lichtsteiner
- 2007
Computer Science, Engineering
2007 IEEE International Symposium on Circuits and…
A hybrid neuromorphic-procedural system consisting of an address-event silicon retina, a computer, and a servo motor can be used to implement a fast sensory-motor reactive controller to track and block balls shot at a goal.
- 126
- J. Conradt, Matthew Cook, R. Berner, P. Lichtsteiner, R. Douglas, T. Delbrück
- 2009
Computer Science, Engineering
2009 IEEE International Symposium on Circuits and…
This demonstration shows how a pair of spike-based silicon retina dynamic vision sensors (DVS) is used to provide fast visual feedback for controlling an actuated table to balance an ordinary pencil.
- 130
- David Weikersdorfer, J. Conradt
- 2012
Computer Science, Engineering
2012 IEEE International Conference on Robotics…
A novel algorithm for robot self-localization using an embedded event-based sensor that reports only pixel-level illumination changes in a scene, in contrast to classical image sensors, which wastefully transmit redundant information at a much lower frame rate.
- 67
- P. Lichtsteiner, C. Posch, T. Delbrück
- 2008
Computer Science, Engineering
IEEE Journal of Solid-State Circuits
This silicon retina provides an attractive combination of characteristics for low-latency dynamic vision under uncontrolled illumination with low post-processing requirements by providing high pixel bandwidth, wide dynamic range, and precisely timed sparse digital output.
- 1,124
- R. Berner, Christian Brandli, Minhao Yang, Shih-Chii Liu, T. Delbruck
- 2013
Engineering, Computer Science
A CMOS vision sensor is proposed that combines event-driven asynchronous readout of temporal contrast with synchronous frame-based active pixel sensor (APS) readout of intensity, allowing low latency at a low data rate and low system-level power consumption.
- 18
...
...