Abstract
Event cameras represent a paradigm shift in camera technology. Instead of full frames, the sensor captures a sparse stream of events caused by intensity changes. Since only the changes are transmitted, these cameras can capture fast movements of objects in the scene or of the camera itself. In this work we propose a novel method for camera tracking with event cameras in a panoramic setting with three degrees of freedom. We formulate direct camera tracking, similar to state-of-the-art methods in visual odometry. We show that the minimal information needed for simultaneous tracking and mapping is the spatial position of events, without using the appearance of the imaged scene point. We verify robustness to fast camera movements and dynamic objects in the scene on a recently proposed dataset [18] and on self-recorded sequences.
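The abstract's key claim, that tracking needs only the spatial position of events, can be illustrated with a minimal sketch. The `Event` tuple and `accumulate_positions` helper below are illustrative names, not from the paper: each event carries a pixel position, timestamp, and polarity, but the accumulation step discards everything except where the event fired.

```python
from typing import NamedTuple, List

class Event(NamedTuple):
    """A single event camera event: pixel position, timestamp, polarity.
    (Illustrative type, not the paper's data structure.)"""
    x: int
    y: int
    t: float       # timestamp in seconds
    polarity: int  # +1 brightness increase, -1 decrease

def accumulate_positions(events: List[Event], width: int, height: int) -> List[List[int]]:
    """Count events per pixel. Timestamp and polarity are discarded:
    only the spatial position of each event is used."""
    counts = [[0] * width for _ in range(height)]
    for e in events:
        counts[e.y][e.x] += 1
    return counts

# Three events, two of which fire at the same pixel (1, 0):
events = [Event(1, 0, 0.001, +1), Event(1, 0, 0.002, -1), Event(2, 1, 0.003, +1)]
grid = accumulate_positions(events, width=4, height=2)
# grid[0][1] == 2 and grid[1][2] == 1
```

Such a position-only map is the kind of minimal representation the abstract refers to; the paper's actual tracking formulation is a direct alignment against a panoramic map, which this sketch does not attempt to reproduce.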
| Original language | English |
| --- | --- |
| Title of host publication | 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Proceedings |
| Publisher | Institute of Electrical and Electronics Engineers |
| ISBN (Electronic) | 9781509057450 |
| DOIs | |
| Publication status | Published - 16 Jun 2017 |
| Event | 2017 IEEE International Conference on Computational Photography, ICCP 2017 - Stanford, United States, 12 May 2017 → 14 May 2017 |
Conference
| Conference | 2017 IEEE International Conference on Computational Photography, ICCP 2017 |
| --- | --- |
| Abbreviated title | ICCP 2017 |
| Country/Territory | United States |
| City | Stanford |
| Period | 12/05/17 → 14/05/17 |
ASJC Scopus subject areas
- Instrumentation
- Atomic and Molecular Physics, and Optics
- Computational Theory and Mathematics
- Computer Vision and Pattern Recognition