A4 Peer-reviewed article in conference proceedings

Asynchronous Corner Tracking Algorithm Based on Lifetime of Events for DAVIS Cameras




Authors: Mohamed S.A.S., Yasin J.N., Haghbayan M.H., Miele A., Heikkonen J., Tenhunen H., Plosila J.

Editors: George Bebis, Zhaozheng Yin, Edward Kim, Jan Bender, Kartic Subr, Bum Chul Kwon, Jian Zhao, Denis Kalkofen, George Baciu

Conference name: International Symposium on Visual Computing

Publisher: Springer Science and Business Media Deutschland GmbH

Publication year: 2020

Journal: Lecture Notes in Computer Science

Book title: Advances in Visual Computing

Journal name in the database: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Volume: 12509

First page: 530

Last page: 541

ISBN: 978-3-030-64555-7

eISBN: 978-3-030-64556-4

DOI: https://doi.org/10.1007/978-3-030-64556-4_41

Self-archived version address: https://arxiv.org/pdf/2010.15510.pdf


Abstract

Event cameras, such as the Dynamic and Active-Pixel Vision Sensor (DAVIS), capture intensity changes in the scene and generate a stream of events in an asynchronous fashion. The output rate of such cameras can reach up to 10 million events per second in highly dynamic environments. DAVIS cameras use novel vision sensors that mimic the human eye. Their attractive attributes, such as a high output rate, High Dynamic Range (HDR), and high pixel bandwidth, make them an ideal solution for applications that require high-frequency tracking. Moreover, applications that operate in challenging lighting scenarios can benefit from the high HDR of event cameras, i.e., 140 dB compared to 60 dB for traditional cameras. In this paper, a novel asynchronous corner tracking method is proposed that uses both events and intensity images captured by a DAVIS camera. The Harris algorithm is used to extract features, i.e., frame-corners, from keyframes, i.e., intensity images. Afterward, a matching algorithm is used to extract event-corners from the stream of events. Events alone are used to perform asynchronous tracking until the next keyframe is captured. Neighboring events within a window of 5 × 5 pixels around each event-corner are used to calculate the velocity and direction of the extracted event-corners by fitting a 2D plane using a randomized Hough transform algorithm. Experimental evaluation showed that our approach is able to update the location of the extracted corners up to 100 times during the blind time of traditional cameras, i.e., between two consecutive intensity images.
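The plane-fitting step described in the abstract can be illustrated with a minimal sketch: events (x, y, t) in a small neighborhood of a moving corner lie approximately on a plane t = a·x + b·y + c, and the plane's spatial gradient (a, b) is the inverse of the corner's image-plane velocity. The function names below and the use of an ordinary least-squares fit (in place of the randomized Hough transform the paper actually uses) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_event_plane(events):
    """Fit a plane t = a*x + b*y + c to events given as rows (x, y, t).
    NOTE: least squares is a simplified stand-in for the randomized
    Hough transform used in the paper."""
    events = np.asarray(events, dtype=float)
    A = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
    (a, b, c), *_ = np.linalg.lstsq(A, events[:, 2], rcond=None)
    return a, b, c

def corner_velocity(a, b):
    """The plane gradient (a, b) has units of time per pixel, i.e. the
    inverse of speed; invert it to recover (vx, vy) in pixels per time unit."""
    g2 = a * a + b * b
    return a / g2, b / g2

# Synthetic 5x5 window: events generated from a known plane
# t = 0.5*x + 0.25*y + 3.0, i.e. a corner moving at (1.6, 0.8) px/unit.
events = [(x, y, 0.5 * x + 0.25 * y + 3.0) for x in range(5) for y in range(5)]
a, b, c = fit_event_plane(events)
vx, vy = corner_velocity(a, b)
```

On this noise-free synthetic window the fit recovers the generating plane exactly, so (vx, vy) = (1.6, 0.8); with real DAVIS events, outlier rejection (which the Hough-based approach provides) would be needed.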


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.





Last updated on 2024-11-26 at 21:49