[robotics-worldwide] [software] Release of EKLT: Event-based KLT feature tracker for Event Cameras

Davide Scaramuzza
Dear Colleagues,

We are excited to announce the code release of EKLT, our event-based
adaptation of the popular KLT feature tracker to event cameras. The
method is described in detail in our IJCV 2019 paper "EKLT: Asynchronous,
Photometric Feature Tracking using Events and Frames". The code,
implemented in C++, is available at:
https://github.com/uzh-rpg/rpg_eklt
The evaluation code is available at:
https://github.com/uzh-rpg/rpg_feature_tracking_analysis

EKLT leverages the complementarity of event cameras and standard cameras
to track visual features with low latency during the blind time between
two frames. Event cameras are novel sensors that output pixel-level
brightness changes, called "events". They offer significant advantages
over standard cameras, namely a very high dynamic range, no motion blur,
and a latency on the order of microseconds.
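
For readers new to these sensors, here is a minimal sketch of the
standard per-pixel event-generation model: an event fires whenever the
log brightness at a pixel deviates from its value at that pixel's last
event by a contrast threshold C. The struct, function name, and
threshold value below are illustrative assumptions, not part of the
EKLT release.

#include <cmath>
#include <cstdio>

// One event: pixel location, timestamp, polarity (+1 brighter, -1 darker).
struct Event {
    int x, y;
    double t;
    int polarity;
};

// Illustrative event generation for a single pixel: events fire while the
// current log intensity deviates from the reference level (the level at the
// pixel's last event) by at least the contrast threshold C. C = 0.15 is an
// assumed, typical value, not taken from the EKLT code.
void maybeFireEvents(int x, int y, double t, double logIntensity,
                     double& referenceLevel, double C = 0.15) {
    while (std::abs(logIntensity - referenceLevel) >= C) {
        const int polarity = (logIntensity > referenceLevel) ? +1 : -1;
        referenceLevel += polarity * C;  // reset reference after each event
        const Event e{x, y, t, polarity};
        std::printf("event at (%d,%d), t=%.6f, polarity=%+d\n",
                    e.x, e.y, e.t, e.polarity);
    }
}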

By leveraging a generative event model within a maximum-likelihood
framework, EKLT is the first method that combines raw intensity
measurements and events directly and in a principled manner. As a
result, our method produces feature tracks that are more accurate than
the state of the art across a wide variety of scenes.
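
To make the maximum-likelihood idea concrete, below is a compressed C++
sketch of the core objective as we read it from the paper: the
brightness increment accumulated from events inside a patch is compared
against the increment predicted from the frame's log-intensity gradient
under a candidate optic flow v, with both sides normalized so the
unknown contrast threshold cancels. Function names, the fixed patch
size, and the plain L2 residual are assumptions for illustration; the
released code additionally optimizes over a warp of the patch.

#include <cmath>
#include <vector>

constexpr int P = 25;  // patch side length in pixels (assumed value)

// Residual between the event-based and gradient-predicted brightness
// increments over a P x P patch, given a candidate flow (vx, vy) and the
// time window dt. In EKLT this quantity is minimized jointly over the
// flow and the patch warp whenever enough new events arrive.
double normalizedResidual(const std::vector<double>& eventIncrement, // sum of polarities per pixel
                          const std::vector<double>& gradX,          // frame log-gradient, x
                          const std::vector<double>& gradY,          // frame log-gradient, y
                          double vx, double vy, double dt) {
    std::vector<double> predicted(P * P);
    for (int i = 0; i < P * P; ++i) {
        // Generative model: Delta L is approximately -(grad L . v) * dt
        predicted[i] = -(gradX[i] * vx + gradY[i] * vy) * dt;
    }
    auto norm = [](const std::vector<double>& a) {
        double s = 0.0;
        for (double x : a) s += x * x;
        return std::sqrt(s) + 1e-9;  // guard against division by zero
    };
    const double ne = norm(eventIncrement);
    const double np = norm(predicted);
    double residual = 0.0;
    for (int i = 0; i < P * P; ++i) {
        const double d = eventIncrement[i] / ne - predicted[i] / np;
        residual += d * d;  // normalization removes the contrast threshold
    }
    return residual;
}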

PDF: http://rpg.ifi.uzh.ch/docs/IJCV19_Gehrig.pdf
YouTube: https://youtu.be/ZyD1YPW1h4U

Best regards

Daniel Gehrig, Henri Rebecq, Guillermo Gallego, Davide Scaramuzza

--
___________________________________

Prof. Dr. Davide Scaramuzza
Director of the Robotics and Perception Group: http://rpg.ifi.uzh.ch/people_scaramuzza.html
Inst. of Informatics, University of Zurich,
Inst. of Neuroinformatics, University of Zurich and ETH Zurich
Andreasstrasse 15, AND 2.10, Zurich, Switzerland
Office: +41 44 635 24 09
YouTube Channel: https://www.youtube.com/ailabRPG/
___________________________________
