Event cameras are revolutionary sensors that work radically differently
from standard cameras. Instead of capturing intensity images at a fixed
rate, event cameras measure changes of intensity asynchronously, in the
form of a stream of events, which encode per-pixel brightness changes.
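To make the sensing model concrete, here is a minimal sketch of the idealized event-generation principle: a pixel fires an event whenever its log intensity has changed by a contrast threshold since that pixel's last event. The threshold value C=0.2 and the function name are illustrative choices, not taken from ESIM itself.

```python
def generate_events(log_frames, timestamps, C=0.2):
    """Idealized event-camera model (illustrative sketch, not ESIM's code).
    Emits an event (x, y, t, polarity) whenever a pixel's log intensity
    has changed by the contrast threshold C since that pixel's last event.
    C = 0.2 is an illustrative value."""
    # Per-pixel reference log intensity, initialized from the first frame.
    ref = [row[:] for row in log_frames[0]]
    events = []
    for L, t in zip(log_frames[1:], timestamps[1:]):
        for y, row in enumerate(L):
            for x, l in enumerate(row):
                while l - ref[y][x] >= C:       # brightness increased
                    ref[y][x] += C
                    events.append((x, y, t, 1))
                while ref[y][x] - l >= C:       # brightness decreased
                    ref[y][x] -= C
                    events.append((x, y, t, -1))
    return events
```

For example, a single pixel whose log intensity rises from 0.0 to 0.45 crosses the 0.2 threshold twice and therefore produces two positive events, rather than one frame difference, which is what makes the output a sparse asynchronous stream.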
In the last few years, their outstanding properties (asynchronous
sensing, no motion blur, high dynamic range) have enabled exciting vision
applications with very low latency and high robustness. However, these
sensors are still scarce and expensive to obtain, which slows down the
progress of the research community.
To address this issue, we present ESIM, an efficient and accurate event
camera simulator implemented in C++ and available open source. ESIM can
simulate arbitrary camera motion in 3D scenes, providing events,
standard images, and inertial measurements, along with full ground-truth
information: camera pose and velocity, as well as depth and optical flow
maps. ESIM is the first and only event-camera simulator that accurately
mimics the asynchronous output of a real event camera, thanks to an
adaptive sampling strategy.
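The idea behind adaptive sampling can be sketched as follows: instead of rendering at a fixed rate, the simulator shortens the time step when the scene brightness is changing quickly, so that no pixel can cross the contrast threshold unnoticed between two rendered samples. This is a simplified illustration under assumed names and parameters (C, dt_max), not ESIM's actual implementation.

```python
def next_sample_time(t, max_dlogI_dt, C=0.2, eps=1e-9, dt_max=1e-2):
    """Adaptive sampling sketch (assumed simplification of ESIM's strategy).
    Picks the next render time so that even the fastest-changing pixel,
    whose log-intensity rate of change is max_dlogI_dt, cannot change by
    more than the contrast threshold C before the next sample.
    dt_max caps the step when the scene is nearly static."""
    dt = min(dt_max, C / (abs(max_dlogI_dt) + eps))
    return t + dt
```

For instance, with a fast-changing scene (max_dlogI_dt = 100) the step shrinks to 0.002 s, while a static scene is sampled at the cap dt_max; this is what lets the simulator reproduce the asynchronous, data-driven timing of a real event camera.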
In addition to simulating event cameras, ESIM can also simulate standard
cameras (including accurate motion blur) and an inertial measurement
unit (IMU).
Our open-source code comes with multiple rendering engines, including
a fast custom renderer based on OpenGL that runs in real time, and
a photorealistic renderer based on the UnrealCV project.
ESIM was presented at the Conference on Robot Learning 2018 last week.
Further details about ESIM can be found in the paper.