DSEC: A Stereo Event Camera Dataset for Driving Scenarios

Author(s):  
Mathias Gehrig ◽  
Willem Aarents ◽  
Daniel Gehrig ◽  
Davide Scaramuzza
Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1137

Author(s):  
Ondřej Holešovský ◽  
Radoslav Škoviera ◽  
Václav Hlaváč ◽  
Roman Vítek

We experimentally compare event cameras with fast (global-shutter) frame cameras, asking: “In which application domains does an event camera surpass a fast frame camera?” Surprisingly, finding the answer has proven difficult. Our methodology was to test event and frame cameras on generic computer vision tasks where event-camera advantages should manifest. We used two methods: (1) a controlled, cheap, and easily reproducible experiment (observing a marker on a rotating disk at varying speeds); and (2) a challenging practical ballistic experiment (observing a flying bullet, with ground truth provided by an expensive ultra-high-speed frame camera). The experimental results include sampling/detection rates and position-estimation errors as functions of illuminance and motion speed, and the minimum pixel latency of two commercial state-of-the-art event cameras (ATIS, DVS240). Event cameras respond more slowly to large, sudden positive contrast changes than to negative ones. They outperformed the frame camera in bandwidth efficiency in all our experiments. Both camera types provide comparable position-estimation accuracy. The better event camera was limited by pixel latency when tracking small objects, resulting in motion-blur-like effects. Sensor bandwidth limited the event camera in object recognition. However, future generations of event cameras might alleviate the bandwidth limitations.
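The bandwidth-efficiency claim above can be made concrete with a back-of-the-envelope calculation. This is a minimal sketch, not the authors' methodology; the sensor figures (VGA resolution, frame rate, event rate, bits per event) are illustrative assumptions, not measurements from the paper:

```python
def frame_camera_bandwidth(width, height, fps, bits_per_pixel=8):
    """Raw readout bandwidth of a frame camera, in bits per second."""
    return width * height * fps * bits_per_pixel

def event_camera_bandwidth(events_per_second, bits_per_event=64):
    """Event-stream bandwidth; each event encodes x, y, timestamp, polarity."""
    return events_per_second * bits_per_event

# Illustrative (assumed) figures: a VGA frame camera at 1000 fps versus an
# event camera emitting 10 million events per second under fast motion.
frame_bw = frame_camera_bandwidth(640, 480, 1000)
event_bw = event_camera_bandwidth(10_000_000)
print(frame_bw / event_bw)  # frame camera needs 3.84x the bandwidth here
```

The comparison flips when the scene is mostly static: the frame camera's bandwidth is constant, while the event camera's scales with scene activity, which is the regime where event cameras shine.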


2021 ◽  
Author(s):  
Zehao Chen ◽  
Qian Zheng ◽  
Peisong Niu ◽  
Huajin Tang ◽  
Gang Pan

Sensor Review ◽  
2021 ◽  
Vol 41 (4) ◽  
pp. 382-389
Author(s):  
Laura Duarte ◽  
Mohammad Safeea ◽  
Pedro Neto

Purpose This paper proposes a novel method for tracking human hands using data from an event camera. The event camera detects changes in brightness, measuring motion with low latency, no motion blur, low power consumption and high dynamic range. Captured frames are analysed using lightweight algorithms that report three-dimensional (3D) hand position data. The chosen pick-and-place scenario serves as an example input for collaborative human–robot interaction and for obstacle avoidance in human–robot safety applications.

Design/methodology/approach Event data are pre-processed into intensity frames. The regions of interest (ROI) are defined through object-edge event activity, reducing noise. ROI features are then extracted for use in depth perception.

Findings Event-based tracking of human hands was demonstrated to be feasible, in real time and at low computational cost. The proposed ROI-finding method reduces noise in the intensity images, achieving up to 89% data reduction relative to the original while preserving the features. The depth-estimation error relative to ground truth (measured with wearables), computed using dynamic time warping from a single event camera, ranges from 15 to 30 millimetres, depending on the plane in which it is measured.

Originality/value Human hands are tracked in 3D space using data from a single event camera and lightweight algorithms that define ROI features.
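The first two stages described above (accumulating events into an intensity-like frame, then defining an ROI from event activity) can be sketched as follows. This is a toy illustration, not the authors' implementation; the event format, grid size and noise threshold are all assumptions:

```python
def events_to_frame(events, width, height):
    """Accumulate (x, y, polarity) events into a per-pixel event-count frame."""
    frame = [[0] * width for _ in range(height)]
    for x, y, polarity in events:
        frame[y][x] += 1 if polarity else -1
    return frame

def activity_roi(frame, min_events=2):
    """Bounding box of pixels whose absolute event count reaches min_events.

    Isolated pixels below the threshold are treated as noise and ignored,
    which is the data-reduction step: only the ROI is passed downstream.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, count in enumerate(row):
            if abs(count) >= min_events:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return min(xs), min(ys), max(xs), max(ys)

# Hypothetical events: a small cluster of brightness changes plus one
# isolated noise event at (0, 0), which the threshold filters out.
events = [(10, 10, 1), (10, 10, 1), (11, 11, 1), (11, 11, 1), (0, 0, 1)]
frame = events_to_frame(events, 32, 32)
print(activity_roi(frame))  # prints (10, 10, 11, 11)
```

In the paper's pipeline, features extracted from such an ROI would then feed the depth-perception stage; the sketch stops at ROI extraction.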


2021 ◽  
pp. 1-1
Author(s):  
Xiao Lu ◽  
Xuning Mao ◽  
Haiqing Liu ◽  
Xiaolin Meng ◽  
Laxmisha Rai

Author(s):  
Tat-Jun Chin ◽  
Samya Bagchi ◽  
Anders Eriksson ◽  
Andre van Schaik

Author(s):  
Cedric Scheerlinck ◽  
Henri Rebecq ◽  
Daniel Gehrig ◽  
Nick Barnes ◽  
Robert E. Mahony ◽  
...  

2021 ◽  
Author(s):  
Yuanze Wang ◽  
Chenlu Liu ◽  
Sheng Li ◽  
Tong Wang ◽  
Weiyang Lin ◽  
...  
