event driven
Recently Published Documents


TOTAL DOCUMENTS

2878
(FIVE YEARS 574)

H-INDEX

53
(FIVE YEARS 8)

Author(s):  
Dierli M.R. Maschio ◽  
Bruno Duarte ◽  
André E. Lazzaretti ◽  
Jean-Marc S. Lafay ◽  
Dieky Adzkiya ◽  
...  

2022 ◽  
Vol 307 ◽  
pp. 118268
Author(s):  
Peng Li ◽  
Shuang Li ◽  
Hao Yu ◽  
Jinyue Yan ◽  
Haoran Ji ◽  
...  

2022 ◽  
Author(s):  
Antoine Grimaldi ◽  
Victor Boutin ◽  
Sio-Hoi Ieng ◽  
Ryad Benosman ◽  
Laurent Perrinet

We propose a neuromimetic architecture able to perform always-on pattern recognition. To achieve this, we extended an existing event-based algorithm [1], which introduced novel spatio-temporal features in the form of a Hierarchy Of Time-Surfaces (HOTS). Built from asynchronous events acquired by a neuromorphic camera, these time surfaces make it possible to encode the local dynamics of a visual scene and to create an efficient event-based pattern recognition architecture. Inspired by neuroscience, we extended this method to increase its performance. Our first contribution was to add a homeostatic gain control on the activity of neurons to improve the learning of spatio-temporal patterns [2]. A second contribution is to draw an analogy between the HOTS algorithm and Spiking Neural Networks (SNNs). Following that analogy, our last contribution is to modify the classification layer, remodeling the offline pattern categorization method previously used into an online, event-driven one. This classifier uses the spiking output of the network to define novel time surfaces, and online classification is then performed with a neuromimetic implementation of a multinomial logistic regression. Not only do these improvements consistently increase the performance of the network, they also make this event-driven pattern recognition algorithm online and bio-realistic. Results were validated on different datasets: DVS barrel [3], Poker-DVS [4] and N-MNIST [5]. We plan to develop the SNN version of the method and to extend this fully event-driven approach to more naturalistic tasks, notably always-on, ultra-fast object categorization.
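To illustrate the time-surface idea at the core of this abstract, here is a minimal sketch of a local exponential-decay time surface computed around a reference event, in the spirit of HOTS [1]. The (x, y, t) event layout, the decay constant and the neighbourhood radius are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def local_time_surface(events, ref_event, tau=50e-3, radius=2):
    """Local exponential-decay time surface around a reference event.

    `events` is an (N, 3) array of (x, y, t) rows sorted by t, with t in
    seconds. Names, defaults and the event layout are assumptions made for
    illustration only.
    """
    x0, y0, t0 = ref_event
    size = 2 * radius + 1
    # Latest timestamp seen at each pixel of the (2R+1)x(2R+1) neighbourhood,
    # considering only events up to the reference time t0.
    last_t = np.full((size, size), -np.inf)
    for x, y, t in events:
        if t > t0:
            break
        dx, dy = int(x - x0), int(y - y0)
        if abs(dx) <= radius and abs(dy) <= radius:
            last_t[dy + radius, dx + radius] = t
    # Exponential decay: recent events map to values near 1, stale or empty
    # pixels decay towards 0 (exp(-inf) evaluates to 0).
    return np.exp((last_t - t0) / tau)
```

In the full HOTS hierarchy, such local surfaces are clustered into prototype features and stacked over several layers with increasing radius and time constant; the sketch above only covers the elementary surface computation.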


Author(s):  
Andrew Gothard ◽  
Daniel Jones ◽  
Andre Green ◽  
Michael Torrez ◽  
Alessandro Cattaneo ◽  
...  

Abstract Event-driven neuromorphic imagers have a number of attractive properties, including low power consumption, high dynamic range, the ability to detect fast events, low memory consumption and low bandwidth requirements. One of the biggest challenges in using event-driven imagery is that the field of event data processing is still embryonic; in contrast, decades' worth of effort have been invested in the analysis of frame-based imagery. Hybrid approaches for applying established frame-based analysis techniques to event-driven imagery have been studied since event-driven imagers came into existence, but the process of forming frames from event-driven imagery has not been studied in detail. This work presents a principled digital coded exposure approach for forming frames from event-driven imagery that is inspired by the physics exploited in a conventional camera featuring a shutter. The technique provides a fundamental tool for understanding the temporal information content that contributes to the formation of a frame from event-driven data. Event-driven imagery allows arbitrary virtual digital shutter functions to be applied to form the final frame on a pixel-by-pixel basis, giving careful control over the spatio-temporal information captured in the frame. Furthermore, unlike with a conventional physical camera, frames can be formed into any variety of possible frames in post-processing after the data is captured, and the coded-exposure virtual shutter functions can assume arbitrary values, including positive, negative, real and complex values. The coded exposure approach also enables applications of industrial interest, such as digital stroboscopy, without any additional hardware. The ability to form frames from event-driven imagery in a principled manner opens up new possibilities for applying conventional frame-based image processing techniques to event-driven imagery.
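The coded exposure idea can be sketched as follows: each event contributes to its pixel with a weight given by a virtual shutter function evaluated at the event's timestamp, and the weighted contributions are accumulated into a frame. The (x, y, t, polarity) event layout, the polarity signing, the sensor shape and the example shutter are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def coded_exposure_frame(events, shutter, sensor_shape=(260, 346)):
    """Form a frame from events using a virtual digital shutter.

    `events` is an (N, 4) array of (x, y, t, polarity) rows with t in
    seconds. `shutter` maps timestamps to weights and may return negative or
    complex values (hence the complex-valued frame). All names and the
    polarity signing are illustrative assumptions.
    """
    frame = np.zeros(sensor_shape, dtype=complex)
    x, y, t, p = (events[:, i] for i in range(4))
    # Weight each event by the shutter value at its timestamp, signed by
    # polarity (ON events add, OFF events subtract).
    weights = shutter(t) * np.where(p > 0, 1.0, -1.0)
    np.add.at(frame, (y.astype(int), x.astype(int)), weights)
    return frame

# Example: a hypothetical stroboscopic shutter that is open for the first
# 10 ms of every 20 ms period (i.e. a 50 Hz strobe).
strobe = lambda t: np.where((t % 0.02) < 0.01, 1.0, 0.0)
```

A conventional accumulation frame corresponds to a shutter that is constant over the exposure window; stroboscopy, flutter-shutter codes or complex-valued demodulation are obtained simply by swapping in a different shutter function, with no change to the hardware or the captured event stream.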


Author(s):  
Rubén J. García-Hernández ◽  
Thierry Goubier ◽  
Danijel Schorlemmer ◽  
Natalja Rakowsky ◽  
Sven Harig ◽  
...  
