Motion estimation: A biologically inspired model

2018 ◽  
Vol 150 ◽  
pp. 44-53 ◽  
Author(s):  
L. Bowns

2017 ◽  
Vol 26 (3) ◽  
pp. 1521-1535 ◽  
Author(s):  
Evangelos Sariyanidi ◽  
Hatice Gunes ◽  
Andrea Cavallaro

2008 ◽  
Vol 2008 ◽  
pp. 1-9 ◽  
Author(s):  
Guillermo Botella ◽  
Manuel Rodríguez ◽  
Antonio García ◽  
Eduardo Ros

The robustness with which the human visual system recovers motion estimates in almost any visual situation is enviable: it performs enormous computational tasks continuously, robustly, efficiently, and effortlessly. There is clearly a great deal we can learn from our own visual system. Several optical flow algorithms currently exist, although none of them deals efficiently with noise, illumination changes, second-order motion, occlusions, and so on. The main contribution of this work is the efficient implementation of a biologically inspired motion algorithm that takes nature as a template for the design of its architecture and uses a specific model of human visual motion perception: the Multichannel Gradient Model (McGM). This novel, customizable architecture for neuromorphic, robust optical flow can be built on an FPGA or ASIC device using properties of the cortical motion pathway, and it constitutes a useful framework for building future complex bioinspired systems that run in real time despite high computational complexity. This work includes resource usage and performance data, as well as a comparison with existing systems. Owing to its bioinspired and robust properties, the hardware has many application fields, such as object recognition, navigation, and tracking in difficult environments.
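The McGM itself combines many spatiotemporal derivative channels and is not reproduced here. As a minimal illustration of the gradient constraint that underlies gradient-model optical flow (brightness constancy, Ix·u + Iy·v + It = 0, solved by least squares), here is a sketch; the function name and the synthetic translating stimulus are my own, not from the paper:

```python
import numpy as np

def gradient_flow(frame0, frame1):
    """Estimate a single global (u, v) from the brightness-constancy
    constraint Ix*u + Iy*v + It = 0, solved by least squares over all pixels.
    Illustrative sketch only, not the McGM architecture."""
    Iy, Ix = np.gradient(frame0)          # np.gradient returns d/drow, d/dcol
    It = frame1 - frame0                  # temporal derivative (two frames)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic plaid translating by (u, v) = (0.5, 0.25) pixels per frame.
# Two gratings at different orientations avoid the aperture problem.
k = 2 * np.pi / 32
y, x = np.mgrid[0:128, 0:128]
f = lambda xx, yy: np.sin(k * xx) + np.sin(k * yy)
frame0 = f(x, y)
frame1 = f(x - 0.5, y - 0.25)
u, v = gradient_flow(frame0, frame1)
```

A global least-squares fit like this fails on occlusions and second-order motion, which is precisely the kind of limitation the multichannel, cortically inspired design of the McGM is meant to address.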


Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8217
Author(s):  
Oliver W. Layton

Most algorithms for steering, obstacle avoidance, and moving object detection rely on accurate self-motion estimation, a problem animals solve in real time as they navigate through diverse environments. One biological solution leverages optic flow, the changing pattern of motion experienced on the eye during self-motion. Here I present ARTFLOW, a biologically inspired neural network that learns patterns in optic flow to encode the observer’s self-motion. The network combines the fuzzy ART unsupervised learning algorithm with a hierarchical architecture based on the primate visual system. This design affords fast, local feature learning across parallel modules in each network layer. Simulations show that the network is capable of learning stable patterns from optic flow simulating self-motion through environments of varying complexity with only one epoch of training. ARTFLOW trains substantially faster and yields self-motion estimates that are far more accurate than a comparable network that relies on Hebbian learning. I show how ARTFLOW serves as a generative model to predict the optic flow that corresponds to neural activations distributed across the network.
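ARTFLOW embeds fuzzy ART modules in a hierarchy modeled on the primate visual system; that hierarchy is not reproduced here. As a minimal sketch of the underlying fuzzy ART learning step (complement coding, choice function, vigilance test, fast learning), assuming standard parameter names (rho, alpha, beta) rather than anything from the paper:

```python
import numpy as np

class FuzzyART:
    """Minimal fuzzy ART: complement coding, winner-take-all category
    choice, vigilance-gated resonance, fast learning. Illustrative only."""
    def __init__(self, dim, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []                           # one weight vector per category

    def _code(self, x):
        x = np.asarray(x, dtype=float)
        return np.concatenate([x, 1.0 - x])   # complement coding, |I| = dim

    def train(self, x):
        I = self._code(x)
        # Rank categories by the choice function T_j = |I ^ w_j| / (alpha + |w_j|)
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum()) for w in self.w]
        for j in np.argsort(scores)[::-1]:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho:             # vigilance passed: resonate and learn
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(I.copy())               # no resonance: commit a new category
        return len(self.w) - 1

art = FuzzyART(dim=2, rho=0.8)
cat_a = art.train([0.9, 0.1])
cat_b = art.train([0.1, 0.9])   # low overlap fails vigilance -> new category
```

With fast learning (beta = 1) the weights can only shrink under the fuzzy AND, which is one reason ART-style networks can stabilize after a single epoch, the property the abstract highlights.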


2009 ◽  
Vol E92-B (2) ◽  
pp. 461-472
Author(s):  
Dinh Trieu DUONG ◽  
Min-Cheol HWANG ◽  
Byeong-Doo CHOI ◽  
Jun-Hyung KIM ◽  
Sung-Jea KO

Author(s):  
Shuping ZHANG ◽  
Jinjia ZHOU ◽  
Dajiang ZHOU ◽  
Shinji KIMURA ◽  
Satoshi GOTO
