Point of Interest Detection and Visual Distance Estimation for Sensor-Rich Video

2014 ◽  
Vol 16 (7) ◽  
pp. 1929-1941 ◽  
Author(s):  
Jia Hao ◽  
Guanfeng Wang ◽  
Beomjoo Seo ◽  
Roger Zimmermann

2010 ◽  
Author(s):  
Tamer Soliman ◽  
Alison E. Gibson ◽  
Arthur M. Glenberg

2017 ◽  
Vol 42 ◽  
pp. 42-45
Author(s):  
Alessandro Mei ◽  
Ciro Manzo ◽  
Emiliano Zampetti ◽  
Francesco Petracchini ◽  
Lucia Paciucci

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 173239-173249
Author(s):  
Sai Li ◽  
Shuchao Chen ◽  
Haojiang Li ◽  
Guangying Ruan ◽  
Shuai Ren ◽  
...  

2019 ◽  
Vol 104 ◽  
pp. 02008
Author(s):  
Alexander V. Fisunov ◽  
Maxim S. Beloyvanov ◽  
Iakov S. Korovin

This paper presents a wearable eye tracker that tracks a user's points of interest in a video stream shown on a smartphone screen. The system consists of a head-mounted case for the smartphone, a point-of-interest detection algorithm, software developed for this purpose, and an Android smartphone that displays the video stream, estimates points of interest in the video, and logs the estimates to the device's internal memory.
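The abstract describes the logging pipeline only at a high level. A minimal sketch of one plausible stage, mapping a gaze direction onto screen pixels and appending a point-of-interest record, might look like the following (the function names, the field-of-view parameters, and the record layout are all assumptions for illustration, not the paper's implementation):

```python
import time

def gaze_to_screen(yaw_deg, pitch_deg, screen_w_px, screen_h_px,
                   fov_h_deg=60.0, fov_v_deg=40.0):
    """Map a gaze direction (degrees from screen centre) to pixel
    coordinates, assuming the screen spans the given fields of view.
    These FOV defaults are illustrative, not from the paper."""
    x = (0.5 + yaw_deg / fov_h_deg) * screen_w_px
    y = (0.5 - pitch_deg / fov_v_deg) * screen_h_px
    # Clamp so off-screen gazes are logged as edge fixations.
    return (max(0, min(screen_w_px - 1, round(x))),
            max(0, min(screen_h_px - 1, round(y))))

def log_poi(log, frame_idx, gaze_px):
    """Append one point-of-interest record; on an Android device this
    list would be flushed to app-internal storage."""
    log.append({"t": time.time(), "frame": frame_idx,
                "x": gaze_px[0], "y": gaze_px[1]})

log = []
log_poi(log, 120, gaze_to_screen(6.0, -4.0, 1080, 1920))
```

A centred gaze (yaw and pitch of zero) maps to the middle of the screen, and each logged record carries a timestamp and frame index so the trace can later be aligned with the video.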


2006 ◽  
Vol 9 (2) ◽  
pp. 321-331 ◽  
Author(s):  
Harald Frenz ◽  
Markus Lappe

Visual motion is used to control the direction and speed of self-motion and the time-to-contact with an obstacle. In earlier work, we found that human subjects can discriminate between the distances of different visually simulated self-motions in a virtual scene. Distance indication in terms of an exocentric interval adjustment task, however, revealed a linear correlation between perceived and indicated distances, but with a profound distance underestimation. One possible explanation for this underestimation is the perception of visual space in virtual environments. Humans perceive visual space in natural scenes as curved, and distances are increasingly underestimated with increasing distance from the observer. Such spatial compression may also exist in our virtual environment. We therefore surveyed perceived visual space in a static virtual scene. We asked observers to compare two horizontal depth intervals, similar to experiments performed in natural space. Subjects had to indicate the size of one depth interval relative to a second interval. Our observers perceived visual space in the virtual environment as compressed, similar to the perception found in natural scenes. However, the nonlinear depth function we found cannot explain the observed distance underestimation of visually simulated self-motions in the same environment.
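The compression described above is often modelled as a compressive power law of physical distance; a power law with exponent below one is an assumption here for illustration, not the paper's fitted depth function. Under such a model, a far depth interval is perceived as smaller than an equally long near interval, which is the pattern the interval comparison task probes:

```python
def perceived_depth(d, a=1.0, b=0.7):
    """Compressive power law: perceived distance grows sublinearly
    when b < 1 (a and b are illustrative parameters)."""
    return a * d ** b

def interval_ratio(near_start, near_end, far_start, far_end, b=0.7):
    """Perceived size of the far depth interval relative to the near
    one, as in an exocentric interval comparison task."""
    near = perceived_depth(near_end, b=b) - perceived_depth(near_start, b=b)
    far = perceived_depth(far_end, b=b) - perceived_depth(far_start, b=b)
    return far / near
```

With b = 0.7, two physically equal intervals (say 2–4 m and 8–10 m) yield a ratio below one: the far interval looks shorter. With b = 1 the ratio is exactly one, i.e. no compression.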


2013 ◽  
Vol 29 (6-8) ◽  
pp. 695-705 ◽  
Author(s):  
Ran Song ◽  
Yonghuai Liu ◽  
Ralph R. Martin ◽  
Paul L. Rosin

Author(s):  
James Van Hinsbergh ◽  
Nathan Griffiths ◽  
Phillip Taylor ◽  
Alasdair Thomason ◽  
Zhou Xu ◽  
...  

Author(s):  
Marco Paracchini ◽  
Emanuele Plebani ◽  
Mehdi Ben Iche ◽  
Danilo Pietro Pau ◽  
Marco Marcon
