Physics-Based Approaches to Visual Scene Analysis

2012 ◽  
Author(s):  
Todd Zickler

2012 ◽  
Vol 367 (1591) ◽  
pp. 942-953 ◽  
Author(s):  
Jean-Michel Hupé ◽  
Daniel Pressnitzer

Auditory streaming and visual plaids have been used extensively to study perceptual organization in each modality. Both stimuli can produce bistable alternations between grouped (one object) and split (two objects) interpretations. They also share two peculiar features: (i) at the onset of stimulus presentation, organization starts with a systematic bias towards the grouped interpretation; (ii) this first percept has ‘inertia’; it lasts longer than the subsequent ones. As a result, the probability of forming different objects builds up over time, a hallmark of both behavioural and neurophysiological data on auditory streaming. Here we show that first percept bias and inertia are independent. In plaid perception, inertia is due to a depth ordering ambiguity in the transparent (split) interpretation that makes plaid perception tristable rather than bistable: experimental manipulations removing the depth ambiguity suppressed inertia. However, the first percept bias persisted. We attempted a similar manipulation for auditory streaming by introducing level differences between streams, to bias which stream would appear in the perceptual foreground. Here both inertia and first percept bias persisted. We thus argue that the critical common feature of the onset of perceptual organization is the grouping bias, which may be related to the transition from temporally/spatially local to temporally/spatially global computation.
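The tristable-versus-bistable distinction above can be illustrated with a toy model (a sketch for intuition, not the authors' analysis; all state names and rates are illustrative assumptions): a memoryless switcher with one 'grouped' state and two depth-ordered 'split' states. If the observer can only report 'grouped' versus 'split', switches between the two depth orders are invisible, so reported 'split' phases come out longer on average even though every internal state switches at the same rate.

```python
import random

def simulate_dwells(states, report, rate=1.0, n_switch=20000, seed=1):
    """Toy memoryless percept switcher (illustrative, not the study's model).

    Each internal state is held for an exponential(rate) dwell, then the
    process jumps to a uniformly random *other* state.  `report` maps
    internal states to reported percepts; a jump between two states that
    share a report label is invisible and just extends the current dwell.
    """
    rng = random.Random(seed)
    cur = states[0]
    dwells = {}                      # reported label -> list of dwell times
    acc = rng.expovariate(rate)      # time accumulated under the current label
    for _ in range(n_switch):
        nxt = rng.choice([s for s in states if s != cur])
        if report[nxt] == report[cur]:
            acc += rng.expovariate(rate)   # hidden switch: same reported percept
        else:
            dwells.setdefault(report[cur], []).append(acc)
            acc = rng.expovariate(rate)
        cur = nxt
    return dwells

# Tristable dynamics reported with only two labels: both depth orders of
# the transparent interpretation are reported as "split".
d = simulate_dwells(
    ["grouped", "splitA", "splitB"],
    {"grouped": "grouped", "splitA": "split", "splitB": "split"},
)
mean = {k: sum(v) / len(v) for k, v in d.items()}
# Pooled "split" dwells average roughly twice the "grouped" dwells.
```

In this sketch, removing the depth ambiguity corresponds to deleting one of the split states, which restores a genuinely bistable process with statistically symmetric dwell times.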


2010 ◽  
Vol 127 (3) ◽  
pp. 1979-1979 ◽  
Author(s):  
Adam O’Donovan ◽  
Ramani Duraiswami ◽  
Dmitry Zotkin ◽  
Nail Gumerov

2017 ◽  
Vol 372 (1714) ◽  
pp. 20160105 ◽  
Author(s):  
Rosy Southwell ◽  
Anna Baumann ◽  
Cécile Gal ◽  
Nicolas Barascud ◽  
Karl Friston ◽  
...  

In this series of behavioural and electroencephalography (EEG) experiments, we investigate the extent to which repeating patterns of sounds capture attention. Work in the visual domain has revealed attentional capture by statistically predictable stimuli, consistent with predictive coding accounts which suggest that attention is drawn to sensory regularities. Here, stimuli comprised rapid sequences of tone pips, arranged in regular (REG) or random (RAND) patterns. EEG data demonstrate that the brain rapidly recognizes predictable patterns, manifested as a rapid increase in responses to REG relative to RAND sequences. This increase is reminiscent of the gain increase in neural responses to attended stimuli often seen in the neuroimaging literature, and is thus consistent with the hypothesis that predictable sequences draw attention. To probe attentional capture by auditory regularities directly, we used REG and RAND sequences in two different behavioural tasks designed to reveal such capture. Overall, the pattern of results suggests that regularity does not capture attention. This article is part of the themed issue ‘Auditory and visual scene analysis’.
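A minimal sketch of how REG and RAND stimuli of this kind can be constructed (the pip count, frequency pool and cycle length below are illustrative assumptions, not the study's actual parameters): a REG sequence repeats a short cycle of frequencies drawn from a fixed pool, while a RAND sequence samples the same pool independently, so long-term statistics match but no pattern ever repeats.

```python
import random

FREQ_POOL = (400, 500, 630, 800, 1000)   # Hz; illustrative values only

def make_sequence(kind, n_pips=60, pool=FREQ_POOL, cycle_len=5, rng=random):
    """Return a list of tone-pip frequencies for a REG or RAND sequence.

    REG:  a randomly chosen cycle of `cycle_len` distinct frequencies,
          repeated for the whole sequence.
    RAND: frequencies drawn independently from the same pool, matching
          the long-term statistics without any repeating pattern.
    """
    if kind == "REG":
        cycle = rng.sample(pool, cycle_len)
        return [cycle[i % cycle_len] for i in range(n_pips)]
    if kind == "RAND":
        return [rng.choice(pool) for _ in range(n_pips)]
    raise ValueError(f"unknown sequence kind: {kind!r}")

reg = make_sequence("REG")
rand_seq = make_sequence("RAND")
# reg repeats its 5-pip cycle exactly; rand_seq carries no such guarantee.
```

Because both sequence types draw from the same frequency pool, any difference in neural response can be attributed to the presence of the repeating pattern rather than to spectral content.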


1971 ◽  
Vol C-20 (5) ◽  
pp. 562-569 ◽  
Author(s):  
A. Rosenfeld ◽  
M. Thurston

2017 ◽  
Vol 372 (1714) ◽  
pp. 20160099 ◽  
Author(s):  
Hirohito M. Kondo ◽  
Anouk M. van Loon ◽  
Jun-Ichiro Kawahara ◽  
Brian C. J. Moore

We perceive the world as stable and composed of discrete objects even though auditory and visual inputs are often ambiguous owing to spatial and temporal occluders and changes in the conditions of observation. This raises important questions regarding where and how ‘scene analysis’ is performed in the brain. Recent advances from both auditory and visual research suggest that the brain does not simply process the incoming scene properties. Rather, top-down processes such as attention, expectations and prior knowledge facilitate scene perception. Thus, scene analysis is linked not only with the extraction of stimulus features and formation and selection of perceptual objects, but also with selective attention, perceptual binding and awareness. This special issue covers novel advances in scene-analysis research obtained using a combination of psychophysics, computational modelling, neuroimaging and neurophysiology, and presents new empirical and theoretical approaches. For integrative understanding of scene analysis beyond and across sensory modalities, we provide a collection of 15 articles that enable comparison and integration of recent findings in auditory and visual scene analysis. This article is part of the themed issue ‘Auditory and visual scene analysis’.


2016 ◽  
Vol 38 (12) ◽  
pp. 2402-2415 ◽  
Author(s):  
Israel Dejene Gebru ◽  
Xavier Alameda-Pineda ◽  
Florence Forbes ◽  
Radu Horaud
