Haptic Underestimation of Angular Extent

Perception ◽  
1998 ◽  
Vol 27 (6) ◽  
pp. 737-754 ◽  
Author(s):  
Stephen Lakatos ◽  
Lawrence E Marks

To what extent can individuals accurately estimate the angle between two surfaces through touch alone, and how does tactile judgment compare with visual judgment? Subjects' ability to estimate angle size for a variety of haptic and visual stimuli was examined in a series of nine experiments. Triangular wooden blocks and raised contour outlines comprising different angles and radii of curvature at the apex were used in experiments 1–4; subjects consistently underestimated angular extent relative to visual baselines, and the degree of underestimation was inversely related to the actual size of the angle. Angle estimates also increased with increasing radius of curvature when actual angle size was held constant. In contrast, experiments 5–8 showed that subjects did not underestimate angular extent when asked to perform a haptic–visual match to a computerized visual image; this outcome suggests that visual input may ‘recalibrate’ the haptic system's internal metric for estimating angle. The basis of this crossmodal interaction was investigated in experiment 9 by varying the nature and extent of visual cues available in haptic estimation tasks. The addition of visual-spatial cues did not significantly reduce the magnitude of haptic underestimation. The experiments as a whole indicate that haptic underestimation of angle occurs in a number of different stimulus contexts, but they leave open the question of exactly what type of visual information may serve to recalibrate touch in this regard.

Perception ◽  
2016 ◽  
Vol 46 (1) ◽  
pp. 6-17 ◽  
Author(s):  
N. Van der Stoep ◽  
S. Van der Stigchel ◽  
T. C. W. Nijboer ◽  
C. Spence

Multisensory integration (MSI) and exogenous spatial attention can both speed up responses to perceptual events. Recently, it has been shown that audiovisual integration at exogenously attended locations is reduced relative to unattended locations. This effect was observed at short cue-target intervals (200–250 ms). At longer intervals, however, the initial benefits of exogenous shifts of spatial attention at the cued location are often replaced by response time (RT) costs (also known as Inhibition of Return, IOR). Given these opposing cueing effects at shorter versus longer intervals, we investigated whether MSI would also be affected by IOR. Uninformative exogenous visual spatial cues were presented between 350 and 450 ms prior to the onset of auditory, visual, and audiovisual targets. As expected, IOR was observed for visual targets (invalid-cue RT < valid-cue RT). For auditory and audiovisual targets, neither IOR nor any spatial cueing effects were observed. The amount of relative multisensory response enhancement and race model inequality violation was larger for uncued than for cued locations, indicating that IOR reduces MSI. The results are discussed in the context of changes in unisensory signal strength at cued as compared with uncued locations.
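For readers unfamiliar with these two measures, the sketch below is a minimal Python illustration, not the authors' analysis code, of how relative multisensory response enhancement and violations of Miller's race model inequality are commonly quantified from unisensory and audiovisual response time distributions; the simulated RT arrays, sample sizes, and the 10 ms test grid are hypothetical.

```python
import numpy as np

def relative_mre(rt_a, rt_v, rt_av):
    """Relative multisensory response enhancement (%): the speed-up of the
    mean audiovisual RT relative to the fastest mean unisensory RT."""
    fastest_uni = min(np.mean(rt_a), np.mean(rt_v))
    return 100.0 * (fastest_uni - np.mean(rt_av)) / fastest_uni

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Miller's race model inequality: P(RT_av <= t) should not exceed
    P(RT_a <= t) + P(RT_v <= t). Returns the positive part of the
    difference at each test time; any value > 0 indicates a violation,
    i.e. evidence for integration beyond statistical facilitation."""
    cdf = lambda rt, t: np.mean(rt[:, None] <= t[None, :], axis=0)
    bound = np.clip(cdf(rt_a, t_grid) + cdf(rt_v, t_grid), 0.0, 1.0)
    return np.clip(cdf(rt_av, t_grid) - bound, 0.0, None)

# Hypothetical RTs (ms) for one participant and one cue condition.
rng = np.random.default_rng(0)
rt_a  = rng.normal(320, 40, 200)   # auditory targets
rt_v  = rng.normal(340, 45, 200)   # visual targets
rt_av = rng.normal(290, 35, 200)   # audiovisual targets
t_grid = np.arange(150, 601, 10)   # 10 ms grid of test points

print(f"relative MRE: {relative_mre(rt_a, rt_v, rt_av):.1f}%")
print(f"max race model violation: {race_model_violation(rt_a, rt_v, rt_av, t_grid).max():.3f}")
```

Comparing these quantities for cued versus uncued locations, as in the study, would then indicate whether exogenous cueing (or IOR) modulates multisensory integration.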


2018 ◽  
Vol 72 (5) ◽  
pp. 1141-1154 ◽  
Author(s):  
Daniele Nardi ◽  
Brian J Anzures ◽  
Josie M Clark ◽  
Brittany V Griffith

Among the environmental stimuli that can guide navigation in space, most attention has been dedicated to visual information. The process of determining where you are and which direction you are facing (called reorientation) has been extensively examined by providing the navigator with two sources of information—typically the shape of the environment and its features—with an interest in the extent to which each is used. Comparable investigations with non-visual cues are lacking. Here, blindfolded sighted participants had to learn the location of a target in a real-world, circular search space. In Experiment 1, two ecologically relevant non-visual cues were provided: the slope of the floor and an array of two identical auditory landmarks. Slope successfully guided behaviour, suggesting that proprioceptive/kinesthetic access is sufficient for navigating in a slanted environment. However, despite the fact that participants could localise the auditory sources, this information was not encoded. In Experiment 2, the auditory cue was made more useful for the task because it had greater predictive value and there were no competing spatial cues. Nonetheless, the auditory landmark was again not encoded. Finally, in Experiment 3, after being prompted, participants were able to reorient by using the auditory landmark. Overall, participants failed to spontaneously rely on the auditory cue, regardless of how informative it was.


Behaviour ◽  
1979 ◽  
Vol 70 (1-2) ◽  
pp. 1-116 ◽  
Author(s):  
I. Bossema

The European jay (Garrulus g. glandarius) strongly depends on acorns for food. Many acorns are hoarded, enabling the jay to feed upon them at times of the year when they would otherwise be unavailable. Many of the hoarded acorns germinate and become seedlings, so jays play an important role in the dispersal of acorns and the reproduction of oaks (in this study: Quercus robur, the pedunculate oak). These mutual relationships were analysed both with wild jays in the field (province of Drente, The Netherlands) and with tame birds in confinement. Variation in the composition of the food throughout the year is described quantitatively. Acorns were the stock diet of adults in most months of the year. Leaf-eating caterpillars, predominantly occurring on oak, were the main food items of nestlings. Acorns formed the bulk of the food of fledglings in June. A high rate of acorn consumption in winter, spring and early summer is possible because individual jays hoard several thousand acorns, mainly in October. In experiments, acorns of pedunculate oak were not preferred over equal-sized acorns of sessile oak (which was not found in the study area). Acorns of pedunculate oak were strongly preferred over those of American oak and nuts of hazel and beech. Among acorns of pedunculate oak, ripe, sound, long-slim and big ones were preferred. Jays collect one or more (up to six) acorns per hoarding trip. When several are taken, the first ones are swallowed and the last one is usually carried in the bill. For swallowing, the dimensions of the beak imposed a limit on size preference; for bill transport, usually the biggest acorn was selected. The greater the number of acorns per trip, the longer the transportation distance during hoarding. From trip to trip, jays dispersed their acorns widely, and when several acorns were transported during one trip, they were generally buried at different sites. Burial took place by pushing acorns into the soil and by subsequent hammering and covering. Jays often selected rather open sites, transitions in the vegetation, and vertical structures such as saplings and tree trunks for burial of acorns. In captivity, jays also hoarded surplus food. Here, spacing out of burials was also observed, with previously used sites usually being avoided. In addition, hiding along substrate edges and near conspicuous objects was observed. Jays tended to hide near sticks presented in a horizontal position rather than near identical ones in a vertical position, especially when the colour of the sticks contrasted with the colour of the substrate. Also, rough-surfaced substrate was strongly preferred over similar but smooth-surfaced substrate. Successful retrieval of and feeding on hoarded acorns were observed in winter, even when snow cover had considerably altered the scenery. No evidence was obtained that acorns could be traced back by smell. Many indications were obtained that visual information from near and far beacons, memorized during hiding, was used in finding acorns. The use of beacons by captive jays was also studied. Experiments led to the conclusion that vertical beacons are more important to retrieving birds than identical horizontal ones. The discrepancy with the jay's preference for horizontal structures during hiding is discussed. Most seedlings emerge in May and June. The distribution pattern of seedlings and the bill prints on the shells of their acorns indicated that many seedlings emerged from acorns hidden by jays in the previous autumn.
The cotyledons of these plants remain underground and are in excellent condition in spring and early summer. Jays exploited these acorns by pulling at the stem of the seedlings and then removing the cotyledons. This did not usually damage the plants severely. Jays can find acorns in this situation partly because they remember where they buried them. In addition, it was shown that jays select seedlings of oak rather than those of other species, and that they preferentially inspected those seedlings that were most profitable in terms of cotyledon yield and quality. Experiments uncovered some of the visual cues used in this discrimination. The effects of hoarding on the preservation of acorns were examined in the field and the laboratory. Being buried reduced the chance that acorns were robbed by conspecifics and other acorn feeders. Scatter hoarding did not lead to better protection of buried acorns than larder hoarding, but the spread of risk was better in the former than in the latter. It was concluded that the way in which jays hoard acorns increases the chance that they can exploit them later. In addition, the condition of acorns is better preserved by being buried. An analysis was made of the consequences of the jay's behaviour for oaks. The oak does incur certain costs: some of its acorns are eaten by jays during the dispersal and storage phase, and some seedlings are damaged as a consequence of cotyledon removal. However, these costs are outweighed by the benefits the oak receives. Many of its most viable acorns are widely dispersed and buried at sites where the prospects for further development into mature oak are highly favourable. The adaptiveness of the characters involved in the jay's preferential feeding on and hoarding of acorns is discussed in relation to several environmental pressures: competition with allied species; food fluctuations in the jay's niche; and food competitors better equipped to break up hard "dry" fruits. Conversely, jays exert several selective pressures which are likely to have evolutionary consequences for oaks, such as the selection of long-slim and large acorns with tight shells. In addition, oak seedlings with a long tap root and a tough stem are selected for. Although factors other than mutual selective pressures may have affected the present-day fit between jays and oaks, it is concluded that several characters of jays and oaks can be considered co-adapted features of a symbiotic relationship.


2018 ◽  
Vol 40 (1) ◽  
pp. 93-109
Author(s):  
YI ZHENG ◽  
ARTHUR G. SAMUEL

It has been documented that lipreading facilitates the understanding of difficult speech, such as noisy speech and time-compressed speech. However, relatively little work has addressed the role of visual information in perceiving accented speech, another type of difficult speech. In this study, we specifically focus on accented word recognition. One hundred forty-two native English speakers made lexical decision judgments on English words or nonwords produced by speakers with Mandarin Chinese accents. The stimuli were presented either as videos in which the speaker appeared relatively far away or as videos in which we zoomed in on the speaker’s head. Consistent with studies of degraded speech, listeners were more accurate at recognizing accented words when they saw lip movements from the closer apparent distance. The effect of apparent distance tended to be larger under nonoptimal conditions: when the stimuli were nonwords rather than words, and when they were produced by a speaker with a relatively strong accent. However, we did not find any influence of listeners’ prior experience with Chinese-accented speech, suggesting that cross-talker generalization is limited. The current study provides practical suggestions for effective communication between native and nonnative speakers: visual information is useful, and it is more useful in some circumstances than others.


1995 ◽  
Vol 74 (2) ◽  
pp. 698-712 ◽  
Author(s):  
D. L. Robinson ◽  
E. M. Bowman ◽  
C. Kertzman

1. To understand some of the contributions of parietal cortex to the dynamics of visual spatial attention, we recorded from cortical cells of monkeys performing attentional tasks. We studied 484 neurons in the intraparietal sulcus and adjacent gyral tissue of two monkeys. We measured phasic responses to peripheral visual stimuli while the monkeys attended toward or away from the stimuli or when attention was not controlled. Neurons were tested while the monkeys gazed at a spot of light (simple fixation task), actively attended to a foveal target (foveal attention task), performed a reaction time task (cued reaction time task), made saccadic eye movements to visual targets (saccade task), or responded to a repetitious peripheral target (probability task). 2. In a previous paper we demonstrated that monkeys, like humans, responded more quickly to visual targets when the targets followed briefly flashed visual cues (validly cued targets) (Bowman et al. 1993). It has been hypothesized that the cue attracts attention to its locus and results in faster reaction times (Posner 1980). In the present physiological studies, visual cues consistently excited these neurons when they were flashed in the receptive field. Such activity might signal a shift of attention. Visual targets that fell within the receptive field and that immediately followed the cue evoked relatively weak responses. This response was due to a relative refractory period. 3. Next we tested attentional processes in these tasks that were independent of the visual response to the cue. We placed the cue outside of the receptive field and the target within the receptive field. We found that 23% of these cells had a significant decrease in their firing rate to validly cued targets in their receptive fields under these conditions. Strong responses were evoked by the same target when the cue was flashed in the opposite hemifield (invalidly cued targets). Thus this group of neurons responded best when attention was directed toward the opposite hemifield. 4. For another group of parietal cells (13%) there was an enhanced response to targets in the visual receptive field when the cue was in the same hemifield. For the remaining 64% of the cells there was no significant modulation in this task. 5. The cued reaction time task involved exogenous control of attention; the sensory cue gave spatial and temporal direction to attention. We used several other tasks to test for endogenous control of attention.(ABSTRACT TRUNCATED AT 400 WORDS)


2021 ◽  
Vol 2 ◽  
Author(s):  
Thirsa Huisman ◽  
Axel Ahrens ◽  
Ewen MacDonald

To reproduce realistic audio-visual scenarios in the laboratory, Ambisonics is often used to reproduce a sound field over loudspeakers, and virtual reality (VR) glasses are used to present visual information. Both technologies have been shown to be suitable for research. However, the combination of the two, Ambisonics and VR glasses, might affect the spatial cues for auditory localization and thus the localization percept. Here, we investigated how VR glasses affect the localization of virtual sound sources on the horizontal plane produced using 1st-, 3rd-, 5th-, or 11th-order Ambisonics, with and without visual information. Results showed that with 1st-order Ambisonics the localization error was larger than with the higher orders, while the differences across the higher orders were small. The physical presence of the VR glasses without visual information increased the perceived lateralization of the auditory stimuli by about 2° on average, especially in the right hemisphere. Presenting visual information about the environment and potential sound sources reduced this shift induced by the head-mounted display (HMD); however, it could not fully compensate for it. While localization performance itself was affected by the Ambisonics order, there was no interaction between the Ambisonics order and the effect of the HMD. Thus, the presence of VR glasses can alter acoustic localization when using Ambisonics sound reproduction, but visual information can compensate for most of the effects. As such, most use cases for VR will be unaffected by these shifts in the perceived location of the auditory stimuli.
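As a rough intuition for why 1st-order reproduction yields larger localization errors, the following sketch is a simplified, horizontal-only illustration and not the reproduction system used in the study; the 24-loudspeaker ring, the 30° source azimuth, and the basic projection decoder are assumptions, and normalization conventions (e.g., N2D/SN2D) are omitted. It encodes a plane wave into circular harmonics up to a given order and shows how the main lobe of the resulting loudspeaker panning function narrows as the order increases.

```python
import numpy as np

def encode_2d(azimuth_rad, order):
    """Horizontal-only (circular-harmonic) encoding of a plane wave:
    [1, cos(az), sin(az), cos(2*az), sin(2*az), ...]; normalization
    conventions are omitted for simplicity."""
    coeffs = [1.0]
    for m in range(1, order + 1):
        coeffs += [np.cos(m * azimuth_rad), np.sin(m * azimuth_rad)]
    return np.array(coeffs)

def panning_function(source_az, speaker_az, order):
    """Basic projection (sampling) decoder: each loudspeaker gain is the
    inner product of its encoding vector with the source's encoding vector.
    The main lobe of this function narrows as the order increases,
    reflecting the finer spatial resolution of higher-order reproduction."""
    src = encode_2d(source_az, order)
    gains = np.array([encode_2d(az, order) @ src for az in speaker_az])
    return gains / np.max(np.abs(gains))

# Hypothetical setup: 24 loudspeakers on a horizontal ring, source at 30 deg.
speakers = np.deg2rad(np.arange(0, 360, 15))
for order in (1, 3, 5, 11):
    gains = panning_function(np.deg2rad(30.0), speakers, order)
    active = int(np.sum(np.abs(gains) > 0.5))   # crude measure of lobe width
    print(f"order {order:2d}: {active:2d} loudspeakers above half the peak gain")
```

With this toy decoder, the energy spread across loudspeakers shrinks markedly from 1st to higher orders, which is one way to picture why localization blur is largest at 1st order.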


2017 ◽  
Vol 61 (7) ◽  
pp. 672-687 ◽  
Author(s):  
Ayellet Pelled ◽  
Tanya Zilberstein ◽  
Alona Tsirulnikov ◽  
Eran Pick ◽  
Yael Patkin ◽  
...  

The existing literature presents mixed evidence regarding the significance of visual cues, as opposed to textual cues, in the process of impression formation. While visual information may have a strong effect due to its vividness and immediate absorption, textual information might be more powerful due to its solid, unambiguous nature. This debate is particularly relevant in the context of online social networks, whose users share both textual and visual elements. To explore our main research question, “Which elements of one’s Facebook profile have a more significant influence on impression formation of extroversion—pictures or texts?” we conducted two complementary online experiments, manipulating visual and textual cues inside and outside the context of Facebook. We then attempted to identify the relevant underlying mechanisms in impression formation. Our findings indicate that textual cues play a more dominant role online, whether on Facebook or not, supporting assertions of a new-media literacy that is text-based. Additionally, we found that the participants’ level of need for cognition influenced the effect, such that individuals with a high need for cognition placed more emphasis on textual cues. The number of “likes” was also a significant predictor of perceptions of the individuals’ social orientation, especially when the other cues were ambiguous.


2018 ◽  
Vol 5 (2) ◽  
pp. 171785 ◽  
Author(s):  
Martin F. Strube-Bloss ◽  
Wolfgang Rössler

Flowers attract pollinating insects such as honeybees with sophisticated compositions of olfactory and visual cues. Using honeybees as a model to study olfactory–visual integration at the neuronal level, we focused on mushroom body (MB) output neurons (MBONs). From a neuronal circuit perspective, MBONs represent a prominent level of sensory-modality convergence in the insect brain. We established an experimental design allowing electrophysiological characterization of olfactory, visual, as well as olfactory–visual induced activation of individual MBONs. Despite the obvious convergence of olfactory and visual pathways in the MB, we found numerous unimodal MBONs. However, a substantial proportion of MBONs (32%) responded to both modalities and thus integrated olfactory–visual information across MB input layers. In these neurons, the representation of the olfactory–visual compound was significantly increased compared with that of the single components, suggesting an additive but nonlinear integration. Population analyses of olfactory–visual MBONs revealed three response categories: to (i) olfactory, (ii) visual, and (iii) olfactory–visual compound stimuli. Interestingly, no significant differentiation was apparent regarding different stimulus qualities within these categories. We conclude that encoding of stimulus quality within a modality is largely completed at the level of the MB input, and information at the MB output is integrated across modalities to efficiently categorize sensory information for downstream behavioural decision processing.


Neurology ◽  
2018 ◽  
Vol 90 (11) ◽  
pp. e977-e984 ◽  
Author(s):  
Motoyasu Honma ◽  
Yuri Masaoka ◽  
Takeshi Kuroda ◽  
Akinori Futamura ◽  
Azusa Shiromaru ◽  
...  

Objective: To determine whether Parkinson disease (PD) affects cross-modal function of vision and olfaction, given that PD is known to impair various cognitive functions, including olfaction. Methods: We conducted behavioral experiments to identify the influence of PD on cross-modal function by contrasting patient performance with that of age-matched normal controls (NCs). We tested for visual effects on the strength and preference of odors by manipulating semantic connections between picture/odorant pairs. In addition, we used brain imaging to identify the role of striatal presynaptic dopamine transporter (DaT) deficits. Results: Odor evaluation in participants with PD was unaffected by visual information, whereas NCs overestimated smell when sniffing an odorless liquid while viewing pleasant or unpleasant visual cues. Furthermore, DaT deficit in the striatum, in the posterior putamen in particular, correlated with the reduced visual effects in participants with PD. Conclusions: These findings suggest that PD impairs cross-modal function of vision and olfaction as a result of a posterior putamen deficit. This cross-modal dysfunction may serve as the basis of a novel precursor assessment of PD.

