Multimodal Influences on Learning Walks in Desert Ants (Cataglyphis fortis)

Author(s): Jose Adrian Vega Vermehren, Cornelia Buehlmann, Ana Sofia David Fernandes, Paul Graham

Abstract: Ants are excellent navigators, taking into account multimodal sensory information as they move through the world. To localise the nest accurately at the end of a foraging journey, visual cues, wind direction and olfactory cues all need to be learnt. Learning walks are performed at the start of an ant's foraging career or when the appearance of the nest surroundings has changed. We investigated here whether the structure of such learning walks in the desert ant Cataglyphis fortis takes into account wind direction in conjunction with the learning of new visual information. Ants learnt to travel back and forth between their nest and a feeder, and we then introduced a black cylinder near their nest to induce learning walks in regular foragers. By doing this across days with different prevailing wind directions, we were able to probe how ants balance the influence of different sensory modalities. We found that (i) the ants' outward headings are influenced by the direction of the wind, with their routes deflected such that they will arrive downwind of their nest when homing, (ii) a novel object along the route induces learning walks in experienced ants and (iii) the structure of learning walks is shaped by the wind direction rather than the position of the visual cue.

2018, Vol 5 (2), pp. 171785
Author(s): Martin F. Strube-Bloss, Wolfgang Rössler

Flowers attract pollinating insects like honeybees with sophisticated compositions of olfactory and visual cues. Using honeybees as a model to study olfactory–visual integration at the neuronal level, we focused on mushroom body (MB) output neurons (MBON). From a neuronal circuit perspective, MBONs represent a prominent level of sensory-modality convergence in the insect brain. We established an experimental design allowing electrophysiological characterization of olfactory, visual and olfactory–visual induced activation of individual MBONs. Despite the obvious convergence of olfactory and visual pathways in the MB, we found numerous unimodal MBONs. However, a substantial proportion of MBONs (32%) responded to both modalities and thus integrated olfactory–visual information across MB input layers. In these neurons, representation of the olfactory–visual compound was significantly increased compared with that of single components, suggesting an additive but nonlinear integration. Population analyses of olfactory–visual MBONs revealed three stimulus categories: (i) olfactory, (ii) visual and (iii) olfactory–visual compound stimuli. Interestingly, no significant differentiation was apparent regarding different stimulus qualities within these categories. We conclude that encoding of stimulus quality within a modality is largely completed at the level of MB input, and information at the MB output is integrated across modalities to efficiently categorize sensory information for downstream behavioural decision processing.


2000, Vol 203 (7), pp. 1113-1121
Author(s): B. Ronacher, K. Gallizzi, S. Wohlgemuth, R. Wehner

The present account answers the question of whether desert ants (Cataglyphis fortis) gauge the distance they have travelled by using self-induced lateral optic-flow parameters, as has been described for bees. The ants were trained to run to a distant food source within a channel whose walls were covered with black-and-white gratings. From the food source, they were transferred to test channels of double or half the training width, and the distance they travelled before searching for home and their walking speeds were recorded. Since the animals experienced different motion parallax cues when walking in the broader or narrower channels, the optic-flow hypothesis predicted that the ants would walk faster and further in the broader channels, but more slowly and less far in the narrower channels. In contrast to this expectation, neither the walking speeds nor the searching distances depended on the width or height of the channels or on the pattern wavelengths. Even when ventral-field visual cues were excluded by covering the eyes with light-tight paint, the ants were not influenced by lateral optic flow-field cues. Hence, unlike flying honeybees, walking desert ants do not depend on self-induced visual flow-field cues in gauging the distance they have travelled, but can measure locomotor distance exclusively by idiothetic means.
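To spell out the optic-flow prediction tested above (a sketch in our own notation, not the authors'): a wall grating at lateral distance w from an ant walking at speed v drifts across the eye at an angular rate of roughly v/w, so the flow accumulated over a run of length d scales as d/w. If travelled distance were read out from this accumulated flow, ants trained at width w and tested at width 2w should walk roughly twice as far before searching, and about half as far at width w/2; the recorded search distances showed neither effect.

```latex
\[
  \Phi \;\propto\; \int_{0}^{T} \frac{v(t)}{w}\,\mathrm{d}t \;=\; \frac{d}{w},
  \qquad d \;=\; \int_{0}^{T} v(t)\,\mathrm{d}t .
\]
```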


2019, Vol 16 (154), pp. 20180903
Author(s): Edward D. Lee, Edward Esposito, Itai Cohen

Swing in a crew boat, a good jazz riff, a fluid conversation: these tasks require extracting sensory information about how others flow in order to mimic and respond. To determine what factors influence coordination, we build an environment to manipulate incoming sensory information by combining virtual reality and motion capture. We study how people mirror the motion of a human avatar’s arm as we occlude the avatar. We efficiently map the transition from successful mirroring to failure using Gaussian process regression. Then, we determine the change in behaviour when we introduce audio cues with a frequency proportional to the speed of the avatar’s hand or train individuals with a practice session. Remarkably, audio cues extend the range of successful mirroring to regimes where visual information is sparse. Such cues could facilitate joint coordination when navigating visually occluded environments, improve reaction speed in human–computer interfaces or measure altered physiological states and disease.
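As a rough illustration of the mapping step described above, the sketch below fits a Gaussian process regression to hypothetical trial data relating an occlusion parameter to mirroring success and reads off where predicted success drops through 50%. The occlusion variable, success scores, kernel and threshold are our own assumptions, not the authors' pipeline.

```python
# Minimal sketch (not the authors' code): Gaussian process regression used to
# map how mirroring success falls off as a visual occlusion parameter grows.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical trials: fraction of the avatar occluded vs. mirroring success (0-1).
occlusion = rng.uniform(0.0, 1.0, size=40).reshape(-1, 1)
success = 1.0 / (1.0 + np.exp(15.0 * (occlusion.ravel() - 0.6)))  # synthetic "true" curve
success = success + rng.normal(0.0, 0.05, size=success.shape)     # measurement noise

# Smooth GP model of success as a function of occlusion.
kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(occlusion, success)

# Query a fine grid and locate where predicted success crosses 50%:
# a simple estimate of the mirroring-to-failure transition.
grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
transition = grid[np.argmin(np.abs(mean - 0.5)), 0]
print(f"estimated transition near occlusion = {transition:.2f}")
```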


2020
Author(s): Nicola Meda, Giulio M. Menti, Aram Megighian, Mauro A. Zordan

Abstract: Animals rely on multiple sources of sensory information to make decisions. The integration of information stemming from these systems is believed to result in a precise behavioural output. To what degree a single sensory system may override the others is unknown, and evidence for a hierarchical use of different systems to guide navigation is lacking. We used Drosophila melanogaster to investigate whether, in order to relieve an unpleasant stimulation, fruit flies employ an idiothetically based local search strategy before making use of visual information, or vice versa. Fruit flies appear to resort initially to idiothetic information and only later, if the first strategy proves unsuccessful in relieving the unpleasant stimulation, make use of other information, such as visual cues. By leveraging this innate preference for a hierarchical use of one strategy over another, we believe that in vivo recordings of brain activity during the navigation of fruit flies could provide mechanistic insights into how simultaneous information from multiple sensory modalities is evaluated and integrated and how motor responses are elicited, thus shedding new light on the neural basis of decision-making.


2021, pp. jeb.241968
Author(s): Te K. Jones, Cynthia F. Moss

Studies have shown that bats are capable of using visual information for a variety of purposes, including navigation and foraging, but the relative contributions of visual and auditory modalities to obstacle avoidance have yet to be fully investigated, particularly in laryngeal echolocating bats. A first step requires a characterization of behavioral responses to different combinations of sensory cues. Here we quantify the behavioral responses of the insectivorous big brown bat, Eptesicus fuscus, in an obstacle avoidance task offering different combinations of auditory and visual cues. To do so, we utilize a new method that eliminates the confounds typically associated with testing bat vision and precludes auditory cues. We find that the presence of visual and auditory cues together enhances bats' avoidance response to obstacles compared with cues available through either vision or audition alone. Flight and echolocation behaviors, such as speed and call rate, did not vary significantly under the different obstacle conditions and thus are not informative indicators of a bat's response to obstacle stimulus type. These findings advance our understanding of the relative importance of visual and auditory sensory modalities in guiding obstacle avoidance behaviors.


2000, Vol 84 (6), pp. 2984-2997
Author(s): Per Jenmalm, Seth Dahlstedt, Roland S. Johansson

Most objects that we manipulate have curved surfaces. We analyzed how subjects, during a prototypical manipulatory task, use visual and tactile sensory information to adapt fingertip actions to changes in object curvature. Subjects grasped an elongated object at one end using a precision grip and lifted it while instructed to keep it level. The principal load of the grasp was tangential torque due to the location of the center of mass of the object in relation to the horizontal grip axis joining the centers of the opposing grasp surfaces. The curvature strongly influenced the grip forces required to prevent rotational slips. Likewise, the curvature influenced the rotational yield of the grasp that developed under the tangential torque load due to the viscoelastic properties of the fingertip pulps. Subjects scaled the grip forces parametrically with object curvature for grasp stability. Moreover, in a curvature-dependent manner, subjects twisted the grasp around the grip axis by a radial flexion of the wrist to keep the desired object orientation despite the rotational yield. To adapt these fingertip actions to object curvature, subjects could use both vision and tactile sensibility integrated with predictive control. During combined blindfolding and digital anesthesia, however, the motor output failed to predict the consequences of the prevailing curvature. Subjects used vision to identify the curvature for efficient feedforward retrieval of grip force requirements before executing the motor commands. Digital anesthesia caused little impairment of grip force control when subjects had vision available, but the adaptation of the twist became delayed. Visual cues about the form of the grasp surface obtained before contact were used to scale the grip force, whereas the scaling of the twist depended on visual cues related to object movement. Thus, subjects apparently relied on different visuomotor mechanisms for adaptation of grip force and grasp kinematics. In contrast, blindfolded subjects used tactile cues about the prevailing curvature obtained after contact with the object for feedforward adaptation of both grip force and twist. We conclude that humans use both vision and tactile sensibility for feedforward parametric adaptation of grip forces and grasp kinematics to object curvature. Normal control of the twist action, however, requires digital afferent input, and different visuomotor mechanisms support the control of the grasp twist and the grip force. This differential use of vision may have a bearing on the two-stream model of human visual processing.
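For concreteness, the principal load described above can be written in our own notation (not taken from the paper): for an object of mass m held level with its centre of mass a horizontal distance d from the grip axis, the tangential torque the grasp must resist is

```latex
\[
  \tau \;=\; m\,g\,d ,
\]
```

and the grip force needed to keep this torque below the rotational slip limit depends on how much torsional friction the fingertip contacts can generate, which is where surface curvature enters.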


Behaviour, 1979, Vol 70 (1-2), pp. 1-116
Author(s): I. Bossema

Abstract: The European jay (Garrulus g. glandarius) strongly depends on acorns for food. Many acorns are hoarded, enabling the jay to feed upon them at times of the year in which they would otherwise be unavailable. Many of the hoarded acorns germinate and become seedlings, so that jays play an important role in the dispersal of acorns and the reproduction of oaks (in this study: Quercus robur, the pedunculate oak). These mutual relationships were analysed both with wild jays in the field (province of Drente, The Netherlands) and with tame birds in confinement. Variation in the composition of the food throughout the year is described quantitatively. Acorns were the stock diet of adults in most months of the year. Leaf-eating caterpillars predominantly occurring on oak were the main food items of nestlings. Acorns formed the bulk of the food of fledglings in June. A high rate of acorn consumption in winter, spring and early summer becomes possible because individual jays hoard several thousands of acorns, mainly in October. In experiments, acorns of pedunculate oak were not preferred over equal-sized acorns of sessile oak (which was not found in the study area). Acorns of pedunculate oak were strongly preferred over those of American oak and nuts of hazel and beech. Among acorns of pedunculate oak, ripe, sound, long-slim and big ones were preferred. Jays collect one or more (up to six) acorns per hoarding trip. In the latter case, the first ones are swallowed and the last one is usually carried in the bill. For swallowing, the dimensions of the beak imposed a limit on size preference; for bill transport, usually the biggest acorn was selected. The greater the number of acorns per trip, the longer the transportation distance during hoarding. From trip to trip, jays dispersed their acorns widely, and when several acorns were transported during one trip, these were generally buried at different sites. Burial took place by pushing acorns into the soil and by subsequent hammering and covering. Jays often selected rather open sites, transitions in the vegetation, and vertical structures such as saplings and tree trunks for the burial of acorns. In captivity, jays also hoarded surplus food. Here, spacing out of burials was also observed, with previously used sites usually being avoided. In addition, hiding along substrate edges and near conspicuous objects was observed. Jays tended to hide near sticks presented in a horizontal position rather than near identical ones in a vertical position, especially when the colour of the sticks contrasted with the colour of the substrate. Also, rough-surfaced substrate was strongly preferred over similar but smooth-surfaced substrate. Successful retrieval of and feeding on hoarded acorns were observed in winter even when snow cover had considerably altered the scenery. No evidence was obtained that acorns could be traced back by smell. Many indications were obtained that visual information from near and far beacons, memorized during hiding, was used in finding acorns. The use of beacons by captive jays was also studied. Experiments led to the conclusion that vertical beacons are more important to retrieving birds than identical horizontal ones. The discrepancy with the jay's preference for horizontal structures during hiding is discussed. Most seedlings emerge in May and June. The distribution pattern of seedlings and bill prints on the shells of their acorns indicated that many seedlings emerged from acorns hidden by jays in the previous autumn.
The cotyledons of these plants remain underground and are in excellent condition in spring and early summer. Jays exploited acorns by pulling at the stem of seedlings and then removing the cotyledons. This did not usually damage the plants severely. Jays can find acorns in this situation partly because they remember where they buried them. In addition, it was shown that jays select seedlings of oak rather than ones of other species, and that they preferentially inspected those seedlings that were most profitable in terms of cotyledon yield and quality. Experiments uncovered some of the visual cues used in this discrimination. The effects of hoarding on the preservation of acorns were examined in the field and the laboratory. Being buried reduced the chance that acorns were robbed by conspecifics and other acorn feeders. Scatter hoarding did not lead to better protection of buried acorns than larder hoarding, but the spread of risk was better in the former than the latter. It was concluded that the way in which jays hoard acorns increases the chance that they can exploit them later. In addition, the condition of acorns is better preserved by being buried. An analysis was made of the consequences of the jay's behaviour for oaks. The oak does incur certain costs: some of its acorns are eaten by jays during the dispersal and storage phase, and some seedlings are damaged as a consequence of cotyledon removal. However, these costs are outweighed by the benefits the oak receives. Many of its most viable acorns are widely dispersed and buried at sites where the prospects for further development into mature oak are highly favourable. The adaptiveness of the characters involved in preferential feeding on and hoarding of acorns by jays is discussed in relation to several environmental pressures: competition with allied species; food fluctuations in the jay's niche; and food competitors better equipped to break up hard "dry" fruits. Conversely, jays exert several selective pressures which are likely to have evolutionary consequences for oaks, such as the selection of long-slim and large acorns with tight shells. In addition, oak seedlings with a long tap root and tough stem are selected for. Although factors other than mutual selective pressures between the two may have affected the present-day fit between jays and oaks, it is concluded that several characters of jays and oaks can be considered as co-adapted features of a symbiotic relationship.


2021, pp. 002205742110319
Author(s): Sandra Levey

This review presents the Universal Design for Learning (UDL) approach to education. Classrooms have become increasingly diverse, with second language learners, students with disabilities, and students who differ in how they perceive and understand information. Some students learn best through listening, while others learn best when presented with visual information. Given the increased number of new language learners across the world, the UDL approach supports successful learning for all students and has allowed students to acquire information more effectively. UDL provides guidance to educators that is especially valuable given the diversity of classrooms and of learning modalities.


2018, Vol 40 (1), pp. 93-109
Author(s): Yi Zheng, Arthur G. Samuel

Abstract: It has been documented that lipreading facilitates the understanding of difficult speech, such as noisy speech and time-compressed speech. However, relatively little work has addressed the role of visual information in perceiving accented speech, another type of difficult speech. In this study, we focus specifically on accented word recognition. One hundred forty-two native English speakers made lexical decision judgments on English words or nonwords produced by speakers with Mandarin Chinese accents. The stimuli were presented either as videos of a relatively distant speaker or as videos in which we zoomed in on the speaker's head. Consistent with studies of degraded speech, listeners were more accurate at recognizing accented words when they saw lip movements from the closer apparent distance. The effect of apparent distance tended to be larger under nonoptimal conditions: when stimuli were nonwords rather than words, and when stimuli were produced by a speaker with a relatively strong accent. However, we did not find any influence of listeners' prior experience with Chinese-accented speech, suggesting that cross-talker generalization is limited. The current study provides practical suggestions for effective communication between native and nonnative speakers: visual information is useful, and it is more useful in some circumstances than others.


2003, Vol 89 (1), pp. 390-400
Author(s): L. H. Zupan, D. M. Merfeld

Sensory systems often provide ambiguous information. For example, otolith organs measure gravito-inertial force (GIF), the sum of gravitational force and inertial force due to linear acceleration. However, according to Einstein's equivalence principle, a change in gravitational force due to tilt is indistinguishable from a change in inertial force due to translation. Therefore the central nervous system (CNS) must use other sensory cues to distinguish tilt from translation. For example, the CNS might use dynamic visual cues indicating rotation to help determine the orientation of gravity (tilt). This, in turn, might influence the neural processes that estimate linear acceleration, since the CNS might estimate gravity and linear acceleration such that the difference between these estimates matches the measured GIF. Depending on specific sensory information inflow, inaccurate estimates of gravity and linear acceleration can occur. Specifically, we predict that illusory tilt caused by roll optokinetic cues should lead to a horizontal vestibuloocular reflex compensatory for an interaural estimate of linear acceleration, even in the absence of actual linear acceleration. To investigate these predictions, we measured eye movements binocularly using infrared video methods in 17 subjects during and after optokinetic stimulation about the subject's nasooccipital (roll) axis (60°/s, clockwise or counterclockwise). The optokinetic stimulation was applied for 60 s followed by 30 s in darkness. We simultaneously measured subjective roll tilt using a somatosensory bar. Each subject was tested in three different orientations: upright, pitched forward 10°, and pitched backward 10°. Five subjects reported significant subjective roll tilt (>10°) in directions consistent with the direction of the optokinetic stimulation. In addition to torsional optokinetic nystagmus and afternystagmus, we measured a horizontal nystagmus to the right during and following clockwise (CW) stimulation and to the left during and following counterclockwise (CCW) stimulation. These measurements match predictions that subjective tilt in the absence of real tilt should induce a nonzero estimate of interaural linear acceleration and, therefore, a horizontal eye response. Furthermore, as predicted, the horizontal response in the dark was larger for Tilters (n = 5) than for Non-Tilters (n = 12).
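A minimal sketch of the ambiguity and of the estimation constraint described above, in our own notation (f for the measured GIF, g for gravity, a for linear acceleration; the inertial force per unit mass is -a):

```latex
\[
  \vec{f} \;=\; \vec{g} - \vec{a}
  \qquad\text{(measured gravito-inertial force)},
\]
\[
  \hat{\vec{g}} - \hat{\vec{a}} \;=\; \vec{f}
  \qquad\text{(central estimates constrained to match the measurement)}.
\]
```

Under this constraint, an optokinetically induced (illusory) roll of the gravity estimate forces a nonzero interaural acceleration estimate even when no translation occurs, which is the horizontal eye-movement prediction the study tests.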

