Absence of cross-modality analogical transfer in perceptual categorization

2020, pp. 3-13
Author(s): C.E.R. Edmunds, A.B. Inkster, P.M. Jones, F. Milton, A.J. Wills

Analogical transfer has been previously reported to occur between rule-based, but not information-integration, perceptual category structures (Casale, Roeder, & Ashby, 2012). The current study investigated whether a similar pattern of results would be observed in cross-modality transfer. Participants were trained on either a rule-based structure, or an information-integration structure, using visual stimuli. They were then tested on auditory stimuli that had the same underlying abstract category structure. Transfer performance was assessed relative to a control group who did not receive training on the visual stimuli. No cross-modality transfer was found, irrespective of the category structure employed.
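
For readers unfamiliar with the distinction, the sketch below illustrates how the two abstract category structures are typically defined over a two-dimensional stimulus space; the dimension ranges and boundary placements are assumptions for demonstration, not the stimulus parameters used in the study.

```python
# Illustrative sketch of the two abstract category structures; the dimension
# values and boundary placements are assumptions for demonstration, not the
# authors' stimulus parameters.
import numpy as np

rng = np.random.default_rng(0)

def rule_based(n=100):
    """Category depends on a single dimension (x1 > 0.5): easy to verbalize."""
    x = rng.uniform(0.0, 1.0, size=(n, 2))
    labels = (x[:, 0] > 0.5).astype(int)
    return x, labels

def information_integration(n=100):
    """Category depends on combining both dimensions (x1 > x2, a diagonal
    boundary): accurate responding requires integrating the dimensions, and
    the rule is hard to state verbally."""
    x = rng.uniform(0.0, 1.0, size=(n, 2))
    labels = (x[:, 0] > x[:, 1]).astype(int)
    return x, labels

if __name__ == "__main__":
    for name, make in [("rule-based", rule_based),
                       ("information-integration", information_integration)]:
        stimuli, labels = make(8)
        print(name, labels)
```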


2008, Vol 19 (11), pp. 1169-1177
Author(s): Brian J. Spiering, F. Gregory Ashby

Previous research has disagreed about whether a difficult cognitive skill is best learned by beginning with easy or difficult examples. Two experiments that clarify this debate are reported. Participants in both experiments received one of three types of training on a difficult perceptual categorization task. In one condition, participants began with easy examples, then moved to examples of intermediate difficulty, and finished with the most difficult examples. In a second condition, this order was reversed, and in a third condition, participants saw examples in a random order. The results depended on the type of categories that participants were learning. When the categories could be learned via explicit reasoning (a rule-based task), the three training procedures were equally effective. However, when the categorization rule was difficult to describe verbally (an information-integration task), participants who began with the most difficult items performed much better than participants in the other two conditions.
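
To make the training-order manipulation concrete, here is a minimal sketch, assuming an information-integration structure and using distance from the category boundary as a difficulty proxy; it is an illustration, not the authors' materials or procedure.

```python
# Sketch of constructing easy-to-hard, hard-to-easy, and random training orders;
# the stimulus space and boundary are assumed for illustration, not taken from
# the reported experiments.
import numpy as np

rng = np.random.default_rng(1)

# Illustrative information-integration structure: diagonal boundary x1 = x2.
stimuli = rng.uniform(0.0, 1.0, size=(60, 2))
labels = (stimuli[:, 0] > stimuli[:, 1]).astype(int)

# Perpendicular distance from the boundary as a difficulty proxy: far = easy.
distance = np.abs(stimuli[:, 0] - stimuli[:, 1]) / np.sqrt(2)

orders = {
    "easy_to_hard": np.argsort(-distance),    # largest distance first
    "hard_to_easy": np.argsort(distance),     # smallest distance first
    "random": rng.permutation(len(stimuli)),  # shuffled control
}

for name, order in orders.items():
    print(name, "first five boundary distances:",
          np.round(distance[order][:5], 3))
```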


Animals, 2021, Vol 11 (8), pp. 2233
Author(s): Loïc Pougnault, Hugo Cousillas, Christine Heyraud, Ludwig Huber, Martine Hausberger, ...

Attention is defined as the ability to selectively process one aspect of the environment over others and is at the core of all cognitive processes such as learning, memorization, and categorization. Thus, evaluating and comparing attentional characteristics between individuals and across situations is an important aspect of cognitive studies. Recent studies have shown the value of analyzing spontaneous attention in standardized situations, but data are still scarce, especially for songbirds. The present study adapted three tests of attention (towards visual non-social, visual social, and auditory stimuli) as tools for future comparative research in the European starling (Sturnus vulgaris), a species well known to show individual variation in social learning and engagement. Our results reveal that attentional characteristics (glances versus gazes) vary with the stimulus broadcast: more gazes towards unusual visual stimuli and species-specific auditory stimuli, and more glances towards species-specific visual stimuli and hetero-specific auditory stimuli. By revealing individual variations, this study shows that these tests constitute a very useful and easy-to-use tool for evaluating spontaneous individual attentional characteristics and their modulation by a variety of factors. Our results also indicate that attentional skills are not a uniform concept and depend upon the modality and the stimulus type.


1954, Vol 100 (419), pp. 462-477
Author(s): K. R. L. Hall, E. Stride

A number of studies on reaction time (R.T.) latency to visual and auditory stimuli in psychotic patients have been reported since the first investigations on the personal equation were carried out. The general trends of the work up to 1943 are well summarized by Hunt (1944), while Granger's (1953) review of “Personality and visual perception” contains a summary of the studies on R.T. to visual stimuli.


2018, Vol 7, pp. 172-177
Author(s): Łukasz Tyburcy, Małgorzata Plechawska-Wójcik

The paper describes the results of a comparison of reaction times to visual and auditory stimuli using EEG evoked potentials. Two experiments were conducted: the first explored reaction times to a visual stimulus and the second to an auditory stimulus. Analysis of the data showed that visual stimuli evoke faster reactions than auditory stimuli.
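
As a minimal sketch of how such a behavioural comparison could be run, assuming simulated reaction times and a Welch t-test rather than the authors' EEG/ERP pipeline:

```python
# Simulated reaction times and a Welch t-test, as a minimal sketch of the
# visual-versus-auditory comparison; the means, spreads, and test choice are
# assumptions for illustration, not the authors' analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical reaction times in milliseconds (visual set faster, matching the
# reported direction of the effect).
rt_visual = rng.normal(loc=350, scale=40, size=30)
rt_auditory = rng.normal(loc=380, scale=40, size=30)

# Welch's t-test does not assume equal variances across conditions.
t_stat, p_value = stats.ttest_ind(rt_visual, rt_auditory, equal_var=False)
print(f"mean visual RT = {rt_visual.mean():.1f} ms, "
      f"mean auditory RT = {rt_auditory.mean():.1f} ms")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```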


2016
Author(s): Drew Altschul, Greg Jensen, Herbert S Terrace

Humans are highly adept at categorizing visual stimuli, but studies of human categorization are typically validated by verbal reports. This makes it difficult to perform comparative studies of categorization using non-human animals. Interpretation of comparative studies is further complicated by the possibility that animal performance may merely reflect reinforcement learning, whereby discrete features act as discriminative cues for categorization. To assess and compare how humans and monkeys classified visual stimuli, we trained 7 rhesus macaques and 41 human volunteers to respond, in a specific order, to four simultaneously presented stimuli, each belonging to a different perceptual category. These exemplars were drawn at random from large banks of images, such that the stimuli presented changed on every trial. Subjects nevertheless identified and ordered these changing stimuli correctly. Three monkeys learned to order naturalistic photographs; four others, close-up sections of paintings with distinctive styles. Humans learned to order both types of stimuli. All subjects classified stimuli at levels substantially greater than that predicted by chance or by feature-driven learning alone, even when stimuli changed on every trial. However, humans more closely resembled monkeys when classifying the more abstract painting stimuli than the photographic stimuli. This points to a common classification strategy in both species, one that humans can rely on in the absence of linguistic labels for categories.
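
To see why chance performance is low in this task, here is a back-of-the-envelope calculation, assuming a trial counts as correct only when all four responses are made in the right order; this may not match the authors' exact chance model.

```python
# Back-of-the-envelope chance baseline for ordering four simultaneously
# presented stimuli; assumes a trial is scored correct only if every response
# is in the right position and no stimulus is chosen twice (an assumption, not
# necessarily the authors' chance model).
from math import factorial

n_items = 4
p_chance = 1 / factorial(n_items)  # one correct ordering out of 4! = 24
print(f"chance of a fully correct trial: {p_chance:.3%}")  # about 4.2%
```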


2010, Vol 48 (10), pp. 2998-3008
Author(s): W. Todd Maddox, Jennifer Pacheco, Maia Reeves, Bo Zhu, David M. Schnyer
