Perceptual Category Learning
Recently Published Documents

TOTAL DOCUMENTS: 48 (five years: 7)
H-INDEX: 13 (five years: 1)

2022 ◽  
Author(s):  
Roland Pusch ◽  
Julian Packheiser ◽  
Charlotte Koenen ◽  
Fabrizio Iovine ◽  
Onur Güntürkün

Abstract
Pigeons are classic model animals for studying perceptual category learning. To achieve a deeper understanding of the cognitive mechanisms of categorization, careful consideration of the employed stimulus material and a thorough analysis of choice behavior are mandatory. In the present study, we combined "virtual phylogenesis", an evolutionary algorithm that generates artificial yet naturalistic stimuli termed digital embryos, with a machine learning analysis of the pigeons' pecking responses to gain insight into the animals' underlying categorization strategies. In a forced-choice procedure, pigeons learned to categorize these stimuli and transferred their knowledge successfully to novel exemplars. We used peck tracking to identify where on the stimulus the animals pecked and investigated whether this behavior was indicative of the pigeons' choice. Going beyond the classical analysis of the binary choice, we were able to predict the presented stimulus class from pecking location using a k-nearest-neighbor classifier, indicating that pecks are related to features of interest. By analyzing error trials with this approach, we further identified potential strategies the pigeons used to discriminate between stimulus classes. These strategies remained stable during category transfer but differed between individuals, indicating that categorization learning is not limited to a single learning strategy.
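The classifier described above can be sketched in miniature. This is a minimal, hypothetical illustration, not the authors' pipeline: the peck coordinates and cluster locations below are invented, and the abstract does not report which k or distance metric was used, so a plain Euclidean majority-vote kNN is assumed.

```python
import math
import random
from collections import Counter

def knn_predict(train, query, k=5):
    """Classify a peck location by majority vote among the k nearest
    training pecks (Euclidean distance in screen coordinates)."""
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Invented synthetic data: if pecks aimed at class-specific features
# cluster in different screen regions, location predicts the class.
random.seed(0)
train = [((random.gauss(100, 15), random.gauss(200, 15)), "A") for _ in range(50)]
train += [((random.gauss(250, 15), random.gauss(120, 15)), "B") for _ in range(50)]

print(knn_predict(train, (105, 195)))  # prints "A" (near the A cluster)
print(knn_predict(train, (245, 125)))  # prints "B" (near the B cluster)
```

The same logic applied to error trials would reveal which feature region a bird was attending to when it chose incorrectly, which is how the abstract's strategy analysis can be understood.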


2021 ◽  
Author(s):  
Roland Pusch ◽  
Julian Packheiser ◽  
Charlotte Koenen ◽  
Fabrizio Iovine ◽  
Onur Güntürkün

Pigeons are classic model animals for studying perceptual category learning. A theoretical understanding of the cognitive mechanisms of categorization requires careful consideration of the employed stimulus material. Optimally, stimuli should not consist of real-world objects that might be associated with prior experience. The stimulus set should offer a theoretically infinite number of exemplars that are easy to produce. In addition, the experimenter should have the freedom to produce 2D and 3D versions of the stimuli, and, finally, the stimulus set should provide the opportunity to identify the diagnostic elements that the animals use. To this end, we used the approach of "virtual phylogenesis" of "digital embryos" to produce two stimulus sets of objects that meet these criteria. In our experiment, pigeons learned to categorize these stimuli in a forced-choice procedure. In addition, we used peck tracking to identify where on the stimulus the animals pecked to signal their choice. Pigeons learned the task and transferred successfully to novel exemplars. Using a k-nearest-neighbor classifier, we were able to predict the presented stimulus class from pecking location, indicating that pecks are related to features of interest. Through this approach we further identified potential strategies of the pigeons, namely that they learned either one or both categories to discriminate between stimulus classes. These strategies remained stable during category transfer but differed between individuals, indicating that categorization learning is not limited to a single learning strategy.


2020 ◽  
Author(s):  
Casey L Roark ◽  
Giorgio Paulon ◽  
Abhra Sarkar ◽  
Bharath Chandrasekaran

Category learning is a fundamental process in human cognition that spans the senses. However, much remains unknown about the mechanisms supporting learning in different modalities. In the current study, we directly compared auditory and visual category learning in the same individuals. Thirty participants (22 F; 18-32 years old) completed two unidimensional rule-based category learning tasks in a single day, one with auditory stimuli and another with visual stimuli. We replicated the results in a second experiment with a larger online sample (N = 99, 45 F, 18-35 years old). The categories were identically structured in the two modalities to facilitate comparison. We compared categorization accuracy, decision processes as assessed through drift-diffusion models, and the generalizability of the resulting category representations through a generalization test. We found that individuals learned auditory and visual categories to similar extents and that accuracies were highly correlated across the two tasks. Participants had similar evidence accumulation rates in later learning, but early on had slower rates for visual than for auditory learning. Participants also demonstrated differences in decision thresholds across modalities, and had more categorical, generalizable representations for visual than for auditory categories. These results suggest that modality-general cognitive processes support category learning, but that the modality of the stimuli can also affect learning behavior and outcomes.
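The drift-diffusion framework invoked above can be illustrated with a toy simulation. This is a generic two-boundary diffusion sketch with invented parameter values, not the hierarchical model fit in the study: on each trial, noisy evidence accumulates at a drift rate until it crosses an upper (correct) or lower (error) threshold.

```python
import random

def simulate_ddm(drift, threshold, noise=1.0, dt=0.001, max_t=3.0):
    """One drift-diffusion trial: evidence accumulates at `drift` per
    second plus Gaussian noise until it crosses +/- threshold.
    Returns (correct_choice, reaction_time_in_seconds)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + random.gauss(0, noise) * dt ** 0.5
        t += dt
    return x >= threshold, t

def accuracy(drift, threshold, n=500):
    """Proportion of correct trials over n simulated trials."""
    return sum(simulate_ddm(drift, threshold)[0] for _ in range(n)) / n

# Illustrative parameter values only: a higher drift rate (faster
# evidence accumulation, as reported for early auditory learning)
# yields higher accuracy at the same decision threshold.
random.seed(1)
print(accuracy(2.0, 1.0) > accuracy(0.5, 1.0))  # prints True
```

In this framework, the modality differences reported above map onto parameters: slower early visual learning corresponds to a lower drift rate, and threshold differences correspond to how much evidence a participant requires before responding.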


2019 ◽  
Vol 19 (6) ◽  
pp. 20 ◽  
Author(s):  
Luke A. Rosedahl ◽  
F. Gregory Ashby

2017 ◽  
Vol 89 ◽  
pp. 31-38 ◽  
Author(s):  
George Cantwell ◽  
Maximilian Riesenhuber ◽  
Jessica L. Roeder ◽  
F. Gregory Ashby
