stimulus classes
Recently Published Documents

TOTAL DOCUMENTS: 62 (five years: 10)
H-INDEX: 21 (five years: 0)

2021, Vol 11 (1)
Author(s): Martin Arguin, Roxanne Ferrandez, Justine Massé

Abstract: It is increasingly apparent that functionally significant neural activity is oscillatory in nature. Demonstrating the implications of this mode of operation for perceptual/cognitive function remains somewhat elusive. This report describes the technique of random temporal sampling for the investigation of visual oscillatory mechanisms. The technique is applied in visual recognition experiments using different stimulus classes (words, familiar objects, novel objects, and faces). Classification images reveal variations of perceptual effectiveness according to the temporal features of stimulus visibility. These classification images are also decomposed into their power and phase spectra. Stimulus classes lead to distinct outcomes and the power spectra of classification images are highly generalizable across individuals. Moreover, stimulus class can be reliably decoded from the power spectrum of individual classification images. These findings and other aspects of the results validate random temporal sampling as a promising new method to study oscillatory visual mechanisms.
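The decomposition described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' analysis pipeline: the trial counts, profile shapes, and variable names are assumptions, and the classification image is estimated with a generic reverse-correlation contrast (correct minus error trials) before being decomposed with the FFT.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: per-trial temporal sampling profiles (stimulus
# visibility over time) and binary recognition accuracy. Shapes and
# names are illustrative, not taken from the published study.
n_trials, n_frames = 1000, 48
profiles = rng.random((n_trials, n_frames))
accuracy = rng.integers(0, 2, n_trials)

# Classification image: mean visibility profile on correct trials minus
# mean profile on error trials (a standard reverse-correlation estimate).
class_image = profiles[accuracy == 1].mean(0) - profiles[accuracy == 0].mean(0)

# Decompose the temporal classification image into power and phase spectra.
spectrum = np.fft.rfft(class_image)
power = np.abs(spectrum) ** 2
phase = np.angle(spectrum)

print(power.shape, phase.shape)  # one value per frequency up to Nyquist
```

The power spectra obtained this way (one per participant and stimulus class) would then feed the decoding analysis the abstract mentions.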


2021
Author(s): Adam F Osth, Simon Dennis

A powerful theoretical framework for exploring recognition memory is the global matching framework, in which a cue's memory strength reflects the similarity of the retrieval cues being matched against the contents of memory simultaneously. Contributions at retrieval can be categorized as matches and mismatches to the item and context cues, including the self match (match on item and context), item noise (match on context, mismatch on item), context noise (match on item, mismatch on context), and background noise (mismatch on item and context). We present a model that directly parameterizes the matches and mismatches to the item and context cues, which enables estimation of the magnitude of each interference contribution (item noise, context noise, and background noise). The model was fit within a hierarchical Bayesian framework to ten recognition memory datasets that employ manipulations of strength, list length, list strength, word frequency, study-test delay, and stimulus class in item and associative recognition. Estimates of the model parameters revealed at most a small contribution of item noise that varies by stimulus class, with virtually no item noise for single words and scenes. Despite the unpopularity of background noise in recognition memory models, background noise estimates dominated at retrieval across nearly all stimulus classes, with the exception of high frequency words, which exhibited equivalent levels of context noise and background noise. These parameter estimates suggest that the majority of interference in recognition memory stems from experiences acquired prior to the learning episode.
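The four-way decomposition of cue strength can be made concrete with a toy sketch. The parameter values and counts below are arbitrary illustrations, not the fitted Bayesian estimates from the paper; the point is only how the self match, item noise, context noise, and background noise sum into a cue's familiarity.

```python
# Hypothetical per-trace contributions (illustrative values only).
SELF_MATCH = 2.0          # match on both item and context (studied targets)
ITEM_NOISE = 0.02         # per other list item: context matches, item mismatches
CONTEXT_NOISE = 0.05      # per prior context containing the test item
BACKGROUND_NOISE = 0.001  # per pre-experimental trace: both mismatch

def cue_strength(studied, list_length, prior_contexts, prior_traces):
    """Summed familiarity of a retrieval cue under global matching."""
    strength = SELF_MATCH if studied else 0.0
    strength += ITEM_NOISE * (list_length - 1)    # item noise
    strength += CONTEXT_NOISE * prior_contexts    # context noise
    strength += BACKGROUND_NOISE * prior_traces   # background noise
    return strength

old = cue_strength(studied=True, list_length=20, prior_contexts=3, prior_traces=1000)
new = cue_strength(studied=False, list_length=20, prior_contexts=3, prior_traces=1000)
print(old, new)  # studied cues exceed unstudied cues by exactly the self match
```

Note how the background-noise term scales with pre-experimental experience, which is why, with the estimates the abstract reports, it can dominate interference even though each individual trace contributes very little.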



2020, Vol 36 (2), pp. 273-294
Author(s): Eileen Pfeiffer Flores, Jorge Mendes de Oliveira-Castro, Carlos Barbosa Alves de Souza

Abstract: We offer an account of reading comprehension that we believe will help clarify some common conceptual confusions in the relevant literature, as well as contribute to existing functional accounts. We argue that defining texts qua texts as stimulus classes, on the one hand, and equating “comprehension” with behavior (covert or otherwise), on the other, are not useful conceptual moves, especially when behavioral settings go beyond basic literacy skills acquisition. We then analyze the structure of the contingencies that usually evoke talk of “comprehension” using techniques from analytic philosophy. We show how keeping the results of this analysis in mind can help avoid the conceptual bafflement that often arises, even among behavior analysts, when defining or assessing behavioral phenomena related to reading comprehension. Using two contrasting cases (legal texts and stories), we argue that what counts as comprehension depends, not peripherally but crucially, on the shared social practices of which texts are a part. Finally, we propose a new framework for classifying reader–text contingencies by combining two dimensions: openness of setting and embeddedness of reinforcement.


2019
Author(s): Amelia R. Hunt, Arash Sahraie, Neil Macrae

It has long been proposed that some stimulus classes are so biologically important that they are automatically prioritized by the attention system, irrespective of context. However, issues of ecological validity undermine laboratory-based experiments that attempt to establish the existence of context-independent attentional biases. Here we measured attention to faces and facial expressions of emotion while participants were sitting in a waiting room before the experiment, and again in the same individuals in a laboratory-based reaction-time (RT) task. A robust bias towards images of faces was observed in the waiting room, but not in the RT task. Conversely, a robust attentional bias towards emotional faces was observed in the RT task, but not in the waiting room. Despite large individual differences in attentional biases towards face and facial emotions, measures of bias in a given individual in one setting did not predict their bias in another. We conclude that attentional capture by faces and facial emotions is highly sensitive to context.


2019
Author(s): Jamie Cummins, Bryan Roche

Increasing evidence suggests that the relatedness of stimuli within the Function Acquisition Speed Test (FAST) methodology is sensitive to the learning histories of participants. For example, this method is sensitive to differences in the amount of baseline training provided to establish stimulus equivalence relations using arbitrary stimuli (Cummins et al., 2018a). However, it has not yet been investigated whether the relatedness of stimuli within the FAST varies based on differential nodal distances between stimuli within stimulus classes. If so, the FAST could serve as an important adjunct assessment procedure for researchers who wish not only to assess the formation of stimulus classes using traditional methods, such as matching-to-sample, but also to assess the relative relatedness of stimuli within complex stimulus classes (i.e., nodal distance). The current study sought to investigate this possibility. Participants (n = 16) were trained in the formation of two 4-member equivalence classes consisting of arbitrary nonsense syllables. Following this, participants completed three FAST assessments, each of which probed for the relatedness of stimulus pairs of differing nodal distance. Group- and individual-level analyses broadly demonstrated that relatedness varied as a function of nodal distances in pre-trained stimulus classes. However, results also highlighted some limitations of the FAST at the individual level.
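Nodal distance, the quantity probed above, can be sketched as the number of intervening nodes on the shortest path through trained relations. The linear training structure (A→B→C→D) below is an assumption for illustration; the abstract does not specify the training protocol.

```python
from collections import deque

# Hypothetical linearly trained 4-member equivalence class A-B-C-D,
# stored as a bidirectional relation graph (equivalence is symmetric).
trained = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}

def nodal_distance(start, goal):
    """Number of intervening nodes on the shortest path between two members."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, steps = queue.popleft()
        if node == goal:
            return max(steps - 1, 0)   # count intervening nodes, not edges
        for nxt in trained[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, steps + 1))
    return None  # not in the same class

print(nodal_distance("A", "C"))  # 1: one node (B) intervenes
print(nodal_distance("A", "D"))  # 2: two nodes (B, C) intervene
```

A FAST assessment probing pairs of increasing nodal distance (A-B, A-C, A-D) would then test whether measured relatedness decreases as this value grows.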


2019, pp. 124-137
Author(s): Harry A. Mackay, Barbara Jill Kotlarchyk, Robert Stromer
