The correct way to test the hypothesis that racial categorization is a byproduct of an evolved alliance-tracking capacity

2021 · Vol 11 (1) · Author(s): David Pietraszewski

Abstract: The project of identifying the cognitive mechanisms or information-processing functions that cause people to categorize others by their race is one of the longest-standing and most socially impactful scientific issues in all of the behavioral sciences. This paper addresses a critical issue with one of the few hypotheses in this area that has thus far been successful, the alliance hypothesis of race, which had predicted a set of experimental circumstances that appeared to selectively target and modify people's implicit categorization of others by their race. Here, we show why the evidence put forward in favor of this hypothesis was not in fact evidence in its support, contrary to common understanding. We then provide the necessary and crucial tests of the hypothesis in the context of conflictual alliances, determining whether the predictions of the alliance hypothesis of racial categorization hold up to experimental scrutiny. When adequately tested, we find that categorization by race is indeed selectively reduced when crossed with membership in antagonistic alliances, the very pattern predicted by the alliance hypothesis. This finding provides direct experimental evidence that the human mind treats race as a proxy for alliance membership, implying that racial categorization does not reflect attention to physical features per se, but rather to social relationships.
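Implicit categorization in this literature is typically measured with the "who said what" memory-confusion paradigm, where categorization strength appears as a surplus of within-category over between-category misattribution errors. The sketch below illustrates only that scoring logic; the paradigm is an assumption on my part (the abstract does not describe the method), and all numbers are invented rather than the authors' data.

```python
# Illustrative sketch, not the authors' analysis: in the "who said what"
# memory-confusion paradigm, categorization by a dimension is inferred from
# a surplus of within-category over between-category misattribution errors.
# All numbers below are invented.

def categorization_index(p_within: float, p_between: float) -> float:
    """Difference between within- and between-category error rates."""
    return p_within - p_between

# Race presented alone: a large surplus of within-race confusions.
print(f"race alone:   {categorization_index(0.32, 0.18):.2f}")  # 0.14

# Race crossed with antagonistic alliances: the surplus shrinks,
# the selective reduction predicted by the alliance hypothesis.
print(f"race crossed: {categorization_index(0.22, 0.19):.2f}")  # 0.03
```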

2021 · Vol 17 (2) · pp. 212-227 · Author(s): Amelie Berger Soraruff

Abstract: The French philosopher Bernard Stiegler places himself in the tradition of critical theory. In this respect, the influence of Adorno and Horkheimer has been crucial to the development of his own understanding of cinema. Yet Stiegler reproaches his predecessors for not having sufficiently stressed cinema's positive effects on culture. For Stiegler, the cinema industry is not simply a menace to the human mind but a positive medium for its reinvention. It is in this sense that cinema is pharmacological: it can be either spiritually and culturally enhancing or destructive, depending on how it is acted upon. As the article concludes, Stiegler's pharmacology of cinema invites us to take part in our cinematic cultural becoming through the revival of the figure of the amateur. But it does so at the risk of cultural snobbery. While Stiegler does not condemn the cinematic medium per se, he expresses clear reservations about the potential of commercial cinema, the pharmacological critique of which remains to be thought.


2007 · Vol 37 (9) · pp. 1281-1291 · Author(s): Stella W. Y. Chan, Guy M. Goodwin, Catherine J. Harmer

Abstract
Background: Cognitive theories associate depression with negative biases in information processing. Although negatively biased cognitions are well documented in depressed patients, and to some extent in recovered patients, it remains unclear whether these abnormalities are present before the first depressive episode.
Method: High neuroticism (N) is a well-recognized risk factor for depression. The current study therefore compared different aspects of emotional processing in 33 high-N never-depressed and 32 low-N matched volunteers. Awakening salivary cortisol, which is often elevated in severely depressed patients, was measured to explore the neurobiological substrate of neuroticism.
Results: High-N volunteers showed increased processing of negative and/or decreased processing of positive information in emotional categorization and memory, facial expression recognition, and emotion-potentiated startle (EPS), in the absence of global memory or executive deficits. By contrast, there was no evidence for effects of neuroticism on attentional bias (as measured with the dot-probe task), over-general autobiographical memory, or awakening cortisol levels.
Conclusions: These results suggest that certain negative processing biases precede depression rather than arising as a result of depressive experience per se, and as such could in part mediate the vulnerability of high-N subjects to depression. Longitudinal studies are required to confirm that such cognitive vulnerabilities predict subsequent depression in individual subjects.
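For readers unfamiliar with the dot-probe task mentioned in the Results, attentional bias is conventionally scored as the reaction-time advantage when the probe replaces the emotional rather than the neutral stimulus. The sketch below shows that conventional score with made-up reaction times; it illustrates the measure in general, not this study's scoring or data.

```python
import numpy as np

# Minimal sketch of conventional dot-probe scoring, with made-up reaction
# times (ms); an assumption about the measure, not the study's data.
rt_probe_at_emotional = np.array([512.0, 498.0, 530.0, 505.0, 521.0])
rt_probe_at_neutral = np.array([525.0, 510.0, 528.0, 517.0, 540.0])

# A positive score means attention was drawn toward the emotional stimulus:
# responses are faster when the probe appears in its location.
bias = rt_probe_at_neutral.mean() - rt_probe_at_emotional.mean()
print(f"attentional bias: {bias:.1f} ms")  # 10.8 ms here
```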


Sofia · 2019 · Vol 8 (1) · pp. 124-145 · Author(s): Diego Azevedo Leite

One of the central aims of the neo-mechanistic framework for the neural and cognitive sciences is to construct a pluralistic integration of scientific explanations, allowing for a weak explanatory autonomy of higher-level sciences such as cognitive science. This integration involves understanding human cognition as information processing occurring in multi-level human neuro-cognitive mechanisms, explained by multi-level neuro-cognitive models. Strong explanatory neuro-cognitive reductionism, however, poses a significant challenge to this pluralist ambition and to the weak autonomy of cognitive science derived from it. Drawing on research in current molecular and cellular neuroscience, this reductionist position holds that the best strategy for integrating human neuro-cognitive theories is through direct reductive explanations based on molecular and cellular neural processes. My aim is to investigate whether the neo-mechanistic framework can meet this challenge. I argue that leading neo-mechanists offer some significant replies, but they are not yet able to completely remove strong explanatory reductionism from their own framework.


2015 · Author(s): Roberto Maffei, Livia S Convertini, Sabrina Quatraro, Stefania Ressa, Annalisa Velasco

Background. Interpretation is the process through which humans attribute meanings to every input they grasp from their natural or social environment. The formulation and exchange of meanings through natural language are basic aspects of human behaviour and important subjects for neuroscience; they have long been the object of dedicated scientific research. Two main theoretical positions (cognitivism and embodied cognition) currently confront each other; however, the available data are not conclusive, and scientific knowledge of the interpretation process remains unsatisfactory. Our work proposes some contributions aimed at improving it.

Methodology. Our field research involved a random sample of 102 adults. We presented them with a realistic case of written communication using unabridged message texts. We collected data (participants' written accounts of their interpretations) under controlled conditions through a specially designed questionnaire (closed and open-ended answers). Finally, we carried out qualitative and quantitative analyses using fundamental statistical methods.

Principal Findings. While readers are expected to concentrate on a text's content, they instead report focusing on the most varied and unpredictable components: certain physical features of the message (e.g. the message's period lengths) as well as meta-information such as the position of a statement or even the absence of some content. Only about 12% of the participants' indications point directly at the text's content. Our data converge on the hypothesis that the components of a message work at first like physical stimuli, causing readers' automatic (body-level) reactions independent of the conscious attribution of meaning. Interpretation would thus be a (learned) stimulus-reaction mechanism before switching to information processing, and the basis of meaning could be perceptual/analogical before it is propositional/digital. We carried out a first check of this hypothesis: the case we employed contained the emergence of a conflict and, at a crucial point, two versions ("H" and "S"; same content, different forms) of a reply to be sent. We collected the participants' (independent) interpretations of the two versions; then we asked them to choose which one could solve the conflict; finally, we assessed the coherence between interpretations and choice on a 4-level scale. The analysis of the distribution of coherence levels showed that, relative to our expectations, incoherence is over-represented; this imbalance is entirely attributable to "H" choosers. "H" and "S" choosers show significant differences (p<<0.01) in the distributions of coherence levels, which is inconsistent with the traditional hypothesis of a linear information-processing path leading to the final choice. In the end, with respect to the currently opposing theories, our hypothesis shows both important convergences and at least one critical divergence, together with the capacity to encompass them both.
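A group difference of this kind could, in principle, be tested with a chi-square test of independence on the 2 x 4 contingency table of choosers by coherence level. The sketch below is mine, with invented counts; the abstract reports only p<<0.01 and does not name the test actually used.

```python
# Hypothetical check of the group difference: a chi-square test of
# independence on choosers (H vs. S) by coherence level (1-4). The counts
# are invented; the paper reports only p << 0.01 and does not name its test.
from scipy.stats import chi2_contingency

#             level 1  level 2  level 3  level 4
h_choosers = [     18,      18,       9,       6]
s_choosers = [      6,       9,      15,      21]

chi2, p, dof, expected = chi2_contingency([h_choosers, s_choosers])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.5f}")
```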


Author(s): Jochen Rau

Recent advances in quantum technology – from quantum computers and simulators to communication and metrology – have not only opened up a whole new world of applications but also changed the understanding of quantum theory itself. This text introduces quantum theory entirely from this new perspective. It does away with the traditional approach to quantum theory as a theory of microscopic matter, and focuses instead on quantum theory as a framework for information processing. Accordingly, the emphasis is on concepts like measurement, probability, statistical correlations, and transformations, rather than waves and particles. The text begins with experimental evidence that forces one to abandon the classical description and to re-examine such basic notions as measurement, probability, and state. Thorough investigation of these concepts leads to the alternative framework of quantum theory. The requisite mathematics is developed and linked to its operational meaning. This part of the text culminates in an exploration of some of the most vexing issues of quantum theory, regarding locality, non-contextuality, and realism. The second half of the text explains how the peculiar features of quantum theory are harnessed to tackle information processing tasks that are intractable or even impossible classically. It provides the tools for understanding and designing the pertinent protocols, and discusses a range of examples representative of current quantum technology.
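As a generic illustration of the information-centric viewpoint described above, the sketch below computes single-qubit measurement probabilities via the Born rule. It is my own minimal example, not an excerpt from the book or its protocols.

```python
# My own minimal example, not an excerpt from the book: Born-rule outcome
# probabilities p(i) = |<i|psi>|^2 for measuring a single qubit in the
# computational basis.
import numpy as np

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # (|0> + |1>)/sqrt(2)

basis = {
    "0": np.array([1, 0], dtype=complex),
    "1": np.array([0, 1], dtype=complex),
}

for label, vec in basis.items():
    amplitude = np.vdot(vec, psi)  # inner product <i|psi>
    print(f"p({label}) = {abs(amplitude) ** 2:.2f}")  # 0.50 each
```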


Author(s): Xenia Naidenova

This chapter offers a view of the history of the concepts of knowledge and human reasoning in both mathematics and psychology. Mathematicians create formal theories of correct thinking; psychologists study the cognitive mechanisms that underpin knowledge construction and thinking as the most important functions of human existence: they study how the human mind works. Progress in understanding human knowledge and thinking will undoubtedly require combining the efforts of scientists in these different disciplines. Believing that the problems of knowledge and human reasoning cannot be studied independently of each other, we strive in this chapter to cover the central ideas of knowledge and logical inference manifested in the works of outstanding thinkers and scientists of the past. These ideas reveal all the difficulties and obstacles on the way to comprehending human mental processes.


2000 · Vol 21 (3) · pp. 425-426 · Author(s): Diane Beals

Vygotsky's social psycholinguistic approach is not incompatible with computational approaches to the study of mind. With this claim, William Frawley sets the stage for a Vygotskyan cognitive science. Socioculturalist theorizing about the workings of the human mind has long maintained boundaries against cognitive science's information-processing approaches to mind and language, and vice versa. Frawley argues that no such division is necessary and offers powerful ways of linking the two ways of thinking. Frawley's background in Vygotskyan and other sociocultural theories, as well as in cognitive science and computational theories, places him in an important position to make these links.


2010 · Vol 33 (4) · pp. 280-281 · Author(s): Colin Klein

Abstract: Anderson's meta-analysis of fMRI data is subject to a potential confound. Areas identified as active may make no functional contribution to the task being studied, or may indicate regions involved in the coordination of functional networks rather than in information processing per se. I suggest a way in which fMRI adaptation studies might provide a useful test between these alternatives.


2001 · Vol 24 (5) · pp. 812-813 · Author(s): Roman Borisyuk

Experimental evidence and mathematical/computational models show that, in many cases, chaotic, non-regular oscillations adequately describe the dynamical behaviour of neural systems. Further work is needed to understand the significance of this dynamical regime for modelling information processing in the brain.
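As a generic illustration (mine, not Borisyuk's) of what "chaotic, non-regular oscillations" means quantitatively, the sketch below estimates the largest Lyapunov exponent of the logistic map, a standard toy model of chaotic dynamics; a positive exponent is the usual signature of chaos.

```python
# Generic illustration, not from the commentary: chaos in a dynamical
# system is signalled by a positive largest Lyapunov exponent. For a 1-D
# map x -> f(x) it is the orbit average of log|f'(x)|; here f is the
# logistic map f(x) = r*x*(1 - x).
import numpy as np

def lyapunov_logistic(r: float, x0: float = 0.4,
                      n_transient: int = 1000, n_iter: int = 10000) -> float:
    x = x0
    for _ in range(n_transient):  # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += np.log(abs(r * (1 - 2 * x)))  # log|f'(x)|
        x = r * x * (1 - x)
    return total / n_iter

print(f"r = 3.5 (periodic): {lyapunov_logistic(3.5):+.3f}")  # negative
print(f"r = 4.0 (chaotic):  {lyapunov_logistic(4.0):+.3f}")  # about +0.693
```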

