Compositional Variability: The Key to the Social Signals Produced by Honeybee Mandibular Glands

2019 ◽  
pp. 318-322 ◽  
Author(s):  
Robin M. Crewe


2020 ◽  
Author(s):  
Abdulaziz Abubshait ◽  
Patrick P. Weis ◽  
Eva Wiese

Social signals, such as changes in gaze direction, are essential cues to predict others’ mental states and behaviors (i.e., mentalizing). Studies show that humans can mentalize with non-human agents when they perceive a mind in them (i.e., mind perception). Robots that physically and/or behaviorally resemble humans likely trigger mind perception, which enhances the relevance of social cues and improves social-cognitive performance. The current experiments examine whether the effect of physical and behavioral influencers of mind perception on social-cognitive processing is modulated by the lifelikeness of a social interaction. Participants interacted with robots of varying degrees of physical (humanlike vs. robot-like) and behavioral (reliable vs. random) human-likeness while the lifelikeness of a social attention task was manipulated across five experiments. The first four experiments manipulated lifelikeness via the physical realism of the robot images (Studies 1 and 2), the biological plausibility of the social signals (Study 3), and the plausibility of the social context (Study 4). They showed that humanlike behavior affected social attention, whereas appearance affected mind perception ratings. However, when the lifelikeness of the interaction was increased by using videos of a human and a robot sending the social cues in a realistic environment (Study 5), social attention mechanisms were affected by both physical appearance and behavioral features, while mind perception ratings were mainly affected by physical appearance. This indicates that in order to understand the effect of physical and behavioral features on social cognition, paradigms should be used that adequately simulate the lifelikeness of social interactions.


2010 ◽  
Vol 365 (1537) ◽  
pp. 165-176 ◽  
Author(s):  
Uta Frith ◽  
Chris Frith

The biological basis of complex human social interaction and communication has been illuminated through a coming together of various methods and disciplines. Among these are comparative studies of other species, studies of disorders of social cognition and developmental psychology. The use of neuroimaging and computational models has given weight to speculations about the evolution of social behaviour and culture in human societies. We highlight some networks of the social brain relevant to two-person interactions and consider the social signals between interacting partners that activate these networks. We make a case for distinguishing between signals that automatically trigger interaction and cooperation and ostensive signals that are used deliberately. We suggest that this ostensive signalling is needed for ‘closing the loop’ in two-person interactions, where the partners each know that they have the intention to communicate. The use of deliberate social signals can serve to increase reputation and trust and facilitates teaching. This is likely to be a critical factor in the steep cultural ascent of mankind.


2000 ◽  
Vol 11 (2) ◽  
pp. 106-111 ◽  
Author(s):  
Adam K. Anderson ◽  
Elizabeth A. Phelps

A growing body of evidence from humans and other animals suggests the amygdala may be a critical neural substrate for emotional processing. In particular, recent studies have shown that damage to the human amygdala impairs the normal appraisal of social signals of emotion, primarily those of fear. However, effective social communication depends on both the ability to receive (emotional appraisal) and the ability to send (emotional expression) signals of emotional state. Although the role of the amygdala in the appraisal of emotion is well established, its importance for the production of emotional expressions is unknown. We report a case study of a patient with bilateral amygdaloid damage who, despite a severe deficit in interpreting facial expressions of emotion including fear, exhibits an intact ability to express this and other basic emotions. This dissociation suggests that a single neural module does not support all aspects of the social communication of emotional state.


2009 ◽  
Vol 364 (1535) ◽  
pp. 3497-3504 ◽  
Author(s):  
Ursula Hess ◽  
Reginald B. Adams ◽  
Robert E. Kleck

Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.


2020 ◽  
Author(s):  
Won Lee ◽  
Hollie N. Dowd ◽  
Cyrus Nikain ◽  
Madeleine F. Dwortz ◽  
Eilene D. Yang ◽  
...  

Competent social functioning of group-living species relies on the ability of individuals to detect and utilize conspecific social cues to guide behavior. Previous studies have identified numerous brain regions involved in processing these external cues, collectively referred to as the Social Decision-Making Network. However, how the brain encodes social information with respect to an individual’s social status has not been thoroughly examined. In mice, cues about an individual’s identity, including social status, are conveyed through urinary proteins. In this study, we assessed neural cFos immunoreactivity in dominant and subordinate male mice exposed to familiar and unfamiliar dominant and subordinate male urine. The posteroventral medial amygdala was the only brain region that responded exclusively to dominant compared to subordinate male urine. In all other brain regions, including the ventromedial hypothalamus (VMH), the ventral premammillary nucleus (PMv), and the ventrolateral periaqueductal gray (vlPAG), activity was modulated by a combination of odor familiarity and the social status of both the urine donor and the subject receiving the cue. We show that dominant subjects exhibit robust differential activity across different types of cues compared to subordinate subjects, suggesting that individuals perceive social cues differently depending on social experience. These data inform further investigation of neurobiological mechanisms underlying social-status-related brain differences and behavior.


Author(s):  
Taemie Kim

Distributed collaborations tend to have different communication patterns and performance compared with co-located collaborations. This chapter proposes using sociometric feedback to understand and help distributed collaborations. The proposed system uses sociometric badges to automatically detect the communication behaviors of groups and uses that information to provide real-time visual feedback. The goal of the feedback system is to encourage cooperation and, furthermore, to improve the performance of both individuals and groups. This system could allow distributed collaborations to become more similar to co-located collaborations, since it can sense and reintroduce the social signals lost in computer-mediated communication. The chapter presents the results of two experimental laboratory studies that examine the effectiveness of the system. Results show that real-time sociometric feedback changes not only the communication patterns of distributed groups but also their performance, making both more similar to those of co-located groups.
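The feedback idea described in this abstract can be made concrete with a minimal, hypothetical sketch: badge events report who is speaking and for how long, and the display shows each member's share of total speaking time so far. The event format, member names, and the choice of speaking-time share as the displayed metric are illustrative assumptions, not details taken from the chapter.

```python
# Hypothetical sketch of a real-time sociometric feedback computation, assuming
# the badges emit (speaker_id, duration_seconds) events; the chapter's actual
# badge data and feedback visualization are richer than this.
from collections import defaultdict

def speaking_shares(events):
    """Return each member's share of total speaking time from badge events."""
    totals = defaultdict(float)
    for speaker, duration in events:
        totals[speaker] += duration
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {speaker: t / grand_total for speaker, t in totals.items()}

# Example: member "C" has barely spoken, so a feedback display built on these
# shares would surface the imbalance to the whole group in real time.
events = [("A", 42.0), ("B", 35.5), ("A", 18.0), ("C", 4.5)]
for member, share in sorted(speaking_shares(events).items()):
    print(f"{member}: {share:.0%} of speaking time")
```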


2011 ◽  
Vol 12 (3) ◽  
pp. 397-417 ◽  
Author(s):  
Hjalmar K. Turesson ◽  
Asif A. Ghazanfar

The social brain hypothesis implies that humans and other primates evolved “modules” for representing social knowledge. Alternatively, no such cognitive specializations are needed because social knowledge is already present in the world: we can simply monitor the dynamics of social interactions. Given the latter idea, what mechanism could account for coalition formation? We propose that statistical learning can provide a mechanism for fast and implicit learning of social signals. Using human participants, we compared learning of social signals with arbitrary signals. We found that learning of social signals was no better than learning of arbitrary signals. While coupling faces and voices led to parallel learning, the same was true for arbitrary shapes and sounds. However, coupling versus uncoupling social signals with arbitrary signals revealed that faces and voices are treated with perceptual priority. Overall, our data suggest that statistical learning is a viable domain-general mechanism for learning social group structure.

Keywords: social brain; embodied cognition; distributed cognition; situated cognition; multisensory; audiovisual speech; crossmodal; multimodal
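As a rough illustration of the statistical-learning account (not the authors' actual paradigm or stimuli), the sketch below tracks how often each "voice" follows each "face" in a stimulus stream and derives transitional probabilities; a reliably paired face-voice couple shows up as a high conditional probability that an implicit learner could exploit. The labels F1/V1 and the stream itself are invented for illustration.

```python
# Toy statistical-learning sketch: estimate P(next stimulus | current stimulus)
# from co-occurrence counts. Stimulus labels and the stream are hypothetical.
from collections import Counter, defaultdict

def transitional_probabilities(stream):
    """Return P(next | current) for consecutive items in a stimulus stream."""
    pair_counts = defaultdict(Counter)
    for current, nxt in zip(stream, stream[1:]):
        pair_counts[current][nxt] += 1
    return {
        cur: {nxt: n / sum(counts.values()) for nxt, n in counts.items()}
        for cur, counts in pair_counts.items()
    }

# Face F1 is always followed by voice V1, while F2 pairs inconsistently with
# V2/V3; the high P(V1 | F1) is the regularity a learner can pick up implicitly.
stream = ["F1", "V1", "F2", "V2", "F1", "V1", "F2", "V3", "F1", "V1"]
print(transitional_probabilities(stream)["F1"])  # {'V1': 1.0}
```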


Author(s):  
Philippe Fossati ◽  
Sophie Hinfray ◽  
Anna Fall ◽  
Cédric Lemogne ◽  
Jean-Yves Rotge

Interpersonal factors are strong predictors of the onset and course of major depression. However, the biological and neural bases of interpersonal difficulties in major depression are unknown. In this chapter we describe a general homeostatic system that monitors the social acceptance of individuals. We show that this system is activated in response to actual or putative threats to social acceptance and signals of social rejection. Our model describes a cascade of cognitive, emotional, and behavioural consequences of social exclusion. The model emphasizes the role of specific regions—the subgenual anterior cingulate, the insula, and the default mode network—in the detection and regulation of social signals. Hence we propose that major depressive disorder is tightly linked to the processing of social exclusion and may represent a specific impairment in the homeostatic system that monitors social acceptance.


2020 ◽  
Author(s):  
Jared Martin ◽  
Adrienne Wood ◽  
William Taylor Laimaka Cox ◽  
Scott Sievert ◽  
Robert Nowak ◽  
...  

The present work advances the science of the smile by investigating how perceivers mentally represent this heterogeneous expression. Across both perception- and production-based tasks, we report evidence that perceivers mentally represent reward, affiliation, and dominance smiles as distinct categories associated with specific behaviors, social contexts, and facial movements. Study 1 demonstrates that perceivers expect to behave differently in response to each type of smile when embedded in a simulated economic game. Study 2 demonstrates that perceivers use distinct words to describe the social contexts in which they anticipate encountering each type of smile. Study 3 demonstrates that producers use distinct facial movements when prompted with social contexts related to the theorized social function of each smile. Taken together, the present findings support the conclusion that reward, affiliation, and dominance smiles are mentally represented as distinct categories, bringing us one step closer to understanding smiles as nuanced social signals.

