Perspective influences eye movements during real-life conversation: Mentalising about self vs. others in autism

2019 ◽  
Author(s):  
Mahsa Barzy ◽  
Heather Jane Ferguson ◽  
David Williams

Socio-communication is profoundly impaired among autistic individuals. Difficulties representing others’ mental states have been linked to modulations of gaze and speech, which have also been shown to be impaired in autism. Despite these observed impairments in ‘real-world’ communicative settings, research has mostly focused on lab-based experiments, where the language is highly structured. In a pre-registered experiment, we recorded eye movements and verbal responses while adults (N=50) engaged in a real-life conversation. The conversation topic related either to the self, a familiar other, or an unfamiliar other (e.g. "Tell me who is your/your mother’s/Marina’s favourite celebrity and why?”). Results replicated previous work, showing reduced attention to socially-relevant information among autistic participants (i.e. less time looking at the experimenter’s face, and more time looking around the background), compared to typically-developing controls. Importantly, perspective modulated social attention in both groups; talking about an unfamiliar other reduced attention to potentially distracting or resource-demanding social information, and increased looks to the non-social background. Social attention did not differ between self and familiar other contexts, reflecting greater shared knowledge for familiar/similar others. Autistic participants spent more time looking at the background when talking about an unfamiliar other vs. themselves. Future research should investigate the cognitive mechanisms underlying this effect.

Autism ◽  
2020 ◽  
Vol 24 (8) ◽  
pp. 2153-2165
Author(s):  
Mahsa Barzy ◽  
Heather J Ferguson ◽  
David M Williams

Social-communication is profoundly impaired among autistic individuals. Difficulties representing others’ mental states have been linked to modulations of gaze and speech, which have also been shown to be impaired in autism. Despite these observed impairments in ‘real-world’ communicative settings, research has mostly focused on lab-based experiments, where the language is highly structured. In a pre-registered experiment, we recorded eye movements and verbal responses while adults (N = 50) engaged in a real-life conversation. Using a novel approach, we also manipulated the perspective that participants adopted by asking them questions that were related to the self, a familiar other, or an unfamiliar other. Results replicated previous work, showing reduced attention to socially relevant information among autistic participants (i.e. less time looking at the experimenter’s face and more time looking around the background), compared to typically developing controls. Importantly, perspective modulated social attention in both groups; talking about an unfamiliar other reduced attention to potentially distracting or resource-demanding social information and increased looks to non-social background. Social attention did not differ between self and familiar other contexts, reflecting greater shared knowledge for familiar/similar others. Autistic participants spent more time looking at the background when talking about an unfamiliar other versus themselves. Future research should investigate the developmental trajectory of this effect and the cognitive mechanisms underlying it.

Lay abstract: Previous lab-based studies suggest that autistic individuals are less attentive to social aspects of their environment. In our study, we recorded the eye movements of autistic and typically developing adults while they engaged in a real-life social interaction with a partner. Results showed that autistic adults were less likely than typically developing adults to look at the experimenter’s face, and instead were more likely to look at the background. Moreover, the perspective that was adopted in the conversation (talking about self versus others) modulated the patterns of eye movements in autistic and non-autistic adults. Overall, people spent less time looking at their conversation partner’s eyes and face and more time looking at the background when talking about an unfamiliar other compared to when talking about themselves. This pattern was magnified among autistic adults. We conclude that allocating attention to social information during conversation is cognitively effortful, but that this can be mitigated when talking about a familiar topic.


2021 ◽  
Author(s):  
Shaheed Azaad ◽  
Günther Knoblich ◽  
Natalie Sebanz

Even the simplest social interactions require us to gather, integrate, and act upon multiple streams of information about others and our surroundings. In this Element, we discuss how perceptual processes provide us with an accurate account of action-relevant information in social contexts. We overview contemporary theories and research that explore how: (1) individuals perceive others' mental states and actions, (2) individuals perceive affordances for themselves, others, and the dyad, and (3) social contexts guide our attention to modulate what we perceive. Finally, we review work on the cognitive mechanisms that make joint action possible and discuss their links to perception.


2015 ◽  
Vol 37 (1) ◽  
pp. 49-67 ◽  
Author(s):  
Maja Roch ◽  
Elena Florit ◽  
Chiara Levorato

The study explored narrative production and comprehension in typically developing Italian–English sequential bilinguals. Thirty 5- to 6-year-olds and 32 6- to 7-year-olds were presented with story telling and retelling tasks, each followed by comprehension questions in Italian (their first language) and English (their second language). The macrostructure of narratives produced was analyzed, considering total amount of relevant information, story complexity, and mental state terms. Comprehension questions focused on implicit story information (i.e., characters’ mental states and goals). The results indicated that (a) older children outperformed younger ones on all measures; (b) an advantage of first language (Italian) over second language (English) emerged for younger children; and (c) comprehension and production were both more accurate in story retelling than in telling. Theoretical and methodological implications of these results are discussed.


2021 ◽  
Author(s):  
Tawny Tsang ◽  
Shulamite Green ◽  
Janelle Liu ◽  
Katherine Lawrence ◽  
Shafali Jeste ◽  
...  

Converging evidence implicates disrupted brain connectivity in autism spectrum disorder (ASD); however, the mechanisms linking altered connectivity early in development to the emergence of ASD symptomatology remain poorly understood. Here we examined whether atypicalities in the Salience Network (SN) -- an early-emerging neural network involved in orienting attention to the most salient aspects of one's internal and external environment -- may predict the development of ASD markers such as reduced social attention and atypical sensory processing. Six-week-old infants at high-risk for ASD exhibited stronger SN connectivity with sensorimotor regions; low-risk infants demonstrated stronger SN connectivity with prefrontal regions involved in social attention. Infants with higher connectivity with sensorimotor regions had lower connectivity with prefrontal regions, suggesting a direct tradeoff between attention to basic sensory versus socially-relevant information. Early alterations in SN connectivity predicted subsequent ASD symptomatology, providing a plausible mechanistic account for the unfolding of atypical developmental trajectories associated with ASD risk.


2021 ◽  
Vol 15 ◽  
Author(s):  
Alice Adiletta ◽  
Samantha Pedrana ◽  
Orsola Rosa-Salva ◽  
Paola Sgadò

Faces convey a great amount of socially relevant information related to emotional and mental states, identity, and intention. Processing of face information is a key mechanism for social and cognitive development, such that newborn babies are already tuned to recognize and orient to faces and simple schematic face-like patterns from the first hours of life. Like neonates, non-human primates and domestic chicks have also been shown to express orienting responses to faces and schematic face-like patterns. More importantly, existing studies have hypothesized that early disturbances of these mechanisms represent one of the earliest biomarkers of social deficits in autism spectrum disorders (ASD). We used valproic acid (VPA) exposure to induce neurodevelopmental changes associated with ASD in domestic chicks and tested whether VPA could affect the expression of the animals’ approach responses to schematic face-like stimuli. We found that VPA impairs the chicks’ preference responses to these social stimuli. Based on the results shown here and on previous studies, we propose the domestic chick as an animal model to investigate the biological mechanisms underlying face processing deficits in ASD.


Author(s):  
Toby J. Lloyd-Jones ◽  
Juergen Gehrke ◽  
Jason Lauder

We assessed the importance of outline contour and individual features in mediating the recognition of animals by examining response times and eye movements in an animal-object decision task (i.e., deciding whether or not an object was an animal that may be encountered in real life). There were shorter latencies for animals as compared with nonanimals and performance was similar for shaded line drawings and silhouettes, suggesting that important information for recognition lies in the outline contour. The most salient information in the outline contour was around the head, followed by the lower torso and leg regions. We also observed effects of object orientation and argue that the usefulness of the head and lower torso/leg regions is consistent with a role for the object axis in recognition.


2020 ◽  
Author(s):  
Abdulaziz Abubshait ◽  
Patrick P. Weis ◽  
Eva Wiese

Social signals, such as changes in gaze direction, are essential cues to predict others’ mental states and behaviors (i.e., mentalizing). Studies show that humans can mentalize with non-human agents when they perceive a mind in them (i.e., mind perception). Robots that physically and/or behaviorally resemble humans likely trigger mind perception, which enhances the relevance of social cues and improves social-cognitive performance. The current experiments examine whether the effect of physical and behavioral influencers of mind perception on social-cognitive processing is modulated by the lifelikeness of a social interaction. Participants interacted with robots of varying degrees of physical (human-like vs. robot-like) and behavioral (reliable vs. random) human-likeness, while the lifelikeness of a social attention task was manipulated across five experiments. The first four experiments manipulated lifelikeness via the physical realism of the robot images (Studies 1 and 2), the biological plausibility of the social signals (Study 3), and the plausibility of the social context (Study 4). They showed that human-like behavior affected social attention, whereas appearance affected mind perception ratings. However, when the lifelikeness of the interaction was increased by using videos of a human and a robot sending the social cues in a realistic environment (Study 5), social attention mechanisms were affected by both physical appearance and behavioral features, while mind perception ratings were mainly affected by physical appearance. This indicates that, in order to understand the effect of physical and behavioral features on social cognition, paradigms should be used that adequately simulate the lifelikeness of social interactions.


Author(s):  
Hsein Kew

In this paper, we propose a method to generate an audio output based on spectroscopy data in order to discriminate two classes of data, based on the features of our spectral dataset. To do this, we first perform spectral pre-processing and then extract features, followed by machine learning for dimensionality reduction. The features are then mapped to the parameters of a sound synthesiser, as part of the audio processing, so as to generate audio samples from which we compute statistical results and identify important descriptors for the classification of the dataset. To optimise the process, we compare Amplitude Modulation (AM) and Frequency Modulation (FM) synthesis, as applied to two real-life datasets, to evaluate the performance of sonification as a method for discriminating data. FM synthesis provides a higher subjective classification accuracy compared with AM synthesis. We then further compare Principal Component Analysis (PCA) and Linear Discriminant Analysis as dimensionality reduction methods in order to optimise our sonification algorithm. Using FM synthesis as the sound synthesiser and PCA as the dimensionality reduction method yields mean classification accuracies of 93.81% and 88.57% for the coffee dataset and the fruit puree dataset, respectively. These results indicate that this spectroscopic analysis model is able to provide relevant information on the spectral data and, most importantly, is able to discriminate accurately between the two spectra, thus providing a complementary tool to supplement current methods.
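The pipeline described above (spectral pre-processing, dimensionality reduction, then mapping the reduced features to synthesiser parameters) can be sketched in a few lines of Python. The sketch below is illustrative only: the number of components, the parameter ranges, and the PCA-to-FM mapping are assumptions for demonstration, not the authors' implementation, and it relies on numpy and scikit-learn.

```python
import numpy as np
from sklearn.decomposition import PCA

SAMPLE_RATE = 44_100  # audio sample rate in Hz
DURATION = 1.0        # seconds of audio per spectrum


def fm_tone(carrier_hz, mod_hz, mod_index, sr=SAMPLE_RATE, dur=DURATION):
    """Render a single FM tone: a carrier sinusoid whose phase is
    modulated by a sine of frequency mod_hz and depth mod_index."""
    t = np.linspace(0.0, dur, int(sr * dur), endpoint=False)
    return np.sin(2 * np.pi * carrier_hz * t
                  + mod_index * np.sin(2 * np.pi * mod_hz * t))


def sonify(spectra, n_components=3):
    """Reduce each spectrum to its leading principal components and map
    them to FM parameters (ranges here are illustrative assumptions)."""
    scores = PCA(n_components=n_components).fit_transform(spectra)
    lo, hi = scores.min(axis=0), scores.max(axis=0)
    norm = (scores - lo) / (hi - lo + 1e-12)      # rescale to [0, 1]
    carrier = 220.0 + 660.0 * norm[:, 0]          # carrier: 220-880 Hz
    modulator = 2.0 + 18.0 * norm[:, 1]           # modulator: 2-20 Hz
    depth = 1.0 + 9.0 * norm[:, 2]                # modulation index: 1-10
    return [fm_tone(c, m, d) for c, m, d in zip(carrier, modulator, depth)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_spectra = rng.normal(size=(10, 256))     # stand-in for real spectra
    tones = sonify(fake_spectra)
    print(len(tones), tones[0].shape)             # 10 tones of 44,100 samples
```

In a mapping of this kind, which component drives the carrier versus the modulator determines which spectral contrasts are most audible; this is presumably the sort of choice the paper's PCA versus Linear Discriminant Analysis comparison is optimising.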


Author(s):  
Christian Wolf ◽  
Markus Lappe

Humans and other primates are equipped with a foveated visual system. As a consequence, we reorient our fovea to objects and targets in the visual field that are conspicuous or that we consider relevant or worth looking at. These reorientations are achieved by means of saccadic eye movements. Where we saccade to depends on various low-level factors, such as a target’s luminance, but also crucially on high-level factors like the expected reward or a target’s relevance for perception and subsequent behavior. Here, we review recent findings on how the control of saccadic eye movements is influenced by higher-level cognitive processes. We first describe the pathways by which cognitive contributions can influence the neural oculomotor circuit. Second, we summarize what saccade parameters reveal about cognitive mechanisms, particularly saccade latencies, saccade kinematics, and changes in saccade gain. Finally, we review findings on what renders a saccade target valuable, as reflected in oculomotor behavior. We emphasize that foveal vision of the target after the saccade can constitute an internal reward for the visual system, and that this is reflected in oculomotor dynamics that serve to quickly and accurately provide detailed foveal vision of relevant targets in the visual field.


2014 ◽  
Vol 9 ◽  
Author(s):  
Roberto Tramarin ◽  
Mario Polverino ◽  
Maurizio Volterrani ◽  
Bruna Girardi ◽  
Claudio Chimini ◽  
...  

Background: Cardiovascular and respiratory diseases are leading causes of morbidity, and their co-occurrence has important implications for mortality and other outcomes. Even the most recent guidelines do not reliably address the clinical, prognostic, and therapeutic concerns that arise when respiratory and cardiac diseases overlap. Study objectives and design: To evaluate, in real-life clinical practice, the epidemiology of cardio-pulmonary comorbidity and its reciprocal impact on clinical management, diagnostic workup, and treatment, 1,500 cardiac and 1,500 respiratory inpatients admitted to acute and rehabilitation units will be enrolled in a multicenter, nationwide, prospective observational study. For this purpose, each center will enroll at least 50 consecutive patients. At discharge, data analysis will aim to define the prevalence of cardiac and pulmonary inpatient comorbidity, demographic characteristics, length of hospital stay, and risk factors, taking into account also procedures, pharmacological and non-pharmacological treatment, and follow-up in patients with cardio-respiratory comorbidity. Conclusions: The purely observational design of the study aims to provide new, relevant information on the assessment and management of overlapping patients in real-life clinical practice, and new insights for improving and implementing current guidelines on the management of the individual diseases.

