conversational cues
Recently Published Documents

TOTAL DOCUMENTS: 13 (FIVE YEARS: 5)
H-INDEX: 6 (FIVE YEARS: 2)

PLoS ONE ◽  
2021 ◽  
Vol 16 (10) ◽  
pp. e0258470
Author(s):  
Elyssa M. Barrick ◽  
Mark A. Thornton ◽  
Diana I. Tamir

Faces are one of the key ways that we obtain social information about others. They allow people to identify individuals, understand conversational cues, and make judgements about others’ mental states. When the COVID-19 pandemic hit the United States, widespread mask-wearing practices were implemented, causing a shift in the way Americans typically interact. This introduction of masks into social exchanges posed a potential challenge: how would people make these important inferences about others when a large source of information was no longer available? We conducted two studies that investigated the impact of mask exposure on emotion perception. In particular, we measured how participants used facial landmarks (visual cues) and the expressed valence and arousal (affective cues) to make similarity judgements about pairs of emotion faces. Study 1 found that in August 2020, participants with higher levels of mask exposure used cues from the eyes to a greater extent when judging emotion similarity than participants with less mask exposure. Study 2 measured participants’ emotion perception in both April and September 2020 (before and after widespread mask adoption) in the same group of participants to examine changes in the use of facial cues over time. Results revealed an overall increase in the use of visual cues from April to September. Further, as mask exposure increased, people with the most social interaction showed the largest increase in the use of visual facial cues. These results provide evidence that a shift has occurred in how people process faces: the more people interact with others who are wearing masks, the more they learn to focus on visual cues from the eye area of the face.


2021 ◽  
Vol 60 ◽  
pp. 102360
Author(s):  
Yingying Huang ◽  
Dogan Gursoy ◽  
Meng Zhang ◽  
Robin Nunkoo ◽  
Si Shi

2020 ◽  
Author(s):  
Elyssa M Barrick ◽  
Mark Allen Thornton ◽  
Diana Tamir

Faces are one of the key ways that we obtain social information about others. They allow people to identify individuals, understand conversational cues, and make judgements about others’ mental states. When the COVID-19 pandemic hit the United States, widespread mask-wearing practices were implemented, causing a shift in the way Americans typically interact. This introduction of masks into social exchanges posed a potential challenge: how would people make these important inferences about others when a large source of information was no longer available? We conducted two studies that investigated the impact of mask exposure on emotion perception. In particular, we measured how participants used facial landmarks (visual cues) and the expressed valence and arousal (affective cues) to make similarity judgements about pairs of emotion faces. Study 1 found that participants with higher levels of mask exposure used cues from the eyes to a greater extent when judging emotion similarity than participants with less mask exposure. Study 2 measured participants’ emotion perception in both April and September 2020 (before and after widespread mask adoption) in the same group of participants to examine changes in the use of facial cues over time. Results revealed an overall increase in the use of visual cues from April to September. Further, as mask exposure increased, people with the most social interaction showed the largest increase in the use of visual facial cues. These results provide evidence that a shift has occurred in how people process faces: the more people interact with others who are wearing masks, the more they learn to focus on visual cues from the eye area of the face.


Information ◽  
2020 ◽  
Vol 11 (1) ◽  
pp. 43 ◽  
Author(s):  
Eva Blessing Onyeulo ◽  
Vaibhav Gandhi

This paper discusses the nuances of social robots, how and why they are becoming increasingly significant, and what they are currently being used for. It also reflects on the current design of social robots as a means of interaction with humans and reports potential solutions to several important questions about the future design of these robots. The specific questions explored in this paper are: “Do social robots need to look like living creatures that already exist in the world for humans to interact well with them?”; “Do social robots need to have animated faces for humans to interact well with them?”; “Do social robots need to have the ability to speak a coherent human language for humans to interact well with them?”; and “Do social robots need to have the capability to make physical gestures for humans to interact well with them?”. The paper reviews both verbal and nonverbal social and conversational cues that could be incorporated into the design of social robots, and briefly discusses the emotional bonds that may be built between humans and robots. It also considers facets of human acceptance of social robots as well as ethical and moral concerns.


Author(s):  
Debsubhra Chakraborty ◽  
Shihao Xu ◽  
Zixu Yang ◽  
Yi Han Victoria Chua ◽  
Yasir Tahir ◽  
...  

Author(s):  
David Adamson ◽  
Akash Bharadwaj ◽  
Ashudeep Singh ◽  
Colin Ashe ◽  
David Yaron ◽  
...  

2011 ◽  
Vol 12 (4) ◽  
pp. 537-555 ◽  
Author(s):  
Jacqueline D. Woolley ◽  
Lili Ma ◽  
Gabriel Lopez-Mobilia
