Video Camera System
Recently Published Documents


TOTAL DOCUMENTS: 90 (five years: 5)
H-INDEX: 15 (five years: 2)

2020 · Vol 32 (11) · pp. 3581
Author(s): Koji Abe, Shinichiro Kuroda, Hitoshi Habe

Autism · 2020 · pp. 136236132095169
Author(s): Roser Cañigueral, Jamie A Ward, Antonia F de C Hamilton

Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, here we investigate how eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and by the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video-call and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.

Lay abstract: When we are communicating with other people, we exchange a variety of social signals through eye gaze and facial expressions. However, coordinated exchanges of these social signals can only happen when people involved in the interaction are able to see each other. Although previous studies report that autistic individuals have difficulties in using eye gaze and facial expressions during social interactions, evidence from tasks that involve real face-to-face conversations is scarce and mixed. Here, we investigate how eye gaze and facial expressions of typical and high-functioning autistic individuals are modulated by the belief in being seen by another person, and by being in a face-to-face interaction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video (no belief in being seen, no face-to-face), video-call (belief in being seen, no face-to-face) and face-to-face (belief in being seen and face-to-face). Typical participants gazed less at the confederate and made more facial expressions when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial expression patterns in autistic participants were overall similar to those of the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial expressions as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.
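The core measurement in a study like this is the fraction of time a participant spends gazing at the partner under each social context. A minimal sketch of that analysis is below; the sample streams, context labels, and coding scheme are all invented for illustration and are not from the study itself.

```python
# Hypothetical sketch: proportion of eye-tracking samples coded as
# "on the partner's face", computed separately per social context.
# All data below are invented for illustration.

def gaze_proportion(samples):
    """Fraction of samples coded as gazing at the partner's face."""
    if not samples:
        return 0.0
    return sum(1 for s in samples if s == "face") / len(samples)

# Invented per-context sample streams ("face" vs "away"):
contexts = {
    "pre-recorded video": ["face", "face", "away", "face"],
    "video-call":         ["face", "away", "away", "face"],
    "face-to-face":       ["away", "face", "away", "away"],
}

for name, samples in contexts.items():
    print(f"{name}: {gaze_proportion(samples):.2f}")
```

Comparing these proportions across contexts (and between speaking and listening periods) is what lets a study test whether gaze is modulated by the belief in being seen.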


2020 · Vol 12 (3) · pp. 394
Author(s): Donatus Bapentire Angnuureng, Philip-Neri Jayson-Quashigah, Rafael Almar, Thomas Christian Stieglitz, Edward Jamal Anthony, ...

Video camera systems have been used for nearly three decades to monitor coastal dynamics. They facilitate high-frequency analysis of spatiotemporal shoreline mobility. Video camera usage to measure beach intertidal profile evolution has not been standardized globally, and obtaining accurate results requires validation against independent techniques. Applications are mostly site specific due to differences in installation. The present study examines the accuracy of intertidal topographic data derived from a video camera system compared to data acquired with unmanned aerial vehicle (UAV, or drone) surveys of a reflective beach. A comparison of one year of 15-min video data with one year of monthly UAV surveys shows good agreement between the intertidal profiles. Underestimations of intertidal profile elevations by the camera-based method are possibly linked to the camera view angle, rectification and gaps in data. The resolution of the video-derived intertidal topographic profiles nevertheless confirmed the suitability of the method for beach mobility surveys matching those required for a quantitative analysis of nearshore changes. Beach slopes were found to vary between 0.1 and 0.7, with a steep slope from May to July 2018 and a gentle slope in December 2018. Large but short-term beach variations occurred between August 2018 and October 2018 and corresponded to relatively high wave events. In one year, the shoreline of this dynamic beach retreated by 7 m. At this rate, and as also observed at other beaches nearby, important coastal facilities and infrastructure will be prone to erosion. The data suggest that a low-cost shore-based camera, particularly when used in a network along the coast, can produce profile data for effective coastal management in West Africa and elsewhere.
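The validation step described above amounts to comparing video-derived elevations against UAV survey elevations at matching cross-shore positions. A minimal sketch of that comparison is given below; the profile values are invented for illustration, and the RMSE/bias metrics are a common choice for this kind of check rather than the study's exact procedure.

```python
# Hypothetical sketch: compare video-derived intertidal elevations with
# UAV survey elevations at matching cross-shore points. All elevation
# values below are invented for illustration.
import math

def rmse(video_z, uav_z):
    """Root-mean-square error between two elevation profiles (metres)."""
    assert len(video_z) == len(uav_z)
    n = len(video_z)
    return math.sqrt(sum((v - u) ** 2 for v, u in zip(video_z, uav_z)) / n)

def mean_bias(video_z, uav_z):
    """Mean (video - UAV) difference; negative => video underestimates."""
    return sum(v - u for v, u in zip(video_z, uav_z)) / len(video_z)

# Invented elevations (m) at four matching cross-shore points:
video_z = [1.10, 0.85, 0.52, 0.20]
uav_z   = [1.20, 0.90, 0.60, 0.25]

print(f"RMSE: {rmse(video_z, uav_z):.3f} m")
print(f"Bias: {mean_bias(video_z, uav_z):+.3f} m")
```

A consistently negative bias, as in this made-up example, would correspond to the systematic underestimation the abstract attributes to camera view angle, rectification and data gaps.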


Author(s): Harumi Sugimatsu, Junichi Kojima, SungMin Nam, Tamaki Ura, Rajendar Bahl, ...
