Effects of Depth Information on Visual Target Identification Task Performance in Shared Gaze Environments

2020 ◽  
Vol 26 (5) ◽  
pp. 1934-1944
Author(s):  
Austin Erickson ◽  
Nahal Norouzi ◽  
Kangsoo Kim ◽  
Joseph J. LaViola ◽  
Gerd Bruder ◽  
...


Author(s):
Shan G. Lakhmani ◽  
Julia L. Wright ◽  
Michael R. Schwartz ◽  
Daniel Barber

Human-robot interaction requires communication; however, what form this communication should take to facilitate effective team performance remains undetermined. One view holds that effective human-agent communication can be achieved by combining transparent information-sharing techniques with specific communication patterns. This study examines how transparency and a robot's communication pattern interact to affect human performance in a human-robot teaming task. Participants' performance in a target identification task was affected by the robot's communication pattern: participants missed more targets when working with a bidirectionally communicating robot than with a unidirectionally communicating one. Furthermore, working with a bidirectionally communicating robot led to fewer correct identifications than working with a unidirectionally communicating one, but only when the robot provided less transparency information. Implications of these findings for future robot interface designs are discussed.


2009 ◽  
Vol 21 (1) ◽  
pp. 37-58 ◽  
Author(s):  
Stephanie Bryant ◽  
Uday Murthy ◽  
Patrick Wheeler

ABSTRACT: To facilitate the task of evaluating the internal control environment, auditors typically use internal control questionnaires (ICQs) to identify and document audit information. One drawback of structured ICQs is that beginning auditors charged with completing them may use them mechanistically, overlooking important cues that do not match ICQ prompts. We investigate the effects of cognitive style and feedback type on auditors' ability to identify internal control cues using ICQs. Student participants, serving as proxies for beginning staff auditors with no experience, were classified as possessing either a sensor or an intuitive cognitive style. In an experiment, participants used an ICQ to identify internal control cues for one accounting cycle. After receiving varying kinds of feedback, they repeated the internal control cue identification task using an ICQ for a second accounting cycle. Contrary to expectations, cognitive style did not significantly affect performance in the absence of feedback. As expected, significant associations between cognitive style and post-feedback task performance were found, with the combination of cognitive style and outcome feedback yielding positive performance improvements.


2010 ◽  
Vol 137 (3) ◽  
pp. 239-255 ◽  
Author(s):  
Stephen Rice ◽  
David Keller ◽  
David Trafimow ◽  
Joshua Sandry


Author(s):
Adrian Rivera-Rodriguez ◽  
Maxwell Sherwood ◽  
Ahren B. Fitzroy ◽  
Lisa D. Sanders ◽  
Nilanjana Dasgupta

Abstract: This study measured event-related brain potentials (ERPs) to test competing hypotheses regarding the effects of anger and race on early visual processing (N1, P2, and N2) and error recognition (ERN and Pe) during a sequentially primed weapon identification task. The first hypothesis was that anger would impair weapon identification in a biased manner by increasing attention and vigilance to task-irrelevant Black (compared to White) faces, and by decreasing recognition and inhibition of the weapon identification errors that follow them. Our competing hypothesis was that anger would facilitate weapon identification by directing attention toward task-relevant stimuli (i.e., objects) and away from task-irrelevant stimuli (i.e., race), and by increasing recognition and inhibition of biased errors. Results partially supported the second hypothesis: anger increased early attention to faces but minimized attentional processing of race, and did not affect error recognition. Specifically, angry (vs. neutral) participants showed increased N1 to both Black and White faces, ablated P2 race effects, and topographically restricted N2 race effects. Additionally, ERN amplitude was unaffected by emotion, race, or object type. However, Pe amplitude was affected by object type (but not emotion or race), such that it was larger after the misidentification of harmless objects as weapons. Finally, anger slowed overall task performance, especially the correct identification of harmless objects, but did not affect task accuracy. Task performance speed and accuracy were unaffected by the race of the face prime. Implications are discussed.


Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 279-279
Author(s):  
P Mäkelä ◽  
R Näsänen ◽  
J Rovamo

Identification sensitivity for four different faces was measured at the fovea and in the periphery to find out whether foveal and peripheral visual performance in this complex spatial task can be made equivalent simply by changing image magnification. Identification sensitivity was measured as a function of image magnification. The lowest contrast allowing identification was determined with a 4AFC method: observers indicated via the keyboard which of the four faces was presented on the CRT screen. The images were shown monocularly at the fovea and at 2.5, 5, and 10 deg eccentricities in the nasal visual field of the right eye (eccentricity measured from the right-hand edge of the image). If scaling is successful, the foveal and peripheral sensitivity-versus-size functions collapse together when shifted along the size dimension only (Watson, 1987, Journal of the Optical Society of America A, 4, 1579). Although the foveal and peripheral sensitivity functions could be superimposed, they did not do so fully without also being shifted in the vertical direction, as foveal sensitivity at the largest sizes was slightly superior to that of any eccentric location. Thus, size scaling alone was not adequate for this task, in agreement with the contrast sensitivity results of Valeton and Watson (1990, Perception, 19, Supplement, 393). In this identification task, performance deteriorated towards the periphery at approximately the same rate as visual acuity when the size corresponding to half-maximal sensitivity at each eccentricity was used as a measure.
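The size-scaling logic described above — that peripheral and foveal sensitivity-versus-size curves should superimpose after a shift along the (log) size axis if magnification alone equates performance — can be illustrated with a small simulation. This is a hypothetical sketch with made-up curve parameters, not the authors' analysis:

```python
# Hypothetical sketch of the size-scaling test: if peripheral performance is a
# purely size-scaled copy of foveal performance, the two sensitivity-vs-size
# curves superimpose after a horizontal shift on log axes. All curve shapes
# and numbers here are illustrative assumptions.

def foveal_sensitivity(log_size):
    """Illustrative log-parabola: peak log sensitivity at log_size = 0."""
    return 2.0 - 0.5 * (log_size ** 2)

def peripheral_sensitivity(log_size, scale_shift=0.6):
    """Peripheral curve simulated as the foveal curve shifted toward larger sizes."""
    return foveal_sensitivity(log_size - scale_shift)

def best_horizontal_shift(log_sizes, candidates):
    """Grid-search the horizontal shift that best superimposes the two curves."""
    best, best_err = None, float("inf")
    for shift in candidates:
        err = sum(
            (peripheral_sensitivity(s) - foveal_sensitivity(s - shift)) ** 2
            for s in log_sizes
        )
        if err < best_err:
            best, best_err = shift, err
    return best

log_sizes = [i / 10 for i in range(-10, 11)]
candidates = [i / 100 for i in range(0, 101)]
shift = best_horizontal_shift(log_sizes, candidates)
print(round(shift, 2))  # recovers the simulated scaling factor, 0.6
```

Because the simulated peripheral curve here is a pure horizontal translate of the foveal curve, the grid search recovers the shift exactly; in the study, by contrast, a residual vertical misalignment at the largest sizes is what ruled out size scaling alone.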


Perception ◽  
10.1068/p3126 ◽  
2001 ◽  
Vol 30 (7) ◽  
pp. 795-810 ◽  
Author(s):  
Melanie C Doyle ◽  
Robert J Snowden

Can auditory signals influence the processing of visual information? The present study examined the effects of simple auditory signals (clicks and noise bursts) whose onset was simultaneous with that of the visual target, but which provided no information about the target. It was found that such a signal enhances performance in the visual task: the accessory sound reduced response times for target identification with no cost to accuracy. The spatial location of the sound (whether central to the display or at the target location) did not modify this facilitation. Furthermore, the same pattern of facilitation was evident whether observers fixated centrally or moved their eyes to the target. The results were not altered by changes in the contrast (and therefore visibility) of the visual stimulus or by the perceived utility of the spatial location of the sound. We speculate that the auditory signal may promote attentional ‘disengagement’ and that, as a result, observers are able to process the visual target sooner when sound accompanies the display than when visual information is presented alone.


2020 ◽  
Vol 34 (1) ◽  
pp. 17-47
Author(s):  
Minke J. de Boer ◽  
Deniz Başkent ◽  
Frans W. Cornelissen

Abstract: The majority of emotional expressions used in daily communication are multimodal and dynamic in nature. Consequently, one would expect that human observers utilize specific perceptual strategies to process emotions and to handle their multimodal and dynamic nature. However, our present knowledge of these strategies is scarce, primarily because most studies on emotion perception have not fully covered this variation, and instead used static and/or unimodal stimuli with few emotion categories. To resolve this knowledge gap, the present study examined how dynamic emotional auditory and visual information is integrated into a unified percept. Since there is a broad spectrum of possible forms of integration, both eye movements and accuracy of emotion identification were evaluated while observers performed an emotion identification task in one of three conditions: audio-only, visual-only video, or audiovisual video. In terms of adaptations of perceptual strategies, eye movement results showed a shift in fixations toward the eyes and away from the nose and mouth when audio was added. Notably, in terms of task performance, audio-only performance was generally significantly worse than video-only and audiovisual performance, but performance in the latter two conditions often did not differ. These results suggest that individuals flexibly and momentarily adapt their perceptual strategies to changes in the available information for emotion recognition, and these changes can be comprehensively quantified with eye tracking.


Author(s):  
Yidu Lu ◽  
Nadine Sarter

Creating safe human-machine systems requires that operators quickly notice changes in system reliability, in the interest of trust calibration and proper automation usage. Operators' readiness to trust a system is determined not only by the performance of the automation but also by their confidence in their own abilities. This study therefore compared the usefulness of feedback on the performance of each agent. The experiment required two groups of ten participants each to perform an automation-assisted target identification task with either “Automation Performance Feedback” (APF) or “Operator Performance Feedback” (OPF). Four scenarios differed in the degree and duration of changes in system reliability. Findings indicate that APF was more effective for supporting timely adjustments of perceived system reliability, especially for large and long-lasting reliability changes. Subjective trust ratings and performance were not affected, however, suggesting that these two factors are closely linked and more relevant for automation reliance.

