Trust in Automation
Recently Published Documents

Total documents: 126 (five years: 46)
H-index: 15 (five years: 4)

2022 · pp. 910-929
Author(s): Johannes Maria Kraus, Yannick Forster, Sebastian Hergeth, Martin Baumann

Trust calibration takes place prior to and during system interaction, based on the information available. In an online study, N = 519 participants were introduced to a conditionally automated driving (CAD) system and received different a priori information about the automation's reliability (low vs. high) and the brand of the CAD system (below-average vs. average vs. above-average reputation). Trust was measured three times during the study. Additionally, need for cognition (NFC) and other personality traits were assessed. Both heuristic brand information and reliability information influenced trust in automation. In line with the Elaboration Likelihood Model (ELM), participants with high NFC relied on the reliability information more than those with lower NFC. Among the personality traits, materialism, regulatory focus, and the perfect automation schema predicted trust in automation. These findings show that a priori information can influence a driver's trust in CAD and that such information is interpreted individually.
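
A minimal sketch of the kind of moderation analysis this abstract implies: a regression of trust on reliability information with NFC as a moderator. All variable names and the data below are invented for illustration; the original study's coding and model may differ.

```python
# Hypothetical moderation test: does need for cognition (NFC) moderate the
# effect of reliability information on reported trust? Fabricated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 519
df = pd.DataFrame({
    "reliability": rng.integers(0, 2, n),   # 0 = low info, 1 = high info
    "nfc": rng.normal(0, 1, n),             # standardized NFC score
})
# Simulated outcome: high-NFC participants weigh reliability info more.
df["trust"] = (3.0 + 0.6 * df.reliability
               + 0.4 * df.reliability * df.nfc
               + rng.normal(0, 1, n))

model = smf.ols("trust ~ reliability * nfc", data=df).fit()
print(model.summary())   # the reliability:nfc term carries the moderation
```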


2021
Author(s): Robert Hoffman, Shane T. Mueller, Gary Klein, Jordan Litman

Trust in automation is of concern in computer science and cognitive systems engineering, as well as in the popular media (e.g., Chancey et al., 2015; Hoff and Bashir, 2015; Hoffman et al., 2009; Huynh et al., 2006; Naone, 2009; Merritt and Ilgen, 2008; Merritt et al., 2013, 2015a; Pop et al., 2015; Shadbolt, 2002; Wickens et al., 2015; Woods and Hollnagel, 2006). Trust is of particular concern as more AI systems are being developed and tested (Schaefer et al., 2016).


2021 · pp. 265-273
Author(s): Jonathan Soon Kiat Chua, Hong Xu, Sun Woh Lye

2021 · Vol 2
Author(s): Sarah K. Hopko, Ranjana K. Mehta

Investigations into physiological and neurological correlates of trust have grown in popularity due to the need for a continuous measure of trust, whether for trust-sensitive or adaptive systems, for measuring the trustworthiness or pain points of a technology, or for human-in-the-loop cyber intrusion detection. Understanding the limitations and generalizability of physiological responses across technology domains is important, because the usefulness and relevance of results depend on fundamental characteristics of the domains, their corresponding use cases, and the socially acceptable behaviors of the technologies. Although investigations into the neural correlates of trust in automation have grown in popularity, understanding remains limited: the vast majority of current investigations concern cyber or decision-aid technologies, so the relevance of these correlates as a deployable measure for other domains, and the robustness of the measures to varying use cases, is unknown. This manuscript therefore discusses the current state of knowledge on trust perceptions, the factors that influence trust, and the extent to which the corresponding neural correlates of trust generalize between domains.


Author(s): Sarah K. Hopko, Ranjana K. Mehta, Anthony D. McDonald

The adoption and appropriate use of automated subsystems depends on acceptance of, trust in, and reliance on both the automated subsystem and the system as a whole. How trust attitudes differ between vehicle, robot, medical-device, and cyber-aid technologies, as affected by dispositional and learned factors, has not been studied. This paper therefore used an anonymous online survey to evaluate the contribution of these factors to trust in each technology. The results indicate that automation in medical devices is ranked as the most trusted, while the automation trust index is highest for automation in cyber aids, followed by medical devices. Vehicle and robot automation are the least trusted technologies by both measures. The largest contributors to the trust index were familiarity with the technology, the perceived importance and usefulness of the technology, and the propensity to trust automation. This study illustrates the importance of considering demographics, attitudes, and experience in trust studies.
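
As an illustration of how such a survey-based trust index might be scored and ranked by technology domain, here is a hedged sketch; the item names, scale, and responses are invented and need not match the paper's instrument.

```python
# Minimal sketch (invented items and data) of scoring a per-domain
# "automation trust index" from Likert-type items and ranking domains.
import pandas as pd

# Each row: one respondent's 1-7 ratings for one technology domain.
responses = pd.DataFrame({
    "domain": ["vehicle", "robot", "medical", "cyber", "medical", "cyber"],
    "item_reliable":   [3, 4, 6, 6, 5, 7],
    "item_dependable": [2, 3, 6, 7, 6, 6],
    "item_wary":       [6, 5, 2, 2, 3, 1],   # negatively keyed item
})
responses["item_wary"] = 8 - responses["item_wary"]   # reverse-code (1-7 scale)

items = ["item_reliable", "item_dependable", "item_wary"]
responses["trust_index"] = responses[items].mean(axis=1)
print(responses.groupby("domain")["trust_index"]
      .mean().sort_values(ascending=False))   # most- to least-trusted domain
```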


Author(s): Maya S. Luster, Brandon J. Pitts

In the field of Human Factors, the concept of trust in automation can help to explain how and why users interact with particular systems. One way to examine trust is through task performance and/or behavioral observations. Previous work has identified several system-related moderators of trust in automation, such as reliability and complexity. However, the effect of system certainty, i.e., the knowledge a machine has regarding its own decision-making abilities, on trust remains unclear. The goal of this study was to examine the extent to which system certainty affects perceived trust. Participants performed a partially simulated flight task and decided what action to take in response to targets in the environment detected by the aircraft's automation. The automation's certainty in recognizing targets was 30%, 50%, or 80%. Overall, participants accepted the system's recommendation regardless of the certainty level, and trust in the system increased as the system's certainty increased. The results may help to inform the development of future autonomous systems.
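
A hedged sketch of one way the reported certainty effect could be tested, comparing trust ratings across the three stated certainty levels with a one-way ANOVA; the data and effect sizes below are fabricated, and the original analysis may well differ.

```python
# Fabricated trust ratings under the three certainty conditions (30/50/80%).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
trust_30 = rng.normal(4.0, 1.0, 30)   # mean trust under 30% certainty
trust_50 = rng.normal(4.6, 1.0, 30)
trust_80 = rng.normal(5.4, 1.0, 30)

f, p = stats.f_oneway(trust_30, trust_50, trust_80)
print(f"F = {f:.2f}, p = {p:.4f}")    # trust rises with stated certainty
```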


Author(s): Daniela Miele, James Ferraro, Mustapha Mouloua

The goal of this study was to empirically examine the relationship between individuals' reported trust in automated driving features and their level of self-confidence when driving. The study used a series of vignettes depicting three different levels of automation in accordance with the SAE International Levels of Driving Automation: low (Level 1), moderate (Level 3), and high (Level 5) functioning autonomous driving features. A driving self-efficacy scale and a trust-in-automation scale were used to collect data about individuals' attitudes toward the automation. It was hypothesized that self-confidence and level of automation would be significantly related to operators' trust, and that the level of automation would significantly affect the amount of trust placed in the autonomous features. Results indicated significant relationships between self-confidence and trust, as well as between level of automation and trust.


Author(s): T. Anderson, K. Fogarty, H. Kenkel, J. Raisigel, S. Zhou, ...

Search and rescue missions are time-sensitive: their duration directly affects survivability. Unmanned aerial vehicles (UAVs) are increasingly shortening response times, accelerating area coverage, and informing resource allocation. However, interactions between UAVs and human operators pose challenges, for example, regarding understandability and trust in automation. This work seeks to facilitate human-machine teaming by designing an on-the-loop user experience with a constellation of UAVs as they narrow search areas by locating and triangulating mobile-phone signals using dynamic co-fields autonomy. First, an abstraction-decomposition hierarchy is built to represent the underlying values and requirements of the domain. Second, user interfaces are designed to reduce UAV and phone positional uncertainty over time, to monitor power, communications, and other information per asset, and to let the operator influence drone behavior; the design includes spatiotemporal representations of search areas, UAV positions, and communications signals, as well as notifications. Finally, a user evaluation was conducted with domain and usability experts.
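
The abstract does not detail the dynamic co-fields algorithm, so the sketch below is only a generic illustration of the triangulation idea: several UAVs localizing a phone from noisy range estimates (e.g., derived from signal strength) by least squares. All positions and the noise model are invented.

```python
# Generic trilateration sketch (not the paper's co-fields autonomy).
import numpy as np
from scipy.optimize import least_squares

uav_xy = np.array([[0.0, 0.0], [800.0, 100.0], [400.0, 900.0]])  # UAV positions (m)
phone = np.array([350.0, 420.0])                                  # ground truth (m)

rng = np.random.default_rng(2)
ranges = np.linalg.norm(uav_xy - phone, axis=1) + rng.normal(0, 15.0, 3)

def residuals(p):
    # Difference between predicted and measured UAV-to-phone distances.
    return np.linalg.norm(uav_xy - p, axis=1) - ranges

fit = least_squares(residuals, x0=np.array([400.0, 450.0]))
print("estimated phone position:", fit.x)   # near (350, 420) despite noise
```

Repeating this fit as new range measurements arrive would shrink the positional uncertainty over time, which is the quantity the interfaces described above are meant to visualize.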


Author(s): Claire Textor, Richard Pak

As automation continues to pervade people's lives, it is critical to understand why some interactions are successful while others fail. Previous research attempting to explain this variability in human-automation interaction (HAI) through individual differences in working memory has been mixed. Research in cognitive psychology has demonstrated the importance of attention control as a fundamental mechanism underlying higher-order cognition, and early work in automation has demonstrated a link between attention control and performance (Foroughi et al., 2019). The purpose of this exploratory study was to investigate the relationship between attention control and attitudes toward automation, particularly trust. Attention control was found to correlate with the propensity to trust and with negative attitudes toward robots. These results encourage further inquiry into the role of attention control in HAI.


Author(s): X. Jessie Yang, Christopher Schemanske, Christine Searle

Objective: We examine how human operators adjust their trust in automation as a result of their moment-to-moment interaction with it. Background: Most existing studies measured trust by administering questionnaires at the end of an experiment; only a limited number viewed trust as a dynamic variable that can strengthen or decay over time. Method: Seventy-five participants took part in an aided memory-recognition task. Participants viewed a series of images and later performed 40 trials of a recognition task, identifying a target image presented alongside a distractor. In each trial, participants performed the initial recognition by themselves, received a recommendation from an automated decision aid, and then performed the final recognition. After each trial, participants reported their trust on a visual analog scale. Results: Outcome bias and the contrast effect significantly influence operators' trust adjustments. An automation failure leads to a larger trust decrement if the final outcome is undesirable, and to a marginally larger decrement if the operator succeeds at the task on their own. An automation success engenders a greater trust increment if the operator fails the task. Additionally, automation failures have a larger effect on trust adjustment than automation successes. Conclusion: Human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation, and their adjustments are significantly influenced by decision-making heuristics and biases. Application: Understanding the trust-adjustment process enables accurate prediction of operators' moment-to-moment trust in automation and informs the design of trust-aware adaptive automation.
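
A toy model consistent with the asymmetries this abstract reports: failures cost more trust than successes gain, a bad final outcome deepens the decrement after an automation failure (outcome bias), and a success helps more when the operator had failed alone (contrast effect). The gain and loss magnitudes below are invented, not the paper's estimates.

```python
# Toy trial-by-trial trust update on a 0-100 visual-analog scale.
def update_trust(trust, automation_correct, human_correct, final_correct):
    """Return trust after one trial of the aided recognition task."""
    if automation_correct:
        gain = 5.0 if not human_correct else 2.0   # bigger gain when the human had failed
        return min(100.0, trust + gain)
    loss = 12.0 if not final_correct else 8.0      # outcome bias: bad outcomes cost more
    return max(0.0, trust - loss)

trust = 60.0
for auto_ok, human_ok, final_ok in [(True, False, True), (False, True, False)]:
    trust = update_trust(trust, auto_ok, human_ok, final_ok)
    print(round(trust, 1))   # 65.0 after a success, then 53.0 after a failure
```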

