

  1. In Automation we Trust? Identifying Factors that Influence Trust and Reliance in Automated and Human Decision Aids This material is based upon work supported in whole or in part with funding from the Laboratory for Analytic Sciences (LAS). Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the LAS and/or any agency or entity of the United States Government.

  2. Research Team
  • William A. Boettcher, Associate Professor, Political Science
  • Joseph M. Simons-Rudolph, Teaching Assistant Professor, Psychology
  • Roger C. Mayer, Professor of Management, Innovation & Entrepreneurship, Poole College of Management
  • Sean M. Streck, Researcher, Laboratory for Analytic Sciences
  • Christopher B. Mayhorn, Professor, Psychology
  • Carl J. Pearson & Allaire K. Welk, Ph.D. Students, Psychology

  3. The Problem: An Increase in the Prevalence of Automated Systems
  • As automated systems are integrated into tasks humans must perform, they alter the nature of the human's job.
  • They can require operators to allocate critical attentional resources to unexpectedly presented messages under conditions of high demand.
  • The unexpected information may not be presented in isolation: operators may receive information from both human and automated sources, and in some instances this information may conflict.

  4. Reliance on Information Sources
  • Humans tend to rely on available information while completing complex tasks.
  • But what happens when information is presented by both human and automated sources? And what happens if those information sources conflict?
  • This situation occurs more often than you might think. An example: in 2002, a Russian passenger jet and a cargo plane crashed in a mid-air collision (the Überlingen collision). The onboard automation told the two aircraft to change altitude in opposite directions, but so did an air traffic controller, and the two messages directly conflicted in the directions delivered to the pilots. One pilot followed the automation, the other followed the air traffic controller, and the planes collided.

  5. Automation vs. Human Reliance
  • Little work has empirically examined the factors that influence how humans prioritize and trust information from automated and human sources when both are available AND directly conflict.
  • One related study (Lyons & Stokes, 2012)...

  6. Inspiration
  Lyons and Stokes (2012) found that humans rely on human sources less in situations of high risk than in situations of low risk. Concerns with Lyons & Stokes (2012):
  • Risk was manipulated through the human source's consistency with the automated tool: the human recommended the route that the automation deemed most dangerous (within-subject design).
  • This within-subject inconsistency could itself affect trust and subsequent reliance.
  • Limited statistical power (n = 40)
  • No time pressure
  • Trust in the automation/human source was not measured

  7. The Current Study
  Goal: Generalize and expand previous work in the area.
  Research Question 1: How does the order of information presentation affect reliance on automation and human sources in a risky, complex decision-making task with time pressure?
  Research Question 2: How does perceived risk influence trust in automation and human sources within that task?
  Research Question 3: How does perceived workload influence trust in automation and human sources within that task?

  8. The Current Study: Overview
  Manipulation:
  • Information presentation: sequential vs. concurrent presentation of the human/automated sources
  Measures:
  • NASA TLX: subjective workload
  • Perceived risk: 7-point Likert scale
  • Interpersonal trust: trust in a human source
  • Automation trust: trust in an automated source
  • Reliance
  Participants:
  • 126 undergraduate participants
  • Mean age: 19 years
  • Gender: 66 males, 60 females

  9. Measures: NASA Task Load Index (TLX)
  • Used to measure the workload of a task across different dimensions
  • Developed at NASA and empirically validated by Hart & Staveland (1988); its six dimensions:
  – Mental Demand
  – Physical Demand
  – Temporal Demand
  – Performance
  – Effort
  – Frustration
  • Example question: "How much mental and perceptual activity was required?"
  • Each dimension is rated on a 15-point Likert scale, and the ratings are then combined into a composite score.
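  A minimal sketch of the composite computation, assuming the unweighted "Raw TLX" variant (the full Hart & Staveland procedure weights each dimension using pairwise comparisons); the function and rating keys below are illustrative, not taken from the study's materials:

      # Raw TLX sketch: composite workload as the mean of the six
      # subscale ratings (each on a 15-point scale in this study).
      # The weighted variant would first scale each rating by a weight
      # derived from 15 pairwise comparisons of the dimensions.
      DIMENSIONS = ["mental", "physical", "temporal",
                    "performance", "effort", "frustration"]

      def raw_tlx(ratings):
          """Return the unweighted composite workload score."""
          return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)

      print(raw_tlx({"mental": 12, "physical": 3, "temporal": 10,
                     "performance": 6, "effort": 11, "frustration": 8}))  # 8.33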

  10. Measures: Mayer Interpersonal Trust Scale
  • Used to measure interpersonal trust in human sources
  • Drawn from "An Integrative Model of Organizational Trust," developed by Mayer, Davis, & Schoorman (1995)
  • Four antecedents of trust:
  – Propensity to Trust
  – Ability
  – Benevolence
  – Integrity
  • Example question: "I feel very confident about the human's skills."
  • Agreement with each statement on a 5-point Likert scale
  • 21 questions; one question was dropped from the original scale for lack of relevance to our study

  11. Measures: Bisantz & Seong Automation Trust Scale
  • Drawn from "Assessment of operator trust in and utilization of automated decision-aids under different framing conditions," developed by Bisantz and Seong (2001)
  • Example question: "I can trust the system."
  – "System" was changed to "map" for relevance to our experiment
  • Responses recorded on a 5-point Likert scale
  • 10 questions
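  Both trust scales reduce each participant's responses to a single score. A minimal sketch of Likert-scale scoring as the mean of the items, assuming simple averaging on the 5-point scale; whether either instrument reverse-codes any items is an assumption to verify against the original scales:

      # Score a Likert scale as the mean of its item responses.
      # reverse_items holds the indices of reverse-coded items,
      # which are flipped as value -> (scale_max + 1) - value.
      def score_scale(responses, reverse_items=(), scale_max=5):
          adjusted = [(scale_max + 1 - r) if i in reverse_items else r
                      for i, r in enumerate(responses)]
          return sum(adjusted) / len(adjusted)

      # Example: a 10-item automation-trust response vector.
      print(score_scale([4, 5, 3, 4, 4, 2, 5, 4, 3, 4]))  # 3.8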

  12. Method: Decision-Making Task
  Participants must select a route for their military convoy from three possible options.
  • An automated tool provides a map containing information about past IED explosions and insurgent activity, illustrating one optimal route choice.
  • The human provides information that conflicts with the map and recommends a different route.

  13. Method: Decision-Making Task
  Instructions: "You will be performing as the leader of a vehicle convoy. Your mission is to deliver critical supplies to a nearby warehouse. Your task will be to select a delivery route. You will be shown a map displaying three delivery routes. The map will identify the location(s) of past IEDs (Improvised Explosive Devices), as well as areas of insurgent activity. You will also receive information from a local intelligence officer who will provide you with additional data about the area. Consider the three routes and select one. Make your decision as quickly as possible; you will have 60 seconds to complete this task."

  14. Method: Automated Decision Tool
  • Route choices are numbered
  • Red shaded areas represent past insurgent activity
  • Solid red marks are past IEDs

  15. Method: Human Decision Aid
  The human decision aid provides an explicit recommendation of a route choice.

  16. Method: Reliance Measure
  Which route do you select?
  • Route 1
  • Route 2 (rely on automation)
  • Route 3 (rely on human)
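  For analysis, the route choice can be recoded as a categorical reliance variable. A minimal sketch using the mapping given on this slide (Route 1 follows neither source); the function name is illustrative:

      # Map the selected route onto the reliance construct.
      RELIANCE = {1: "neither", 2: "automation", 3: "human"}

      def code_reliance(route):
          return RELIANCE[route]

      print(code_reliance(2))  # "automation"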

  17. Method: Procedure
  • Stimulus presentations (3 possible orders)
  • Route decision (1, 2, or 3)
  • Automation Trust
  • Interpersonal Trust
  • NASA TLX (workload)
  • Risk Likert scale
  • Time pressure throughout (60-second decision window)

  18. Results: Information Presentation & Reliance
  Research Question 1: Information presentation order did not significantly affect reliance.
  • Logistic regression: non-significant, p = .280
  • No significant differences in reliance between groups
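  A sketch of the kind of logistic regression reported here, assuming one row per participant in a pandas data frame; the file name and the columns relied_on_automation (1 = chose Route 2, 0 = chose Route 3) and presentation_order are hypothetical, not the study's actual variable names:

      import pandas as pd
      import statsmodels.formula.api as smf

      # Regress the binary reliance indicator on the between-subjects
      # presentation-order condition (treated as categorical).
      df = pd.read_csv("reliance_data.csv")
      model = smf.logit("relied_on_automation ~ C(presentation_order)",
                        data=df).fit()
      print(model.summary())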

  19. Results: Information Presentation & Trust
  Information presentation order did not significantly affect trust in the automation/human sources.
  • Multivariate analysis of variance (MANOVA)
  – IV: presentation order
  – DVs: interpersonal and automation trust
  • Non-significant multivariate effect: p = .403
  • Non-significant univariate effects: trust in automation, p = .388; interpersonal trust, p = .195
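  A sketch of the corresponding MANOVA in statsmodels, using the same hypothetical data frame and column names as the previous sketch:

      import pandas as pd
      from statsmodels.multivariate.manova import MANOVA

      df = pd.read_csv("reliance_data.csv")  # hypothetical layout, as above

      # IV: presentation order (categorical); DVs: the two trust scores.
      manova = MANOVA.from_formula(
          "automation_trust + interpersonal_trust ~ C(presentation_order)",
          data=df)
      print(manova.mv_test())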

  20. Results: Interpersonal Trust
  Research Question 2: Perceived risk positively predicted trust in the human source. As perceived risk increased, trust in the human decision aid increased.

  21. Results: Trust in Automation
  Research Question 3: Perceived workload negatively predicted trust in automation. As workload increased, trust in the automated decision aid decreased.
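  The findings on this slide and the previous one can be expressed as two simple regressions. A sketch assuming ordinary least squares and the hypothetical column names perceived_risk, tlx_workload, interpersonal_trust, and automation_trust; the slides do not state the exact model specifications:

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("reliance_data.csv")  # hypothetical layout, as above

      # RQ2: perceived risk -> trust in the human source (positive slope).
      risk_model = smf.ols("interpersonal_trust ~ perceived_risk", data=df).fit()

      # RQ3: subjective workload -> trust in automation (negative slope).
      load_model = smf.ols("automation_trust ~ tlx_workload", data=df).fit()

      print(risk_model.params["perceived_risk"],
            load_model.params["tlx_workload"])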

  22. Results: Interpersonal Trust Antecedents
  To further investigate which elements of interpersonal trust were influenced, we analyzed the trust antecedents separately.
  Predictor variables:
  • Dispositional trust (individual difference)
  • Perceived risk
  • Workload
  Four antecedents (outcome variables):
  • Propensity to Trust
  • Ability
  • Benevolence
  • Integrity
  Perceived risk and dispositional trust significantly and positively predicted every trust antecedent.
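  A sketch of the per-antecedent analyses, assuming each antecedent subscore is stored as its own column and that each was regressed on the three predictors named above; all column names are hypothetical:

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("reliance_data.csv")  # hypothetical layout, as above

      # One regression per trust antecedent, each with the same predictors.
      for outcome in ["propensity_to_trust", "ability",
                      "benevolence", "integrity"]:
          fit = smf.ols(
              f"{outcome} ~ dispositional_trust + perceived_risk + tlx_workload",
              data=df).fit()
          print(outcome, fit.pvalues.round(3).to_dict())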

  23. Conclusions
  • Presentation order of the information sources affected neither trust nor reliance.
  • Increased workload negatively affected trust in automation.
  • Increased risk positively affected trust in the human.
  • Dispositional trust and perceived risk consistently predicted the interpersonal trust antecedents (ability, integrity, benevolence, propensity to trust).
  • Trust and reliance are two distinct constructs with unique predictors.
