Quantifying Suboptimal Automation Use

Author: Megan Bartlett

Bartlett, Megan, 2018. Quantifying Suboptimal Automation Use, Flinders University, College of Education, Psychology and Social Work.

Terms of Use: This electronic version is (or will be) made publicly available by Flinders University in accordance with its open access policy for student theses. Copyright in this thesis remains with the author. You may use this material for uses permitted under the Copyright Act 1968. If you are the owner of any included third-party copyright material, or you believe that any material has been made available without permission of the copyright owner, please contact copyright@flinders.edu.au with the details.

Abstract

Advances in sensors and information processing algorithms offer the prospect of powerful automated decision aids to assist human operators in fields such as transportation security, military operations, and medical diagnosis. A decision aid can improve human performance, however, only if the user acts appropriately on its advice. The current thesis investigated elements of automation interface design, individual differences, decision strategies, and training protocols that shape human-automation interaction. The thesis comprises four papers (two published, one under review, and one in preparation).

The first study investigated the differential effects of automation misses and false alarms on operator behavior, asking whether these effects reflect a tendency for operators to prefer automation whose response bias matches their own. Participants performed a simulated baggage screening task either unassisted or with assistance from a 95%-reliable automated decision aid. The response biases of the automation and the participant were manipulated orthogonally. Contrary to earlier findings, the data gave no evidence that false alarms from the aid compromised human performance more than misses did, suggesting that the false-alarm effect is less robust than previously reported.
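To make the response-bias manipulation concrete, the sketch below (not the thesis’s analysis code) shows how sensitivity and response bias are conventionally quantified under an equal-variance Gaussian signal detection model; it assumes Python with scipy, and the hit and false-alarm rates are illustrative values only.

    # Minimal sketch: sensitivity (d') and response bias (criterion c) from
    # hit and false-alarm rates under equal-variance Gaussian SDT.
    from scipy.stats import norm

    def sdt_measures(hit_rate, false_alarm_rate):
        """Return (d_prime, criterion_c) for one observer or one automated aid."""
        z_h = norm.ppf(hit_rate)          # z-transform of hit rate
        z_f = norm.ppf(false_alarm_rate)  # z-transform of false-alarm rate
        d_prime = z_h - z_f               # discrimination sensitivity
        criterion = -0.5 * (z_h + z_f)    # bias: > 0 conservative, < 0 liberal
        return d_prime, criterion

    # Example: a liberal (false-alarm-prone) aid vs. a conservative (miss-prone)
    # aid, both roughly 95% reliable on a balanced task.
    print(sdt_measures(0.99, 0.09))   # liberal aid: negative criterion
    print(sdt_measures(0.91, 0.01))   # conservative aid: positive criterion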

The second study measured the efficiency of operators’ automation use, comparing automation-aided performance with the predictions of various statistical models of collaborative decision making. Across three experiments, participants performed a binary signal detection task either unassisted or with assistance from a 93%-reliable automated decision aid. As anticipated, assistance from the aid improved discrimination performance. However, aided performance was consistently and highly suboptimal, hewing closest to the predictions of some of the least efficient collaborative models. Performance was similar whether the aid provided binary cues or more informative graded judgments.
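As a point of reference, the hedged sketch below implements two benchmarks that are standard in the collaborative signal-detection literature; the thesis’s specific model set may differ, and the sensitivities used here are illustrative rather than values from the experiments.

    # Two standard benchmarks for a human-automation team: an ideal integrator
    # that optimally combines two statistically independent judges, and a less
    # efficient "best decides" strategy that defers to the more sensitive judge.
    import math

    def d_prime_optimal(d_human, d_aid):
        # Optimal combination of independent equal-variance Gaussian observers.
        return math.sqrt(d_human**2 + d_aid**2)

    def d_prime_best_decides(d_human, d_aid):
        # Team does no better than its more sensitive member.
        return max(d_human, d_aid)

    d_h, d_a = 1.5, 2.1                    # illustrative sensitivities
    print(d_prime_optimal(d_h, d_a))       # ~2.58: the optimal benchmark
    print(d_prime_best_decides(d_h, d_a))  # 2.1: a markedly less efficient benchmark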

The third study investigated whether manipulating the format in which the aid’s cues were rendered would improve automation-aided performance. Across three experiments, participants performed the same binary signal detection task as in the second study, with judgments from the aid rendered as raw signal levels, confidence ratings, likelihood ratios, verbal descriptors, or verbal-spatial descriptors. Although assistance from the aid again improved participants’ discrimination performance, aided performance remained highly inefficient regardless of the format of the aid’s judgments.
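The sketch below illustrates, under an assumed equal-variance Gaussian stimulus model, how a single raw signal level can be re-expressed as a likelihood ratio, i.e., how the same evidence might be rendered in different cue formats; the distribution parameters are hypothetical and are not taken from the thesis.

    # Re-expressing a raw signal level as a likelihood ratio under an
    # equal-variance Gaussian model.
    from scipy.stats import norm

    MU_NOISE, MU_SIGNAL, SIGMA = 0.0, 2.0, 1.0   # hypothetical generating distributions

    def likelihood_ratio(x):
        """P(x | signal) / P(x | noise) for a single observed signal level x."""
        return norm.pdf(x, MU_SIGNAL, SIGMA) / norm.pdf(x, MU_NOISE, SIGMA)

    x = 1.3                                       # an example raw signal level
    print(f"raw level: {x}, likelihood ratio: {likelihood_ratio(x):.2f}")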

The final study sought to generalize the above effects across varying levels of aid reliability. Participants performed the same task as in the second and third studies, but with an aid of either 60%, 85%, or 96% reliability. Assistance from an 85%- or 96%-reliable decision aid, relative to assistance from a 60%-reliable aid, improved automation-aided sensitivity. Interestingly, automation-aided sensitivity approached optimal levels as aid reliability decreased, but only because the optimal strategy converges on ignoring the aid’s judgments as its reliability falls toward chance, so the optimal benchmark itself shrinks toward unaided performance.
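The back-of-the-envelope sketch below, again assuming an equal-variance Gaussian model and a purely illustrative unaided sensitivity, shows why a near-chance aid has little to offer: mapping each reliability level to an aid sensitivity and computing the optimal combined sensitivity, the potential gain over unaided performance nearly vanishes at 60% reliability.

    # For an unbiased observer on a balanced binary task, percent correct maps
    # to sensitivity via d' = 2 * z(PC); the optimal combined sensitivity with
    # an independent human barely exceeds unaided sensitivity when the aid's
    # d' is small.
    import math
    from scipy.stats import norm

    def d_prime_from_pc(pc):
        # Unbiased equal-variance observer on a balanced two-alternative task.
        return 2 * norm.ppf(pc)

    D_HUMAN = 1.5  # illustrative unaided sensitivity, not a value from the thesis
    for pc in (0.60, 0.85, 0.96):
        d_aid = d_prime_from_pc(pc)
        d_opt = math.sqrt(D_HUMAN**2 + d_aid**2)
        print(f"aid {pc:.0%}: d'_aid = {d_aid:.2f}, optimal aided d' = {d_opt:.2f}")
    # Output: 60% -> d'_aid ~0.51, optimal ~1.58 (scarcely above 1.5);
    #         96% -> d'_aid ~3.50, optimal ~3.81 (a large potential gain).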

Keywords: Automation, Human Operator, Human-Automation Interaction, Use, Agreement, Trust, Payoffs, Signal Detection Theory, Decision-Making Strategies, Cues, Information Integration, Reliability.

Subject: Psychology thesis

Thesis type: Doctor of Philosophy
Completed: 2018
School: College of Education, Psychology and Social Work
Supervisor: Professor Jason McCarley