To gain user acceptance and realize a system's intended benefits, users must be able to trust the automation to perform its intended functions.

Automation users should strike a balance: trusting the automation enough to use it, but not so much that they stop paying attention to the situation. How can such a balance be achieved in operations? Automation that is reliable, accurate, and designed with the involvement of its users should be able to earn user trust.

Why It's Important

User overreliance on an automation system (too much trust) may reduce effective monitoring and diminish the user's ability to intervene manually when needed. When users have become overly reliant on automation, overall system safety can be adversely impacted because the human safety barrier has become less effective. Conversely, low user trust in an automation system may lead to disuse of the system and thereby the loss of the automation's intended benefits. User workload may also increase when users must manually complete the tasks the automation was designed to perform.

Design and Implementation Considerations

A user 'trust balance' can be achieved through a holistic approach to system design, operational procedures, and training. High rates of false alarms and nuisance alarms reduce user trust in a system. When users find an automation system intuitive and it reliably helps them accomplish their operational goals, they are more likely to trust it. Including users early in the requirements development process is also likely to boost trust in the system, because users who help shape the automation tend to place more trust in its designers and goals.
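One way to see why high false-alarm rates undermine trust is the base-rate effect on alert credibility: when the hazardous condition an alarm monitors is rare, even an accurate alarm produces mostly false alerts, so users learn to discount it. A minimal sketch using Bayes' rule (all numbers here are hypothetical, chosen only for illustration):

```python
def alert_ppv(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """Positive predictive value of an alert: P(hazard | alert).

    base_rate        -- P(hazard), how often the condition actually occurs
    hit_rate         -- P(alert | hazard), the alarm's sensitivity
    false_alarm_rate -- P(alert | no hazard)
    """
    true_alerts = base_rate * hit_rate
    false_alerts = (1.0 - base_rate) * false_alarm_rate
    return true_alerts / (true_alerts + false_alerts)

# A sensitive alarm (99% hit rate) with a seemingly modest 5% false-alarm
# rate, monitoring a condition present in only 1 of 1000 operations:
ppv = alert_ppv(base_rate=0.001, hit_rate=0.99, false_alarm_rate=0.05)
print(f"{ppv:.1%}")  # about 1.9% -- the vast majority of alerts are false
```

Under these assumed rates, fewer than 1 in 50 alerts signals a real hazard, which helps explain why users come to treat frequent alarms as nuisances rather than warnings.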

Further Reading on User Trust

Trust in Automation: Designing for Appropriate Reliance – John D. Lee and Katrina A. See

Human Factors Design Guide Update (CT9601): A Revision to Chapter 5, Automation Guidelines – Vicki Ahlstrom, Kelly Longo, & Todd Truitt

NextGen Automation Requirements Categories – T.J. Sharkey, L.L. Loomis, & P.A. Lakinsmith