As automation systems provide users with increasing levels of support, the line between what the user is responsible for and what the automation is responsible for can begin to blur. If the user is held responsible for safe outcomes in the operation, then the user needs an appropriate level of control over the automation, and the automation must communicate the information the user needs to assess and validate its outputs. How many decision-making tasks and control actions can automation take over while the user is still held responsible for safe outcomes?
In some automation systems, users retain ultimate responsibility for safety even though the automation makes the majority of safety-critical decisions. Users may not succeed as the ultimate safety barrier if increased levels of automation have reduced their overall situational awareness. This is especially true when the automation fails or when an unusual operational situation exceeds the automation's capabilities.
Automation designers should ensure that the decision to automate a decision task or a control task is evaluated with respect to its consequences for human performance in the operational environment. Safety-critical automation should always communicate to the user what it is doing and, sometimes more importantly, what it is not doing. Training and operational procedures should clearly define the roles and responsibilities of the human users. Users should understand the limitations and failure modes of the automation, and they should have enough practice and confidence with manual operations to act as an effective safety backup.
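As a rough illustration only, not drawn from any of the systems referenced here, the Python sketch below shows one way an automation component might annunciate which functions it is performing, which it has stopped performing, and why, so the user knows when tasks have reverted to manual control. All names (`AutomationStatus`, `drop`, the function labels) are hypothetical.

```python
from enum import Enum

class Mode(Enum):
    AUTOMATED = "automated"
    DEGRADED = "degraded"
    MANUAL = "manual"

class AutomationStatus:
    """Tracks which functions the automation is (and is not) performing,
    so the user interface can annunciate both."""

    def __init__(self, functions):
        # All functions the automation *could* perform, each initially inactive.
        self.functions = {name: False for name in functions}
        self.mode = Mode.MANUAL

    def engage(self, name):
        self.functions[name] = True
        self.mode = Mode.AUTOMATED
        self.annunciate(f"{name} is now automated")

    def drop(self, name, reason):
        """Called when the automation stops performing a function,
        e.g. because inputs are outside its design envelope."""
        self.functions[name] = False
        self.mode = Mode.DEGRADED if any(self.functions.values()) else Mode.MANUAL
        # Annunciate what the automation is NOT doing, and why, so the
        # user knows which tasks have reverted to them.
        self.annunciate(f"{name} is no longer automated ({reason}); manual action required")

    def annunciate(self, message):
        # Stand-in for a real alerting channel (display, aural alert, datalink).
        print(f"[{self.mode.value.upper()}] {message}")


status = AutomationStatus(["conflict_probe", "datablock_tracking"])
status.engage("conflict_probe")
status.engage("datablock_tracking")
status.drop("datablock_tracking", "track data quality below threshold")
```

The point of the sketch is the explicit `drop` path: functions the automation silently stops performing are the ones most likely to erode the user's situational awareness.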
| Story Title | Source |
|---|---|
| Common Automated Radar Terminal System (CARTS) Datablock Drop and Ghosting Issues | FAA |
| Tower Data Link Services (TDLS) | FAA |
| User Request and Evaluation Tool (URET) | FAA |