Supporting Trust Calibration and Attention Management in Human-Machine Teams

Trust plays a critical role in human-machine teaming. Poor trust calibration, i.e., a lack of correspondence between a person's trust in a system and the system's actual capabilities, leads to inappropriate reliance on, or rejection of, the technology. Trust also affects attention management and the monitoring of highly autonomous systems. Overtrust results in excessive neglect time (the time the machine agent operates without human intervention), while undertrust leads operators to spend too much time supervising a system, at the cost of performing other tasks. Inappropriate trust levels, and the resulting breakdowns in attention control and resource allocation, represent major challenges for human-machine collaboration.

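As a rough illustration of this correspondence, the short Python sketch below labels an operator's trust as calibrated, overtrust, or undertrust by comparing a reported trust rating against the system's measured reliability. The function name, the tolerance threshold, and the 0-1 scales are illustrative assumptions, not part of the project description.

    # Hypothetical sketch: compare a subjective trust rating with measured
    # system reliability on a common 0-1 scale. Names, thresholds, and scales
    # are assumptions for illustration only.

    def classify_trust_calibration(trust: float, reliability: float,
                                   tolerance: float = 0.1) -> str:
        """Label the trust-reliability gap.

        trust       -- operator's reported trust in the system (0 = none, 1 = full)
        reliability -- observed fraction of tasks the system handles correctly
        tolerance   -- gap still treated as acceptably calibrated
        """
        gap = trust - reliability
        if gap > tolerance:
            return "overtrust"    # risk: excessive neglect time, missed failures
        if gap < -tolerance:
            return "undertrust"   # risk: excessive monitoring, neglected own tasks
        return "calibrated"


    if __name__ == "__main__":
        # An operator who trusts an 80%-reliable system at 0.95 is overtrusting.
        print(classify_trust_calibration(trust=0.95, reliability=0.80))  # overtrust
        print(classify_trust_calibration(trust=0.55, reliability=0.80))  # undertrust
        print(classify_trust_calibration(trust=0.78, reliability=0.80))  # calibrated
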
Posted on November 27, 2019