ABSTRACT
Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.
INTRODUCTION
Sophisticated automation is becoming ubiquitous, appearing in work environments as diverse as aviation, maritime operations, process control, motor vehicle operation, and information retrieval. Automation is technology that actively selects data, transforms information, makes decisions, or controls processes. Such technology has tremendous potential to extend human performance and improve safety; however, recent disasters indicate that it is not uniformly beneficial. On the one hand, people may trust automation even when such trust is not appropriate. Pilots, trusting the ability of the autopilot, failed to intervene and take manual control even as the autopilot crashed the Airbus A320 they were flying (Sparaco, 1995). In another instance, an automated navigation system malfunctioned and the crew failed to intervene, allowing the Royal Majesty cruise ship to drift off course for 24 hours before it ran aground (Lee & Sanquist, 2000; National Transportation Safety Board, 1997). On the other hand, people are not always willing to place sufficient trust in automation. Some operators rejected automated controllers in paper mills, undermining the potential benefits of the automation (Zuboff, 1988). As automation becomes more prevalent, poor partnerships between people and automation will become increasingly costly and catastrophic.
Such flawed partnerships between automation and people can be described in terms of misuse and disuse of automation...