Computerized system monitors and decision aids are increasingly common additions to critical decision-making contexts such as intensive care units, nuclear power plants and aircraft cockpits. These aids are introduced with the ubiquitous goal of "reducing human error".
Accountability and automation bias
Authors: Linda J. Skitka; Kathleen Mosier; Mark D. Burdick
- Publisher: Elsevier Science
- Year: 2000
- Language: English
- File size: 210 KB
- Volume: 52
- Category: Article
- ISSN: 1071-5819
Synopsis
Although generally introduced to guard against human error, automated devices can fundamentally change how people approach their work, which in turn can lead to new and different kinds of error. The present study explored the extent to which errors of omission (failures to respond to system irregularities or events because automated devices fail to detect or indicate them) and commission (when people follow an automated directive despite contradictory information from other more reliable sources of information because they either fail to check or discount that information) can be reduced under conditions of social accountability. Results indicated that making participants accountable for either their overall performance or their decision accuracy led to lower rates of "automation bias". Errors of omission proved to be the result of cognitive vigilance decrements, whereas errors of commission proved to be the result of a combination of a failure to take into account information and a belief in the superior judgement of automated aids.
© 2000 Academic Press