Human error


Human error is an action that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". Human error has been cited as a primary cause and contributing factor in disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine. Prevention of human error is generally seen as a major contributor to the reliability and safety of systems. Human error is one of the many contributing causes of risk events.

Definition

Human error refers to something having been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits". In short, it is a deviation from intention, expectation, or desirability. Logically, human actions can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate; or the plan can be satisfactory, but the performance can be deficient. However, a mere failure is not an error if there was no plan to accomplish something in particular.

Performance

Human error and performance are two sides of the same coin: "human error" mechanisms are the same as "human performance" mechanisms, and performance is categorized as "error" only in hindsight. Actions later termed "human error" are therefore actually part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behaviour. Situational awareness is needed to recognize a potential danger and thus be able to correct it. While human error is firmly entrenched in the classical approaches to accident investigation and risk assessment, it has no role in newer approaches such as resilience engineering.

Categories

There are many ways to categorize human error:
  • exogenous versus endogenous error
  • situation assessment versus response planning, and related distinctions in:
    • error in problem detection
    • error in problem diagnosis
    • error in action planning and execution
  • by level of analysis; for example, perceptual versus cognitive versus communication versus organizational
  • physical manipulation error:
    • 'slips', occurring when a physical action fails to achieve its immediate objective
    • 'lapses', involving a failure of memory or recall
  • active error – an observable, physical action that changes equipment, system, or facility state, resulting in immediate undesired consequences
  • latent human error – a hidden organization-related weakness or equipment flaw that lies dormant; such errors can go unnoticed at the time they occur, having no immediate apparent outcome
  • equipment dependency error – lack of vigilance due to the assumption that hardware controls or physical safety devices will always work
  • team error – lack of vigilance created by the social interaction between two or more people working together
  • personal dependencies error – unsafe attitudes and traps of human nature leading to complacency and overconfidence
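As a purely illustrative aside, part of the taxonomy above can be sketched as a small data structure. The class and function names here are hypothetical, chosen for this example; the distinction encoded is the one stated above, that slips and active errors have immediate apparent consequences while lapses and latent errors may lie dormant.

```python
from enum import Enum

class HumanErrorKind(Enum):
    """Hypothetical labels for four of the categories listed above."""
    SLIP = "physical action fails to achieve its immediate objective"
    LAPSE = "failure of memory or recall"
    ACTIVE = "observable action with an immediate undesired consequence"
    LATENT = "dormant weakness with no immediate apparent outcome"

# Per the list above, active errors and slips manifest at once;
# latent errors and lapses can go unnoticed when they occur.
IMMEDIATELY_APPARENT = {HumanErrorKind.SLIP, HumanErrorKind.ACTIVE}

def immediately_apparent(kind: HumanErrorKind) -> bool:
    """Return True for error kinds whose consequences surface right away."""
    return kind in IMMEDIATELY_APPARENT
```

A sketch like this is only a mnemonic for the distinctions; real investigations classify errors from evidence, not from labels assigned in advance.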

Controversies

Some researchers have argued that the dichotomy of human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon. A focus on the variability of human performance, and on how human operators can manage that variability, may be a more fruitful approach. Newer approaches, such as the resilience engineering mentioned above, highlight the positive roles that humans can play in complex systems. In resilience engineering, successes and failures are seen as having the same basis, namely human performance variability. A specific account of that is the efficiency–thoroughness trade-off principle, which can be found at all levels of human activity, in individuals as well as in groups.