Module 4 - Human Error

OBJECTIVE:

The participant will be able to describe the Human Factors Analysis and Classification System (HFACS).

TRAINING TIME:

30 Minutes

KEY TEACHING POINTS:

  • ERROR CLASSIFICATION
  • LATENT CONDITIONS
  • HFACS
  • COUNTERMEASURES

[Slide #1 at opening] LINK

Earlier we spoke about organizational factors. An appreciation of the influence that organizations have on human performance is beginning to take root in aviation circles. Building on the work of James Reason, Scott Shappell and Douglas Wiegmann developed the Human Factors Analysis and Classification System (HFACS).

AIM

The purpose of this module is to familiarize you with the HFACS model.

MOTIVATION

The HFACS model sheds light on the factors and conditions that can lead to accidents. It provides us with insight into the role played by various actors in contributing to an organizational accident. This insight, it is hoped, will help you recognize organizational symptoms so that you can then develop appropriate countermeasures.

[Slide #2] OUTLINE

  1. Error Classification
  2. Latent Conditions
  3. Countermeasures

[Slide #3]

Human Factors Analysis and Classification System (HFACS), by Scott Shappell and Douglas Wiegmann

1.  Error Classification

Drawing upon the work of Dr. James Reason (1990), Scott Shappell and Douglas Wiegmann developed the Human Factors Analysis and Classification System (HFACS) where human error can be described at each of four levels:

  1. Unsafe acts of aircrew;
  2. Preconditions for unsafe acts;
  3. Unsafe supervision; and
  4. Organizational influences.

Unsafe acts of aircrew

The unsafe acts of pilots can be loosely classified into one of two categories:

  • Errors; and
  • Violations.

While both are common within most settings, they differ markedly when the rules and regulations of an organization are considered. That is, errors can be described as those "legal" activities that fail to achieve their intended outcome, while violations are commonly defined as behaviour that represents a willful disregard for the rules and regulations.

Three types of errors are identified:

  • Decision;
  • Skill-based; and
  • Perceptual.

Violations, on the other hand, are classified as either:

  • Routine; or
  • Exceptional.

Errors

One of the more common error forms, decision errors, represents conscious, goal-intended behaviour that proceeds as designed; yet, the plan proves inadequate or inappropriate for the situation. Often referred to as "honest mistakes," these unsafe acts typically manifest as poorly executed procedures, improper choices, or simply the misinterpretation or misuse of relevant information.

In contrast to decision errors, the second error form, skill-based errors, occurs with little or no conscious thought. Just as little thought goes into turning one's steering wheel or shifting gears in an automobile, basic flight skills such as stick and rudder movements and visual scanning often occur without thinking. The difficulty with these highly practised and seemingly automatic behaviours is that they are particularly susceptible to attention and/or memory failures. As a result, skill-based errors, such as the breakdown in visual scan patterns, inadvertent activation/deactivation of switches, forgotten intentions, and omitted items in checklists, often appear. Even the manner (or skill) with which one flies an aircraft (aggressive, tentative, or controlled) can affect safety.

While decision and skill-based errors have dominated most accident investigations, the third and final error form, perceptual errors, has received comparatively less attention. No less important, perceptual errors occur when sensory input is degraded, or "unusual," as is often the case when flying at night, in the weather, or in other visually impoverished environments. Faced with acting on imperfect or incomplete information, pilots run the risk of misjudging distances, altitude, and descent rates, as well as responding incorrectly to a variety of visual/vestibular illusions.

Violations

Although there are many ways to distinguish among types of violations, two distinct forms have been identified. The first, routine violations, tends to be habitual by nature and is often enabled by a system of supervision and management that tolerates such departures from the rules. Often referred to as "bending the rules," the classic example is that of the individual who consistently drives his/her automobile 5-10 kph faster than allowed by law. While clearly against the law, the behaviour is, in effect, sanctioned by local authorities, who often will not enforce the law until speeds in excess of 10 kph over the posted limit are observed.

[Slide #4] Exceptional violations, on the other hand, are isolated departures from authority, neither typical of the individual nor condoned by management. For example, while authorities might condone driving 110 kph in a 100-kph zone, driving 110 kph in a posted 30-kph school zone certainly would not be condoned. It is important to note that while most exceptional violations are appalling, they are not considered "exceptional" because of their extreme nature. Rather, they are regarded as exceptional because they are neither typical of the individual nor condoned by authority.

[Slide #5] 2.  Latent Conditions

According to Shappell and Wiegmann, the unsafe acts of aircrew are shaped by three broader sets of latent conditions:

  • Preconditions for unsafe acts;
  • Unsafe supervision; and
  • Organizational influences.

Preconditions for unsafe acts

Simply focusing on unsafe acts, however, is like focusing on a patient's symptoms without understanding the underlying disease state that caused them. As such, investigators must dig deeper into the preconditions for unsafe acts.

Two major subdivisions of the preconditions for unsafe acts are:

  • Substandard conditions of the aircrew; and
  • Substandard practices of the aircrew.

Substandard conditions of the aircrew

Being prepared mentally is critical in nearly every endeavour; even more so in aviation. With this in mind, the first of three categories, adverse mental states, was created to account for those mental conditions that adversely affect performance. Principal among these are the loss of situational awareness, mental fatigue, circadian dysrhythmia, and pernicious attitudes, such as overconfidence, complacency, and misplaced motivation, that negatively affect decisions and contribute to unsafe acts.

Equally important, however, are those adverse physiological states that preclude the safe conduct of flight. Particularly important to aviation are conditions such as spatial disorientation, visual illusions, hypoxia, illness, intoxication, and a whole host of pharmacological and medical abnormalities known to affect performance. For example, it is not surprising that, when aircrews become spatially disoriented and fail to rely on flight instrumentation, accidents can, and often do, occur.

Physical and/or mental limitations of the pilot, the third and final category of substandard condition, include those instances when necessary sensory information is either unavailable or, if available, individuals simply do not have the aptitude, skill, or time to safely deal with it. For aviation, the classic example is flying at night, when the ability to detect an object depends on its size and/or contrast in the visual field. However, there are many times when a situation requires such rapid mental processing or reaction time that the time allotted to remedy the problem exceeds human limits (as is often the case during nap-of-the-earth flight). Nevertheless, even when favourable visual cues or an abundance of time is available, there are instances when an individual simply may not possess the necessary aptitude, physical ability, or proficiency to operate safely.

Substandard practices of the aircrew

Oftentimes, the substandard practices of aircrew will lead to the conditions and unsafe acts described earlier. For instance, the failure to ensure that all members of the crew are acting in a co-ordinated manner can lead to confusion (adverse mental state) and poor decisions in the cockpit. Crew resource mismanagement covers intra-cockpit communication as well as communication with ATC and other support personnel; it occurs when crew members do not work together as a team, or when individuals directly responsible for the conduct of operations fail to co-ordinate activities before, during, and after a flight.

Equally important, however, individuals must ensure that they are adequately prepared for the flight. Consequently, the category of personal readiness was created to account for those instances when rules are not adhered to, such as disregarding crew rest requirements, violating alcohol restrictions, or self-medicating. However, even behaviours that do not necessarily violate existing rules or regulations (e.g., running ten miles before piloting an aircraft or not observing good dietary practices) may reduce the operating capabilities of the individual and are, therefore, captured here.

Unsafe supervision

Clearly, aircrews are responsible for their actions and, as such, must be held accountable. However, in many instances, they are the unwitting inheritors of latent failures attributable to those who supervise them. These failures include inadequate supervision, planned inappropriate operations, failure to correct known problems, and supervisory violations.

Inadequate supervision refers to failures within the supervisory chain of command that were a direct result of some supervisory action or inaction. In other words, supervisors must at least provide the opportunity for individuals to succeed. It is expected, therefore, that individuals will receive adequate training, professional guidance, oversight, and operational leadership, and that all will be managed appropriately. When this is not the case, aircrews are often left isolated, and the risk associated with day-to-day operations invariably increases.

However, the risk associated with supervisory failures can come in many forms. Occasionally, for example, the operational tempo and/or schedule is planned such that individuals are put at unacceptable risk and, ultimately, performance is adversely affected. As such, planned inappropriate operations account for all aspects of improper or inappropriate crew scheduling and operational planning, which may focus on such issues as crew pairing, crew rest, and managing the risk associated with specific flights.

The remaining two categories of unsafe supervision, the failure to correct known problems and supervisory violations, are similar, yet considered separately. The failure to correct known problems refers to those instances when deficiencies among individuals, equipment, training, or other related safety areas are known to the supervisor but are allowed to continue uncorrected. For example, the failure to consistently correct or discipline inappropriate behaviour certainly fosters an unsafe atmosphere but is not considered a violation if no specific rules or regulations were broken.

Supervisory violations, on the other hand, are reserved for those instances when supervisors willfully disregard existing rules and regulations while managing assets. For instance, permitting aircrews to operate aircraft without current qualifications or licences is a flagrant violation that invariably sets the stage for the tragic sequence of events that predictably follows.

Organizational influences

Fallible decisions of upper-level management can directly affect supervisory practices as well as the conditions and actions of the crews. Unfortunately, these organizational influences often go unnoticed or unreported by even the best-intentioned accident investigators.

These latent organizational failures generally revolve around three issues:

  • Resource management;
  • Organizational climate; and
  • Operational processes.

Resource management refers to the management, allocation, and maintenance of organizational resources, including human resource management (selection, training, staffing), monetary safety budgets, and equipment design (ergonomic specifications). In general, corporate decisions about how such resources should be managed centre around two distinct objectives — safety and on-time, cost-effective operations. In times of prosperity, both objectives can be easily balanced and satisfied in full. However, there may also be times of fiscal austerity that demand some give and take between the two. Unfortunately, history tells us that safety is often the loser in such battles because safety and training are often the first to be cut in organizations experiencing financial difficulties.

Organizational climate refers to a broad class of organizational variables that influence workers' performance and is defined as the situationally based consistencies in the organization's treatment of individuals. One telltale sign of an organization's climate is its structure, as reflected in the chain of command, delegation of authority and responsibility, communication channels, and formal accountability for actions. Just like in the cockpit, communication and co-ordination are vital within an organization. However, an organization's policies and culture are also good indicators of its climate. Consequently, when policies are ill-defined, adversarial, or conflicting, or when they are supplanted by unofficial rules and values, confusion abounds within the organization, and safety suffers.

Finally, operational process refers to formal processes (operational tempo, time pressures, production quotas, incentive systems, schedules, etc.), procedures (performance standards, objectives, documentation, instructions about procedures, etc.), and oversight within the organization (organizational self-study, risk management, and the establishment and use of safety programs). Poor upper-level management and decisions concerning each of these organizational factors can also have a negative, albeit indirect, effect on individual performance and system safety.

What are the inferences that can be made from the term pilot error?

Human error occurs when the outcome of something we did or didn't do is not what we expected. For example, we may have:

  • Selected the wrong switch
  • Used the wrong procedure
  • Overlooked something and failed to spot a problem
  • Forgot to pass on a message
  • Sent the wrong information

Human Error (cause identified)

  • air traffic controller error (ATC)
  • improper loading of the aircraft (Loader)
  • fuel contamination (Refueller)
  • improper maintenance (aircraft maintenance engineer (AME))

Why is the term pilot error distinct from human error?

The term pilot error is a statement of liability or a causal factor in the occurrence. Finding that the pilot was at fault can imply a skill deficiency and suggest that further training is required.

Pilot error can often overshadow other forms of human error (e.g., maintenance error).

A finding of pilot error can also preclude the fixing of a faulty system; rather than addressing a skill deficiency, it may be necessary to fix a faulty system. The occurrence could have been the result of inadequate training or "improper" maintenance.

Identifying pilot error as the causal factor usually terminated the investigation. It didn't threaten the administration, and it often shut the door on addressing the real fault or problem.

There was a time in the investigation of aircraft accidents and near-accidents when the cause would often be attributed to "pilot error." Even when correct, this diagnosis wasn't very helpful in preventing other accidents. It wasn't until recently that investigators began to ask why the pilot erred in the first place. It didn't take long to notice that bad decisions were often the cause and to realize that if pilots could be taught to make better decisions, they would be safer pilots. Pilot error is not the cause of an accident. The cause is to be found in whatever it was that interfered with the pilot's decision at a critical moment, the outcome of which was pilot error.

Human errors contribute significantly to aviation accidents. Stressors like overwork, boredom, noise, heat, distractions, deadlines, and fatigue are not the whole story; errors also expose weaknesses in procedures, documentation, communication, and decision-making processes, to name a few.

Even the most competent and well-motivated individuals make errors.

Aren't some people prone to error?

Some people are more prone to make certain kinds of errors or are more vulnerable to particular risk factors, but nobody is perfect; it is only a matter of degree. It is therefore unrealistic to rely on those who are less vulnerable to always avoid error. So if to err is human, isn't error inevitable? You don't have to learn by trial and error or just hope for the best. Many good practices exist that can reduce the risk of human error, such as PDM. Identify what is relevant to your work and apply it.

3.  Countermeasures

What is a good practice?

  • Working practices and procedures designed to account for natural human performance limitations and vulnerabilities; and
  • Working conditions that minimize the effects of those limitations and vulnerabilities.

[Slide #6] Good practices to address many commonplace risk factors have been developed. Here are examples:

  • thermal comfort
  • checklist design
  • interruption management
  • inspection and checking design
  • spoken protocols
  • acoustic environment
  • human-machine interface
  • computer displays
  • team composition
  • instruction structure
  • managing concurrent activities
  • shift working.

REVIEW/LINK

In this module, we have attempted to examine error in the context of the organizations in which pilots work.

In the next module, we will explore how we can use risk management to further mitigate these and other factors.