systems thinking and training

Boeing 737 MAX

I read an article in The New Republic entitled “Crash Course” by Maureen Tkacik, a former Wall Street Journal reporter, which describes how “Boeing’s managerial revolution created the 737 MAX disaster”, resulting in plane crashes in Indonesia and Ethiopia.

In the now infamous debacle of the Boeing 737 MAX, the company produced a plane outfitted with a half-assed bit of software programmed to override all pilot input and nosedive when a little vane on the side of the fuselage told it the nose was pitching up. The vane was also not terribly reliable, possibly due to assembly line lapses reported by a whistle-blower, and when the plane processed the bad data it received, it promptly dove into the sea.

In Tkacik’s article, all the blame falls on Boeing.

The upshot was that Boeing had not only outfitted the MAX with a deadly piece of software; it had also taken the additional step of instructing pilots to respond to an erroneous activation of the software by literally attempting the impossible. MCAS alone had taken twelve minutes to down Lion Air 610; in the Ethiopian crash, the MCAS software, overridden by pilots hitting the cutout switches as per Boeing’s instructions, had cut that time line in half. Lemme had seen a lot of stupidity from his old employer over the years, but he found this whole mess “frankly stunning.”

When I shared this article on Twitter, Jim Hays referred me to another article, in The New York Times Magazine, by William Langewiesche, an experienced pilot and aviation journalist who has written technical reports on the flight characteristics of various airplanes. It is entitled “What Really Brought Down the Boeing 737 Max?”

Note: I am only comparing these two articles, not making my own uneducated investigation into this aircraft.

Langewiesche discusses a number of factors at play, and in my reading of his article, Boeing is one of the least culpable, though still responsible, actors. The pilot training system outside North America and Europe, it seems, played a significant role.

Dave Carbaugh, the former Boeing test pilot, spent his first 10 years with the company traveling the globe to teach customers how to fly its airplanes. He mentioned the challenge of training pilots in Asia. “Those were the rote pilots,” he said, “the guys standing up in the back of a sim. They saw a runaway trim. They saw where and how it was handled in the curriculum — always on Sim Ride No. 3. And so on their Sim Ride No. 3, they handled it correctly, because they knew exactly when it was coming and what was going to happen. But did they get exposed anywhere else? Or did they discuss the issues involved? No. It was just a rote exercise. This is Step No. 25 of learning to fly a 737. Period.” I asked about China specifically. He said: “The Chinese? They were probably the worst.” He spent every other month in China for years. He said: “They saw flying from Beijing to Tianjin as 1,352 steps to do. Yet if they flew from Beijing to Guangzhou, it was 1,550 steps. And they didn’t connect the two. It would get so rote that they just wouldn’t deviate. I remember flying with a captain who would never divert no matter how many problems I gave him. I asked him, ‘How come?’ He said, ‘Because the checklist doesn’t say to divert.’

Pilots lacking ‘airmanship’ — “a visceral sense of navigation, an operational understanding of weather and weather information, the ability to form mental maps of traffic flows, fluency in the nuance of radio communications and, especially, a deep appreciation for the interplay between energy, inertia and wings” — are good for 98% of situations but useless in the critical 2%.

Boeing became the world’s pre-eminent commercial airplane manufacturer in part because it developed a coherent design philosophy that relied on pilots’ airmanship as the last line of defense. It made sense in an era when airplanes were vulnerable to weather and prone to failures and pilots intervened regularly to keep airplanes from crashing.

The automation of aircraft systems results in safer flights — most of the time.

The paradox is that the failures of the 737 Max were really the product of an incredible success: a decades-long transformation of the whole business of flying, in which airplanes became so automated and accidents so rare that a cheap air-travel boom was able to take root around the world. Along the way, though, this system never managed to fully account for the unexpected: for the moment when technology fails and humans — a growing population of more than 300,000 airline pilots of variable and largely unpredictable skills — are required to intervene. In the drama of the 737 Max, it was the decisions made by four of those pilots, more than the failure of a single obscure component, that led to 346 deaths and the worldwide grounding of the entire fleet.

The Training Challenge

My interest in these stories is how training can be either part of the problem or part of the solution. Obvious training oversight issues, such as allowing pilots to stand in the back of a simulator and have this time counted as simulator flying time, should be addressed. But there is still little research-based consensus on exactly how flight simulation should be employed. I know; I started researching this in the mid-1990s. This is definitely an area that requires more research by those who purport to be experts in human learning. Box-checking continues to be all too prevalent in training systems.

As more of our work systems become automated, there is less need for constant human intervention. Most commercial aircraft fly most of the time on autopilot. What does this do to pilot attention, and how much does it degrade flying skills? Perhaps pilots need to spend even more time in simulators practicing for those critical 2% of situations. This, of course, will cost the airlines more.

Training advisors today need a comprehensive view of the performance systems they support. Simulator training is only part of the issue. Classroom training that promotes rote learning produces rote pilots. Changes in aircraft design require an understanding of all the resulting effects, and perhaps changes in the regulations governing simulation time, checklists, or procedures. Automation, in all fields, will force learning and development out of the comfort zone of course development and into the most complex aspects of human learning and performance, or face irrelevance.

2 Responses to “systems thinking and training”

  1. Christopher Mackay

    Prediction: training advisors who don’t see themselves as involved in life-&-death scenarios will happily ignore all of this. (Pretty sure Upton Sinclair originally said this.)

  2. Harold Jarche

    Too true, Chris!

    “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!” —Upton Sinclair
