crew resource management

5 results

pages: 175 words: 54,028

Fly by Wire: The Geese, the Glide, the Miracle on the Hudson by William Langewiesche


Air France Flight 447, Airbus A320, airline deregulation, Bernard Ziegler, Captain Sullenberger Hudson, crew resource management, New Journalism, US Airways Flight 1549, William Langewiesche

Their front man said, “Good morning, Captain Sullenberger, but all of our questions have been answered by Captain Sullenberger, the technical panel, and the other party members. Thank you, sir.” Sullenberger said, “Thank you.” The engine manufacturer had no questions. US Airways had no questions. The pilots’ union representative wanted to get back to crew resource management. There wasn’t much to say. In fact, if you wanted to pick one accident in which elaborations on teamwork don’t need to be made, this would be a good one to choose. It was I’ll fly the airplane, you try to restart the engines. But crew resource management has become a central dogma, the sine qua non of airline flying, and because Sullenberger’s landing had been successful, it seemed necessary to mix it in now. Sullenberger was willing to try. The union man asked him to describe his use of CRM that day, and Sullenberger said, “We had a crew briefing at the beginning of the trip, on Monday, January 12, where we aligned our goals, we talked about a few specifics, set the tone, and opened our channels of communication.

He is not. But in this interview he was completely frank. When asked about teamwork in the cockpit during the glide, he said there was little need for it, and little was involved: he had started into the checklist to restart the engines, and Sullenberger had done the flying. The division was plain and simple, and pretty obvious at the moment. You could arrive afterward and call it an exercise in Crew Resource Management—sorry, I mean CRM—if you insisted on fixing things up with formal language. CRM is indeed a useful term. Until recently it stood for Cockpit Resource Management and pertained only to pilots, until someone realized that the C could stand for Crew, allowing flight attendants into the program. Entire industries are built on this sort of progress. But frankly the glide had been very short, with no space for elaboration.

Nonetheless, he was clearly more aware of the political context than Skiles had been, and of course of the opportunities now suddenly arising. In retrospect, he was concentrating hard. When asked if the US Airways training had helped him to handle the emergency, he said absolutely it had, and he cited the principles of maintaining control, managing the situation, and (oddly, in this context) landing as soon as possible. He also credited the training in Crew Resource Management, the clear definition of duties, and the clear communication of plans. An investigator asked him how he liked working at US Airways. He answered that it is “a good company.” The investigator asked if the company exerted “external pressure” on the crews. The question, though poorly phrased, was an invitation to expound on the corporate culture of the airline. Sullenberger certainly understood this.

pages: 410 words: 114,005

Black Box Thinking: Why Most People Never Learn From Their Mistakes--But Some Do by Matthew Syed


Airbus A320, Alfred Russel Wallace, Arthur Eddington, Atul Gawande, Black Swan, British Empire, call centre, Captain Sullenberger Hudson, Checklist Manifesto, cognitive bias, cognitive dissonance, conceptual framework, corporate governance, creative destruction, credit crunch, crew resource management, deliberate practice, double helix, epigenetics, fear of failure, fundamental attribution error, Henri Poincaré, hindsight bias, Isaac Newton, iterative process, James Dyson, James Hargreaves, James Watt: steam engine, Joseph Schumpeter, Lean Startup, mandatory minimum, meta-analysis, minimum viable product, publication bias, quantitative easing, randomized controlled trial, selection bias, Silicon Valley, six sigma, spinning jenny, Steve Jobs, the scientific method, Thomas Kuhn: the structure of scientific revolutions, too big to fail, Toyota Production System, US Airways Flight 1549, Wall-E, Yom Kippur War

On the thirtieth page, in the dry language familiar in such reports, it offered the following recommendation: “Issue an operations bulletin to all air carrier operations inspectors directing them to urge their assigned operators to insure that their flight crews are indoctrinated in principles of flightdeck resource management, with particular emphasis on the merits of participative management for captains and assertiveness training for other cockpit crewmembers.” Within weeks, NASA had convened a conference to explore the benefit of a new kind of training: Crew Resource Management. The primary focus was on communication. First officers were taught assertiveness procedures. The mnemonic that has been used to improve the assertiveness of junior members of the crew in aviation is called P.A.C.E. (Probe, Alert, Challenge, Emergency).* Captains, who for years had been regarded as big chiefs, were taught to listen, acknowledge instructions, and clarify ambiguity. The time perception problem was tackled through a more structured division of responsibilities.

Time magazine listed him second in its section of Heroes & Icons in its TIME 100 of 2009.2 Academics hailed a new kind of authentic heroism amid a superficial celebrity culture. To the public it was an episode of sublime individualism; one man’s skill and calmness under pressure saving more than a hundred lives. But aviation experts took a different view. They glimpsed a bigger picture. They cited not just Sullenberger’s individual brilliance but also the system in which he operates. Some made reference to Crew Resource Management. The division of responsibilities between Sullenberger and Skiles occurred seamlessly. Seconds after the bird strike, Sullenberger took control of the aircraft while Skiles checked the quick-reference handbook. Channels of communication were open until the very last seconds of the flight. Skiles called out airspeeds and altitudes to provide his captain as much situational awareness as possible as the aircraft dropped.

Still others credited checklists and clever ergonomic design, both of which assisted the crew as the pressure intensified after the bird strike. This was a fascinating discussion, which largely took place away from the watching public. But even this debate obscured the deepest truth of all. Checklists originally emerged from a series of crashes in the 1930s. Ergonomic cockpit design was born out of the disastrous series of accidents involving B-17s. Crew Resource Management emerged from the wreckage of United Airlines 173. This is the paradox of success: it is built upon failure. It is also instructive to examine the different public responses to McBroom and Sullenberger. McBroom, we should remember, was a brilliant pilot. His capacity to keep his nerve as the DC-8 careered down, flying between trees, avoiding an apartment block, finding the minimum impact force for a 90-ton aircraft hitting solid ground, probably saved the lives of a hundred people.

pages: 269 words: 74,955

The Crash Detectives by Christine Negroni

Air France Flight 447, Airbus A320, Captain Sullenberger Hudson, Checklist Manifesto, computer age, crew resource management, crowdsourcing, low cost carrier, Richard Feynman, South China Sea, Tenerife airport disaster, Thomas Bayes, US Airways Flight 1549

His cockpit resource management would teach pilots how to do this, in the same way businesses train their managers. “Pilots generally were well trained on aircraft systems and basic flying skills,” he said. But nothing was done to teach them what they needed to know for decision making, communication, and leadership. The Tenerife accident gave Lauber’s work new energy, and in the years to come, cockpit resource management would be changed to “crew resource management,” in recognition that other flight personnel such as mechanics, flight attendants, dispatchers, and air traffic controllers had a role to play in safe flights. As a bonus, the acronym, CRM, remained the same. “So many incidents in life as well as [in] other industries have broken down because of the ambiguity in communications,” said Christopher D. Wickens, a professor of psychology specializing in aviation human factors.

Anyway, Pearson points out, gliders have speed brakes, and without power the 767 he was flying did not. More to the point, he says it was all the flying he did that prepared him for that day. Gliders and airliners, for sure, but also aerobatic planes and ultralights, floatplanes and ski planes on ice and snow—decades of experiences all came flooding back, he told me. “There’s something to be gained from everything we do.”

Synergy and Teamwork

The philosophy of crew resource management, or CRM, is to merge each pilot’s separate strengths to create a more knowledgeable, more experienced team. With de Crespigny on QF-32 were First Officer Matt Hicks and Second Officer Mark Johnson. In what would prove to be fortuitous, two other captains, Dave Evans and Harry Wubben, were also on the flight deck. De Crespigny was being checked out on the A380, and the pilot checking him was being trained as a check captain (that is, learning how to assess whether a pilot meets government criteria).

See Central Intelligence Agency
Charalambous, Pambos, 12, 14
checklists, 209, 210–11, 249
China Clipper, 72
Chippindale, Ron, 122, 125–26, 129
Chiu, José, 8, 92
Civil Aviation Division of New Zealand (CAD), 125, 127–28
Civil Aviation Safety Authority of Australia (CASA), 204, 206–7
Clipper Victor, 214–17
cockpit resource management (CRM), 218, 243, 252
cockpit voice recorders, 16, 84, 126–27, 130, 159, 163, 257
Coiley, David, 82–83
Colgan Air Flight 3407, 222–23
Collins, Jim, 118–24, 127–28
Colson, Charles, 85
Comet! The World’s First Jetliner (Simons), 157–58
communications technology, vii, 29, 52. See also radio communications and navigation; satellite technology
composite materials, 190–91
“cone of ambiguity,” 52, 57
Conner, Ray, 182
conspiracy and coverup theories, x, 58, 62, 65, 124
Coward, John, 250–51
Cox, John, 126, 163, 256
Crandall, Robert, 170
crew resource management (CRM), 218, 243, 252
Crosby, John, 100, 102
CTC Wings of New Zealand, 199
Cummings, Missy, 228
Cupit, Zoe, 203–4
Currall, Bernie, 203
Dahn, Jeff, 175, 185, 193
Davis, Arthur, 109
Davis, Morrie, 124
DC-3, 223
DC-6, 8, 87–89, 92–95
DC-8, 97, 102, 103–4, 138
DC-9, 223
DC-10, x, 117–29, 138, 190, 244–45, 255
dead reckoning, 4
Deadly Departure (Negroni), 63–65, 164
de Crespigny, Richard, 211, 238–40, 243–44, 246–49, 253–54, 257–58
deHaven-Smith, Lance, 83–84
de Havilland Comet, 137–40, 141–51, 154, 156–58, 165–66, 188, 194, 208
Del E.

pages: 230 words: 71,320

Outliers: The Story of Success by Malcolm Gladwell


affirmative action, Bill Gates: Altair 8800, computer age, corporate raider, crew resource management, medical residency, old-boy network, Pearl River Delta, popular electronics, Silicon Valley, Steve Ballmer, Steve Jobs, union organizing, upwardly mobile, why are manhole covers round?

If the first officer had been the captain, would he have hinted three times? No, he would have commanded, and the plane wouldn't have crashed. Planes are safer when the least experienced pilot is flying, because it means the second pilot isn't going to be afraid to speak up. Combating mitigation has become one of the great crusades in commercial aviation in the past fifteen years. Every major airline now has what is called “Crew Resource Management” training, which is designed to teach junior crew members how to communicate clearly and assertively. For example, many airlines teach a standardized procedure for copilots to challenge the pilot if he or she thinks something has gone terribly awry. (“Captain, I'm concerned about...” Then, “Captain, I'm uncomfortable with...” And if the captain still doesn't respond, “Captain, I believe the situation is unsafe.”)

pages: 415 words: 123,373

Inviting Disaster by James R. Chiles


Airbus A320, airline deregulation, crew resource management, cuban missile crisis, Exxon Valdez, Maui Hawaii, Milgram experiment, North Sea oil, Piper Alpha, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance

He’d erroneously concluded that his control over the trim stabilizer on the tail wasn’t working. It had indeed been working, but the response to emergency controls was so slow he hadn’t realized it. And after landing, the copilot had realized before McCormick had that a change in the thrust reverser setting would steer the airplane away from the fire station it had been heading for. A good system, and operators with good “crew resource management” skills, can tolerate mistakes and malfunctions amazingly well. Some call it luck, but it’s really a matter of resilience and redundancy. We know from cockpit records that surprisingly small problems can make for fatal distractions if this resiliency factor is absent. Early proof of its importance came on December 29, 1972, during the flight of Eastern Airlines Flight 401. The crew of this L-1011 jumbo jet had clear weather on the nighttime approach to Miami, and everything looked good until a green lamp on the landing gear status panel failed to light up after the command to lower and lock the gear.