By Stephen L. Carter
So it looks deliberate. On Thursday morning, French prosecutors said the co-pilot of Germanwings Flight 9525 deliberately crashed the plane into the Alps on Tuesday. "The co-pilot through voluntary abstention refused to open the door of the cockpit to the commander, and activated the button that commands the loss of altitude," said prosecutor Brice Robin. The co-pilot's intention, Robin said, was "to destroy the aircraft." He was alive at the moment of impact.
The chilling part is that the theory, if true, illustrates the ease with which the very devices created to make flights safer can be turned against their purpose. One of the pilots left the cockpit for perfectly innocent reasons, and then, when the emergency arose, couldn't get back in because the security design worked against him.
Since the Sept. 11 attacks, federal law has required that the cockpit be protected by "a rigid door in a bulkhead between the flight deck and the passenger area to ensure that the door cannot be forced open from the passenger compartment." The door must "remain locked while any such aircraft is in flight except when necessary to permit access and egress by authorized persons." The door may be unlocked with a key, but the key may not be possessed "by any member of the flight crew who is not assigned to the flight deck." Most airlines around the world comply with those U.S. rules.
According to its operating manual, the A320, like most passenger aircraft, has an electronic keypad that can be used to unlock the door. As a standard security measure, however, such keypads can be disabled by the pilots.
That's the point. Whoever is in the cockpit can lock everyone else out. This makes sense if one is trying to prevent a hijacking. It becomes a problem when the pilot turns out to be the bad guy. In the case of Flight 9525, if the prosecutors are right, the co-pilot, determined to crash the plane, would have disabled the keypad, making re-entry impossible in the time remaining.
These events stand as a chilling reminder of how difficult it is to harden our systems entirely against attack. The human factor is always a variable for which we cannot fully account. Eric Schlosser, in his book Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, tells us how planners agonized for decades over how to prevent a crazed individual from stealing or detonating a nuclear weapon. Even if guarded against outsiders, the systems couldn't be completely protected against insiders. His chilling conclusion is that the problem was never really solved: We've just been lucky.
It's likely that pilots have been locked out of cockpits before, but always by accident. Their colleagues doubtless have let them back in, and nobody's given the matter another thought. Probably the incidents were never even logged. Even if they had been, it's not likely much would have changed. As the sociologist Charles Perrow notes in his book Normal Accidents, we rarely take precautions against incidents that seem trivial at the time they occur.
In hindsight, we can now see the cost of the security we've put into place. But there's no obvious fix. Plainly we can't forbid pilots to leave the flight deck; nature may always call. And having gone to all this trouble to harden cockpit doors, it would be silly to begin softening them again. The deployment of some sort of emergency unlocking device would be asking for trouble.
Maybe someone will come up with a clever and effective solution. But no matter how layered and complex our security systems, we'll never be able to remove the human element. And there is always the risk that an insider will thwart the system.
Stephen L. Carter is a law professor at Yale University.