In recent weeks the catastrophic crash of a Spanish passenger train, the lethal derailment on a French commuter line, the crash landing of a Boeing 777 in San Francisco and the deadly inferno caused by a runaway train in Canada have increased concerns about the safety of modern technology.
The causes of disasters are varied and often intertwined. The French derailment appears to have been caused by a metal clip that joined two rails — one of thousands on the rail system — working loose. Whether the fault lay in design or in inadequate maintenance remains to be determined. The Quebec disaster seems to have resulted from human errors that allowed more than 70 petroleum-laden tank cars to uncouple and roll toward the ill-fated town of Lac-Mégantic.
The interaction between automated controls, operator training and human psychology is rapidly changing the relationship between advancing technology and those who operate it. The 777 crash and the disastrous derailment in Spain most clearly exemplify this.
The designers of planes, trains and even automobiles increasingly automate functions once performed by those who operate these conveyances, and from a safety standpoint, there is much to be gained by it. Published reports say the driver of the Spanish train boasted of having a taste for speed, and that, distracted by a cellphone call, he reacted too late to warning signals as the train raced toward disaster. This raises the question of why the controls were not designed to override the driver's inexplicable conduct and automatically slow the train before it approached curves.
However, automation also has drawbacks. Pilots who become too accustomed to taking off and landing on autopilot may lose their expertise for flying manually when rare situations demand it. The 777 crew was required, for example, to put aside much of its high-tech apparatus and make a manual landing because part of the airport's instrument landing system had been shut down for maintenance. The rarity of situations requiring piloting without automated controls underscores the necessity for providing that experience through exhaustive training on realistic cockpit simulators.
Training and experience interact strongly with control automation. The San Francisco landing may have carried heightened risk because the pilot at the controls, although experienced on similar aircraft, was new to the 777. Next to him sat an experienced 777 pilot, but that pilot's limited experience as a trainer may have caused a fleeting hesitation to take over the controls in an attempt to avert the crash.
And no matter how much experience in similar positions, how much simulator training to deal with rare occurrences, or how long a period under supervision, there must always be a first time in command of the controls. And when service is expanding rapidly or new equipment is being introduced, pressure increases to reduce the length of training and the amount of experience required before assuming control.
Experience was not an issue in the Spanish train wreck; the driver had many years of it. Perhaps psychological screening or improved training could have averted the tragedy, but even that cannot entirely rule out lapses in judgment, erratic behavior or incapacitation. When many people's lives are at stake, we have all the more reason to demand safety systems that cannot be overridden by errant operators.
Likewise, just as a commercial airliner must have a pilot and co-pilot, trains traveling at high speed should be required to have a co-driver who can handle telecommunications and other distractions and who has the authorization to take control away from the driver should the driver's incapacitation or reckless behavior demand it.
Although we cannot reasonably expect zero risk in transport systems, frankly addressing the challenges to increased safety is imperative if we are to reduce that risk as much as possible.
E.E. Lewis is a professor emeritus in the department of mechanical engineering at Northwestern University.

Copyright © 2015, Los Angeles Times