Donald Trump and the ironies of automation

Resorting to commonplaces to explain accidents

Serious accidents often attract commentary from prophets of doom, who fall back on commonplaces and offer simple explanations for inevitably complex problems.

Blaming the human

“The problem is man”, with his faults and his intrinsic tendency to make mistakes. Many said so after the wreck of the Costa Concordia, the cruise ship that capsized just off the port of Isola del Giglio on January 13, 2012. Captain Schettino apparently behaved in a cowardly manner while efforts were underway to save the wounded, which made him the perfect figure to justify such a simple explanation and to serve as a scapegoat. Consequently, hardly anyone listened to those who noted that the practice of the inchino, taking a “bow” before Isola del Giglio by deviating from the planned route to salute the island, was very common in the cruise industry. Many ships had sailed too close to the coast before, but the practice had never been reported to the competent authorities.

Many also overlooked the serious lack of training of the Costa Concordia crew, which emerged during the most frantic phases of the emergency.

Blaming technology, part I

“The problem is obsolete technology”, someone declared after the Andria-Corato train collision of July 12, 2016, which claimed 23 lives. The accident occurred on a single-track line still operated with an old traffic management system known as the telephone block. This system, no longer in use elsewhere on the Italian railway network, survived on a section that had not yet been upgraded to double track.

With an apparently convincing culprit found in obsolete technology, the voices of those who argued that technology does not always translate into greater safety struggled to emerge. Yet over 40% of the Italian railway network still runs on a single track. Moreover, no significant accident had previously occurred on the Andria-Corato section, which the Ferrotramviaria company had managed since the 1950s.

Blaming technology, part II

“The problem is technology, which is too advanced”, some said after the Ethiopian Airlines Flight 302 accident of March 10, 2019. Planes, the argument goes, have become too complex to pilot; traditional aircraft are better, with their lower levels of automation and a person still in full control of emergency management. No less a figure than US President Donald Trump supported this view, which became even more popular after two accidents involving the same aircraft model less than five months apart. The technologically advanced Boeing 737 MAX 8 ended up in everyone’s sights.

The real protagonist of the Ethiopian Airlines accident may well have been the software that automatically manages certain phases of the flight, in particular a function called MCAS (Maneuvering Characteristics Augmentation System). MCAS intervenes when the aircraft risks stalling. Investigations into the data contained in the “black boxes” have only just begun. We still do not know how the pilots reacted to the circumstances prior to the accident, nor whether the suspect software actually transmitted incorrect data. But the Lion Air Flight 610 accident of October 29, 2018, involving the same type of aircraft, inevitably invites comparisons. In that case, too, there is growing suspicion that the accident was caused by bad communication between human and machine.
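
To make the idea concrete, here is a deliberately simplified sketch of a stall-protection loop. It is not Boeing’s MCAS logic: every function name, sensor, threshold, and value below is invented for illustration. The point it makes is general: automation that acts on a single faulty sensor reading will keep acting on it, whatever the pilot intends.

```python
# Purely illustrative sketch of a stall-protection loop.
# All names, thresholds, and values are invented for the example and
# do NOT reflect Boeing's actual MCAS implementation.

AOA_STALL_THRESHOLD_DEG = 15.0  # hypothetical angle-of-attack limit
NOSE_DOWN_TRIM_STEP_DEG = 0.5   # hypothetical trim increment

def stall_protection_step(aoa_sensor_deg: float, current_trim_deg: float) -> float:
    """Return a new trim setting based on a single angle-of-attack reading.

    If the sensor reports an angle of attack above the threshold, the
    function commands nose-down trim, regardless of pilot input.
    """
    if aoa_sensor_deg > AOA_STALL_THRESHOLD_DEG:
        # The automation trusts the sensor and pushes the nose down.
        return current_trim_deg - NOSE_DOWN_TRIM_STEP_DEG
    return current_trim_deg

# A sensor stuck at an impossibly high value shows the failure mode:
# the logic keeps trimming nose-down even if the aircraft is flying level.
trim = 0.0
for _ in range(5):
    trim = stall_protection_step(aoa_sensor_deg=74.5, current_trim_deg=trim)
print(f"Commanded trim after repeated activations: {trim:.1f} deg")
# -> Commanded trim after repeated activations: -2.5 deg
```

A loop of this shape is only as safe as its input: cross-checking redundant sensors and limiting the automation’s authority over the pilot are the usual mitigations.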

These accidents once again raised questions about automation, a matter well known to Human Factors experts. Human Factors is the discipline that studies how people in a complex system interact with the other resources around them: other operators, procedures, and technology.

Ironies of automation

In the 1980s, cognitive psychologist Lisanne Bainbridge introduced the concept of the Ironies of Automation: four paradoxes that designers of automated systems inevitably encounter when transferring functions from human operators to automation in complex operating environments.

The first paradox lies in the choice to return manual control to operators only when automation fails. In that scenario, the operator may no longer have the skills the task requires: someone who has not carried out a task for a long time will struggle to perform it with the speed, effectiveness and precision they had when they executed it regularly.

The second paradox is closely linked to the first. If a system requires the operator to act only when automation runs into problems, then the most reliable automated systems give operators the fewest opportunities to exercise their skills, leaving them the least prepared to act during emergencies.
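
Both paradoxes arise from the same design pattern: the human is wired in only as a fallback, invoked when the automation gives up. A deliberately schematic sketch, with every name invented for illustration, makes the structure visible.

```python
# Schematic sketch of the "human as fallback" pattern criticised by the
# first two ironies. All names and values are invented for illustration.

class AutomationFailure(Exception):
    """Raised when the automated controller cannot handle the situation."""

def automated_control(sensor_reading: float) -> str:
    # The automation handles the routine case...
    if sensor_reading < 100.0:
        return f"auto: holding setpoint at {sensor_reading:.1f}"
    # ...and gives up exactly when the situation becomes hard.
    raise AutomationFailure("out-of-envelope reading")

def manual_control(sensor_reading: float) -> str:
    # The operator is invoked only here: in the rarest, hardest cases,
    # with skills that have had the least recent practice.
    return f"manual: operator takes over at {sensor_reading:.1f}"

def control_step(sensor_reading: float) -> str:
    try:
        return automated_control(sensor_reading)
    except AutomationFailure:
        return manual_control(sensor_reading)

# The more reliable automated_control is, the less often manual_control
# runs -- which is precisely why operator skills erode over time.
for reading in (42.0, 87.5, 120.3):
    print(control_step(reading))
```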

The third paradox concerns the choice of transferring executive tasks to automation, leaving the operator to supervise what the automation is doing. Certainly, this frees operators from most of their workload. However, the human cognitive system is ill-suited to sustained supervisory tasks, especially when the supervised system changes little; under those conditions, operators easily commit errors.

Finally, automated devices can struggle to manage unexpected situations, for example when conditions change suddenly or in unforeseen ways. Humans, by contrast, have a unique and precious skill: they can observe the environment, check what is happening, and connect pieces of information to understand how to tackle the unexpected. However, this complex mental processing requires time and undivided attention, and is therefore rarely available in complex systems such as aviation, where emergencies are almost always time-critical. Hence, the fourth paradox.

The right level of automation

No Human Factors expert would ever dream of claiming that technology, as such, has negative effects on safety. On the contrary, aviation is evidently reducing its number of accidents, thanks in part to constant technological improvements.

However, using automation to increase safety is more complex than we tend to imagine, and those who design automated systems must take the ironies of automation into account. Increasing the level of automation can in fact be counterproductive: when automation takes over most of the functions the operator once performed, it may undermine the operator’s reliability.

A change of perspective in system design is therefore necessary. Designers must shift their focus from minimising the human presence, considered unreliable, to making that residual presence more effective. Since complex systems consist of both people and machines, there is no other way to increase their overall safety: what we gain in reliability by increasing the level of automation otherwise risks being lost to degraded human performance.

What are the most effective (and resource-efficient) ways of learning from incidents?
How can we use safety data to enhance operations and design?
Contact us to learn about the latest state-of-the-art work!

Get in touch with us