I recently attended IASS, the 69th annual International Air Safety Summit, held 14–16 November in Dubai. The aviation safety event draws as many as 325 representatives from more than 50 countries to exchange information and propose new directions for making aviation, already the safest mode of transportation, even safer.
There were some fantastic presentations at the event and plenty of learning outcomes to take away and consider. One key theme I took from the event was ‘automation in flight’. Several presentations tackled this topic, each with a different slant but arriving at a similar viewpoint: concern about accidents and complacency, the need to improve the skill base of pilots, and the worry that pilots are becoming supervisors of systems rather than developing their flying skills.
Two presentations stood out for me: “Back to Basics” by Capt. Henry Donohoe, and “Facilitation and Frustration – Automation in Day-to-Day Operations” by Dr Nicklas Dahlstrom, Human Factors Manager at Emirates.
Both presenters are well known, with vast experience and great insight into the aviation and safety industry. The “Back to Basics” presentation raised the idea of pilots focusing on being pilots rather than depending so heavily on automation. There is a growing concern that pilots are flying less and managing tasks more; for many pilots, automation now means task management rather than flying. Should pilots be flying planes or working through tasks set by a machine? Automation has shifted humans from active to passive control. Heavy use of automation turns aviators into system monitors, which can lead to boredom, and boredom can develop into automation complacency. Capt. Henry Donohoe’s presentation asked whether boredom and automation complacency are connected, and highlighted the key factors:
• Failure to notice
• No cross checking
• Failure to monitor
• Inappropriate automation
The next presentation, “Facilitation and Frustration – Automation in Day-to-Day Operations”, focused on workload, on how flying increasingly means monitoring systems, and on a growing lack of understanding of automation.
On workload: with an automated system, pilots can move from long periods of low concentration and routine task work to suddenly being expected to carry out complex procedures in an intense, pressured environment. Such sharply varying workload is far from ideal and can lead to mistakes.
Monitoring systems is also complicated; it is not merely watching a few dials or gauges. Over a long period, pilots must switch between broad, big-picture monitoring and narrow, detailed monitoring, two very different modes of attention, all while flying a plane full of passengers through a demanding flight schedule.
The final key issue is our understanding of automation. We all know what it is and what it is supposed to do, but do we really understand it? Do we know enough to resolve a problem when something goes wrong, or to have confidence in the data or commands after a previous error? We are becoming too reliant on automation in all walks of life; how many people in 2016 would ever question whether a computer might be wrong? Many no longer understand how automation works, so they could neither correct it nor draw on the skills and experience needed if it malfunctioned.
This dependence on technology, known as ‘cognitive reliance’, means humans are losing the art of solving problems for themselves. We no longer have the inclination, or the skills, to find new solutions to problems. Our tech has taken over, we don’t need to think, and consequently we have become too reliant on technology.
To illustrate his point, Dahlstrom used the example of individuals who learn to do basic maths using pencil and paper or even an abacus. Over time, these individuals develop a mental model and can do mathematical problems in their heads. "If you use a calculator, you're no better today doing math than you were yesterday," he said. The same is true for pilots using automation. "We need to train pilots on how to think and not just what to think," he said. Automation must support the structures of thinking rather than replace them, Dahlstrom said. He concluded with a three-step process for how best to work with automation:
• Understand – How does it work?
• Anticipate – What should it do?
• Evaluate – Doing as commanded?
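Purely as an illustration, the three steps above can be read as an active monitoring loop. The sketch below is my own, not the presenter's; the altitude-hold scenario, function names and 100 ft tolerance are all assumptions made for the example.

```python
# Hypothetical sketch of Dahlstrom's understand/anticipate/evaluate loop,
# framed as a pilot monitoring an autopilot altitude hold.

def anticipate(commanded_altitude_ft: float) -> float:
    """Anticipate: what *should* the automation do? Here we simply
    expect the autopilot to hold the commanded altitude."""
    return commanded_altitude_ft

def evaluate(expected_ft: float, actual_ft: float,
             tolerance_ft: float = 100.0) -> bool:
    """Evaluate: is the automation doing as commanded, within tolerance?"""
    return abs(expected_ft - actual_ft) <= tolerance_ft

def monitor(commanded_ft: float, sensor_readings_ft: list) -> list:
    """Understand the whole loop: form an expectation, then flag the
    indices of readings where the automation deviated from it."""
    expected = anticipate(commanded_ft)
    return [i for i, actual in enumerate(sensor_readings_ft)
            if not evaluate(expected, actual)]

# Example: autopilot commanded to hold 35,000 ft; the third reading
# has drifted well outside tolerance and is flagged.
deviations = monitor(35000.0, [35010.0, 34950.0, 34200.0, 35005.0])
print(deviations)  # [2]
```

The point of the sketch is that evaluation is only possible because an expectation was formed first; a pilot who skips the anticipate step has nothing to compare the aircraft's behaviour against.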
I recently read an article about sci-fi films and how they can show us a possible future. The popular 1982 film Blade Runner centred on the question “What does it mean to be truly human?” Its premise, asking what human intelligence is and whether we will be replaced by androids, was set in 2019, now only a couple of years away. This sci-fi classic highlighted the issues we have with computers and machines and drew out the differences between humans and androids. I couldn’t help feeling its relevance to recent concerns about automation: are we being replaced by machines?
The fatal accident of Air France flight 447, an Airbus A330, on 1 June 2009 is a real-life tragedy that showed how our reliance on automation can create confusion, and gave evidence of how flight crews’ skills for flying without automation can become depleted.
As with any major aviation incident, a number of factors contributed to the accident: crew error, mechanical malfunction and weather conditions, to list but a few. The A330 was seen as one of the most technologically advanced aircraft, with a cutting-edge autopilot system, and one of the suggested factors in this accident was the flight crew’s reliance on automation. When a plane is presumed to be so safe, humans can come to lean on the technology. This was an example of pilots losing faith in automation during an emergency and, with little experience or understanding of the systems, being unable to correct the problem. It also shows how humans will question themselves rather than the automation, and how pilots can go from low activity to high demand in the course of a flight.
This real-life example, like any accident, involved a number of factors. I am not suggesting that automation was the key one, but it does speak to some of the elements mentioned at the start of this blog.
Automation is undoubtedly a positive force in the aviation industry. However, human intelligence and judgement are still the most important factors in all in-flight decision making, and it’s important that we don’t overload pilots with data, tasks and technology. Sure, we are not going to be replaced by androids any time soon, but the rise in en-route accidents (three in 2012, eight in 2013 and 11 in 2015), along with the growing body of academic research describing automation reliance, shows how the problem is growing as our use of technology grows. Just look at how many terms we have for it: “automation complacency”, “mode confusion”, “cognitive reliance” and the “paradox of automation”. Automation shouldn’t become a hindrance to a flight crew; it should remain a positive aid to flight safety. We must ensure we are not becoming complacent about, or dependent on, automation and technology, and keep allowing skilled pilots to fly. Finally, I think this quote from Earl Wiener’s laws of aviation and human error sums up the problems with automation in aviation: “Automation will routinely tidy up ordinary messes, but occasionally create an extraordinary mess.”
Ideagen solutions allow a crew to focus on their main objective: flying. Our products take care of the compliance and data that aviation businesses create in 2016. Businesses in the aviation industry face a growing challenge in maintaining and monitoring a variety of business systems that serve different purposes yet all contain vital operational data. These systems help managers cope with the operational and logistical components of the business, hold mission-critical data, and include Crewing and Rostering, Training and Competence, Flight Planning, FDM, Safety Reporting and Engineering systems.
To find out more about Ideagen’s aviation products and services, visit https://www.ideagen.com/industries/aviation/