HP considerations in automated systems
Introducing automated systems does not mean that humans are no longer relevant. If anything, to achieve the desired system performance outcomes with automated systems, it is critical to consider how people will be expected to interact with that system, and how it will change their roles and responsibilities.
As with any change, increasingly automated systems can introduce new risks. The safety and efficiency benefits of new and increasingly automated technologies can be realized only when they are designed to support the performance of the user as well as that of other personnel who are directly or indirectly affected by their implementation (including personnel who will need to maintain the technology).
Despite the intended benefits of automation, an extensive body of literature developed over the past several decades has identified human performance issues associated with automated systems, including:
- Automated systems can reduce workload during traditionally low-workload phases but may add workload or interfere during time-critical, dynamic circumstances.
- When automated systems consistently perform well, the human may develop over-reliance on the automated system, which can contribute to skill degradation.
- Automated systems may shift the human's role from active controller to supervisor or monitor, a role people perform poorly over extended periods.
A human-centred approach is key to designing automation and supporting its implementation within normal operations. With respect to automated systems, it is particularly important to address the following HP questions:
What is the rationale for automating?
It is important to ask why a particular function should be automated, because automation may not always offer the best system solution. Is it for workload reduction? Performance enhancement? System scalability? How does it fit with what people do now and will have to do if the function is automated? How does it affect people in other parts of the system, or passengers and customers? The argument for increasing automation to reduce human error is often misguided because automating a function or system may simply shift the possibility of errors from the operator to different parts of the system, with different consequences.
Are the human performance expectations and user responsibilities clearly identified?
- Automation results in new user interactions that require training and practice, often in addition to what is required for "manual" operations.
- An automated system that encounters conditions outside the operating environment envisioned by the designer may suddenly cease to perform its function. In such cases, recovery may depend on a rapid response by the human.
Does the automation display appropriate information to allow users to meet their performance obligations and responsibilities?
- Information about system function is critical for users to understand the system, to know what it is doing, and to calibrate their trust appropriately.
- Too much information about system functioning can result in information overload and clutter.
- Lack of feedback on system functioning makes it difficult for the human to be aware of and to understand how the automated system is working and how to predict what it will do next.
- As long as humans are responsible for a task, they must have the appropriate authority to exercise that responsibility. In such cases, automated systems not only need to provide sufficient information through displays, but also provide means for human intervention through controls (e.g., manual override).
How are automation surprises mitigated?
Automation may surprise the human user when:
- the user is expecting one behaviour, but the automated system exhibits another behaviour;
- the automated system unexpectedly transfers control to the human; and
- complex system interdependencies result in unexpected changes in state or mode.
What design features have been included and what procedures are necessary to mitigate such automation surprises?
What knowledge and skills does the user need to manage the automation in normal and abnormal situations?
- Automated systems change existing tasks, create new tasks and introduce different error types; the resulting new user interactions require training and practice, often in addition to what is required for "manual" operations.
- When an automated system ceases to perform its function because it has encountered conditions outside the operating environment envisioned by the designer, recovery may depend on a rapid human response. Does the user have the necessary skills and knowledge to respond as expected?
- Lack of practice on a task that has been automated may degrade the motor and cognitive skills and knowledge needed when the automation fails. How is possible skill degradation addressed?