Human-centred Automation

Corporate Risk Associates (CRA) is one of Europe’s largest integrated Human Factors, Safety, Risk and Reliability consultancies. CRA provides services to safety-critical industries including Nuclear, Process, Power, Rail, Business Risk and Defence.

In the nuclear sector, CRA’s human factors specialists have observed a growing niche for human–machine interface development, driven by the upgrading of nuclear control systems and the accompanying increase in software-based control. CRA is therefore investigating methods for determining the appropriate level of interaction between operators and automation when monitoring and controlling process control tasks.

Three drivers for the rapid increase in automated systems are to make tasks easier, more convenient or safer. Whilst these are laudable aims, many automated systems are developed with insufficient consideration of human requirements: development is often treated as a purely engineering process, judged mainly against financial criteria. It is therefore not surprising that numerous instances have been reported where automated systems have failed to meet users’ expectations, sometimes with severe consequences.

These deficiencies can be avoided by taking a human-centred approach to the development of automation. This should start with a comprehensive task analysis of the human activities being automated, and an understanding of how humans will need to interact with the new system. This provides vital insight into the human requirements, enabling the design of a system that operates effectively in a wide range of situations.

A major human factors concern for automated systems is situational awareness: the user’s overall understanding of the state of the systems being monitored or controlled by the automation. The automation should be designed so that users remain aware of important state data whilst avoiding information overload.

Another important consideration is operator workload. Whilst automation can certainly reduce the risk of human error due to high workload, support that lowers workload too far can greatly reduce vigilance. For this reason, the methods being developed for determining interaction levels between humans and automation also assess how the level of interaction can be adjusted as overall workload changes.
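The principle of adapting the interaction level to workload can be illustrated with a minimal sketch. The workload index, thresholds and level names below are hypothetical illustrations of the general idea, not CRA’s actual method:

```python
def interaction_level(workload: float) -> str:
    """Map an estimated operator workload index (0.0 to 1.0) to a level
    of automation support. Thresholds and level names are illustrative."""
    if workload < 0.3:
        # Low workload: keep the operator actively in the loop
        # so that vigilance is maintained.
        return "manual control with automated monitoring"
    elif workload < 0.7:
        # Moderate workload: automation proposes actions,
        # the operator reviews and confirms them.
        return "management by consent"
    else:
        # High workload: automation acts autonomously,
        # the operator supervises and can intervene.
        return "management by exception"
```

In such a scheme the mapping would be derived from task analysis and workload assessment rather than fixed numeric thresholds; the point is simply that the allocation of function between human and automation need not be static.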

It is important to enable users to diagnose automation failures, particularly when data appear sound but are actually unreliable. A human reliability assessment of the proposed system helps determine the potential for such situations to occur. By identifying potential error modes in this way, interfaces can be designed to support operators in detecting these failures.

To discuss human-centred automation needs, please contact:

Anthony Bell, Phone: 01925 393282, Email: [email protected]