13 August 2019
Siobhan Corrigan is a researcher at Lero, the Irish software research centre backed by Science Foundation Ireland. In this interview she discusses the nature of risk and the complexity of human systems.
What attracted you to organisational psychology?
During my primary degree, it was the subject I enjoyed most and felt best suited to, and this was reinforced when I undertook my third-year placement working with a team of organisational psychologists.
I started at the University of Ulster as an undergraduate in applied psychology, went on to the University of Sheffield to complete my master's in occupational psychology, and completed my PhD in the School of Psychology at Trinity College. I have been in Trinity ever since.
What is a ‘human system’?
The Centre for Innovative Human Systems (CIHS) is one of the leading human factors research groups in Europe, and for over 25 years our research has focused on the human role in critical, complex operational systems.
The products and services we take for granted in the 21st century are the outputs of complex human systems. Transport, healthcare, security, education, finance, the Internet, politics. Vast, complex, interdependent systems of individuals, organisations and technologies interact to innovate, design, develop, finance, regulate, certify, produce, test, localise, market, sell and deliver these to us.
At the core are the ‘humans’ designing, operating, managing and improving the system to produce results.
As consumers and citizens we are rarely conscious of these systems until they let us down. Sometimes this is in small ways: a medical appointment is rescheduled, a flight is delayed. Some failures, though, can be catastrophic, such as an aircraft crash or the recent cervical screening controversy.
Designing for, and managing, human and organisational factors is central to the safety and to the commercial and environmental sustainability of current and future systems.
Any change process requires an acceptance of risk. What sectors are particularly sensitive to risk?
A whole variety of industries and public organisations (health, emergency services, financial services, aviation, road transport, and pharmaceuticals) deliver essential products and services, but the risk involved in their production and delivery needs to be reduced to an absolute minimum. Such systems are becoming more complex and integrated, creating new risks even as public tolerance of risk declines.
The online MSc in Managing Risk and System Change at Trinity College commenced in 2015 and has attracted senior managers from risk-critical industries across the globe. The programme was developed to bridge the knowledge gap between industry and research outputs.
This programme provides a rigorous but practical focus on risk, change and system design in operations with an innovative and integrated approach to the role of people in such systems.
Because this programme is based on ongoing collaborative industrial research, it provides a core human factors and operational assessment methodology for use in predictive risk assessment, effective change management and implementation, systems design and technology evaluation.
We hear a lot about automation and how it will lead to a clearout of jobs. What changes do you think automation will bring?
A range of industries are embracing the move to automation. Without a doubt automation has the potential to assist humans with a diverse set of tasks and to improve overall system performance in many contexts.
However, there is more involved than merely taking the human element out and replacing it with a technical one. There is a whole range of new considerations, the scale of which industries are only beginning to appreciate.
For example, implementing automation will have a significant impact on organisational structures and on workforce behaviours and learning, and leaders will have to facilitate the resulting organisational changes as automation transforms entire business processes as well as organisational culture.
In addition, automated systems are not always 100% reliable, so errors in human-automation interaction can never be eradicated completely. When automation fails, the human in the system must step in to detect, explain and rectify the breakdown, and to understand its causes so that corrective actions can be put in place.
There is therefore a need to focus on future human factors requirements, in order to provide guidance on the automation developments that technical, regulatory, legal and other change drivers will make necessary.
One of Lero’s areas of interest is self-driving cars. What challenges does this new technology bring?
The automotive industry is changing at a ferocious pace. We repeatedly hear that there will be more change in the next 5-10 years than there has been in the last 50. However, I do feel that even by 2050 there will still be mixed fleets, and this will bring many challenges. To name but a few: user and societal acceptance and trust; understanding the risks involved in transferring control; liability and the legal framework (who is liable if something goes wrong in the transfer of control?); and ethics for responsible innovation.
From a driver's perspective, it is difficult to accept or trust things you don't understand. And to accept something you need to experience it, but the necessary level of testing will be difficult and expensive. Drivers will be afraid of what they don't understand, and won't know the capability of self-driving cars.
Much of the focus is on technology readiness but this needs to be supplemented by further research in order to exploit fully the potential benefits of automation in road transport in a manner that is safe, understandable and acceptable to the driver and other stakeholders.
There are exciting times ahead for research in this area.