Identifying risks and emergent risks across sociotechnical systems: The NETworked Hazard Analysis and Risk Management System (NETHARMS)

I found this pretty interesting from Dallat, Salmon and Goode, exploring a new systems-based risk assessment method called NETworked Hazard Analysis and Risk Management System (NET-HARMS).

One key focus of NET-HARMS is revealing the harder-to-find emergent risks that arise via interactions across the system.

WAY too much to cover in this paper – I’ve skipped large parts, especially around the construction and testing of their method, but there are some interesting findings in the discussion. Hence, if you’re interested in this topic then check out the full paper.

For background:

·         “Accidents are now widely acknowledged to be a systems phenomenon, with the field of safety science now largely accepting that accidents are a result of multiple interacting contributory factors situated across entire work systems”

·         They say that ‘systems thinking’ is underpinned by the assertion that “accidents are produced by interactions between multiple human and technical elements, as opposed to the actions of one human, or one specific failure in isolation”

·         While a range of methods purport to evaluate a diverse subset of systems risks, many are based on retrospective analysis – things that have already occurred

·         Risk assessments in the safety science field are argued to be the most closely aligned techniques for forecasting accident factors

·         Risk assessment methods should be underpinned by systems thinking and examine the “entire work system”, which in practice means “as well as identifying risks at the sharp-end of system operation, risk assessment methods should also consider risks associated with supervisory, managerial, regulatory and even government actions and interactions.”

·         Based on prior work from this research team, >300 risk assessments were evaluated – they found that “the risk assessment methods they reviewed typically do not support the identification of risks outside of the front-line worker (e.g. pilot, control room operator, driver), nor do they support the identification of emergent risks (i.e. risks that emerge from the interaction of multiple risks across the system)”

·         Hence, most of the reviewed risk assessment methods “adopted a linear, chain-of-events philosophy and were thus not consistent with contemporary systems thinking models of accident causation”

·         There were, however, some exceptions – e.g. STPA and FRAM; although these haven’t yet become widespread in practice, and are criticised for being complex and time-consuming

 

Development of NET-HARMS

I’ve skipped most of these sections, so check out the full paper. However, they drew on different techniques like Hierarchical Task Analysis (HTA) and SHERPA.

HTA was selected because “it is arguably the most popular task analysis method and has extensive reliability and validity evidence associated with it”. This was bolstered with a human error / human reliability method, being SHERPA (The Systematic Human Error Reduction and Prediction Approach).

They note that when used in their original forms, neither HTA nor SHERPA typically facilitate the identification of risks across overall work systems—meaning “human and non-human actors throughout the organisation who influence the design, development and delivery of the desired outputs”.

Therefore, they modified the analysis methods to better suit an entire-work-system approach.
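To make the whole-system idea concrete, here is a minimal sketch of an HTA represented as a task tree that spans planning and management tasks as well as the front-line activity. All task names and the numbering are my own illustration, not taken from the paper:

```python
# Sketch: a Hierarchical Task Analysis (HTA) as a tree of tasks, extended
# beyond the 'sharp end' to design, planning and review tasks, as the
# whole-work-system approach requires. Illustrative names only.
from dataclasses import dataclass, field

@dataclass
class Task:
    code: str                        # HTA-style numbering, e.g. "1.1"
    name: str
    subtasks: list["Task"] = field(default_factory=list)

def flatten(task: Task) -> list[Task]:
    """Depth-first list of a task and all its subtasks."""
    out = [task]
    for sub in task.subtasks:
        out.extend(flatten(sub))
    return out

hta = Task("0", "Deliver outdoor education program", [
    Task("1", "Design program", [Task("1.1", "Select activities")]),
    Task("2", "Plan logistics"),
    Task("3", "Brief staff"),
    Task("4", "Conduct activities"),   # the front-line 'sharp end'
    Task("5", "Review program"),
])

print([t.code for t in flatten(hta)])
# → ['0', '1', '1.1', '2', '3', '4', '5']
```

The point of the structure is simply that risk identification is run over every node in the tree, not just the delivery tasks.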

They then demonstrate the application of the NET-HARMS approach for risk assessment of a five-day led outdoor education program. Once again I’ve skipped most of this, so check out the paper.

Some points though:

·         A HTA was constructed for the entire system involved in the design, planning, conduct and delivery of the five-day program

·         They then undertook the (optional) steps of assessing the ordinal probability of each risk occurring (low, medium, high) and its predicted criticality (low, medium, high)

·         Risk control measures were then considered and documented

 

Emergent risks

·         Task networks were then used to represent the HTA outputs in the form of a network, so that the key tasks and relationships between them in a work system are visible

·         Emergent risks “represent additional risks that arise as a result of the interaction between the risks identified during step 2”

·         At this step, the analyst “asks the question, ‘What is the impact of this risk happening at task X on the related task Y?’ The underlying principle in relation to linked tasks is that they will interact and, in the event that the initial task risks identified are not managed appropriately […]”

·         Risk controls are then applied to the emergent risks

·         They note that at this stage of analysis, “both the initial task risk and the linked task have risk controls developed which are designed to manage those risks. However, the emergent risks created by these two interacting tasks are additional (and different) to the initial risks and therefore, require emergent risk controls”

·         They then describe an example of an emergent risk, and how the controls may be different to the initial task risk
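A rough sketch of this network step, as I read it: tasks form a directed network, and each link (X → Y) generates the analyst prompt "what is the impact of the risk at task X on related task Y?". The generator below naively pairs every task risk with every downstream task; in the actual method each pairing is a human judgement, and confirmed interactions become emergent risks with their own controls. Task names, links and risks are invented examples:

```python
# Sketch of the emergent-risk step: walk every (risk at X) -> (linked task Y)
# pair and emit the question the analyst must answer. Illustrative data only.
task_links = {                       # X -> tasks that depend on X
    "Design program": ["Plan logistics", "Brief staff"],
    "Plan logistics": ["Conduct activities"],
    "Brief staff": ["Conduct activities"],
}
task_risks = {
    "Design program": ["Activity unsuited to group ability"],
    "Plan logistics": ["Transport timing underestimated"],
}

prompts = []
for x, risks in task_risks.items():
    for risk in risks:
        for y in task_links.get(x, []):
            prompts.append(f"Impact of '{risk}' at '{x}' on '{y}'?")

for p in prompts:
    print(p)
# Confirmed interactions become emergent risks needing their own
# (different) controls, beyond the controls on the two initial tasks.
```

Note how the number of prompts grows with the connectivity of the network, which foreshadows the finding below that emergent risks heavily outnumber task risks.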

 

Notable findings:

Some things I found particularly insightful:

·         They compared the NET-HARMS outputs against the specific contributory factors extracted from 351 incident reports

·         They found that “Whilst a high hit rate was achieved [identifying the factors from the incident reports], a significant number of false alarms were also identified, with 1476 risks predicted via NET-HARMS that were not found in the injury incidents analysed by Van Mulken et al”

·         Interestingly, “NET-HARMS identified 1131 emergent risks associated with the design, planning and review tasks”, whereas “in the program delivery tasks (Section 4 of the HTA), 232 emergent risks were predicted”

·         Within the task risks stage, “1.5 times (n=141) as many task risks were predicted in the design, planning and review tasks (Sections 1, 2, 3 and 5) in the HTA”

·         “These numbers make a compelling statement. The largest amount of emergent risks reside within the tasks not associated with delivery of the activity”

·         Or as they argue, “Put another way, the tasks related to the design, planning and review of the program have the most potential for introducing risks into the system. If risk is not managed in these pre-activity stages, a significant number of emergent risks are created”


In concluding, they state:

·         “the case study demonstrated the existence of 5.8 times more emergent risks (NET-HARMS Stage 2) in the system than task risks … This is a sobering finding given that existing risk assessment methods do not attempt to identify emergent risks”


·         “The NET-HARMS method accurately predicted all but 4 of the 119 contributory factors identified in the injury dataset”

·         “Worthy of mention also were the very high number of false alarms (n=1476) in the analysis”

·         But rather than false alarms being a waste of time, “it is prudent to consider that a false alarm could indeed represent risks that have not yet played a role in injury-causing incident”

·         Moreover, the incident database they relied on “does not yet contain many higher-level factors due in part to poor reporting and limitations in practitioners’ understanding around systems thinking”

·         Finally, the findings “show that NET-HARMS is capable of forecasting systemic and emergent risks and that it could identify almost all risks that featured in the accidents in the comparison dataset”
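A quick back-of-envelope check of the reported numbers, assuming "hit rate" means the share of the 119 contributory factors that NET-HARMS predicted (the precision figure is my own derived framing, not the authors'):

```python
# With 4 misses out of 119 factors and 1476 false alarms, recall is very
# high but precision is very low - the trade-off the authors discuss.
contributory_factors = 119
misses = 4
false_alarms = 1476

hits = contributory_factors - misses        # 115
hit_rate = hits / contributory_factors      # ~0.966
precision = hits / (hits + false_alarms)    # ~0.072

print(f"hit rate  = {hit_rate:.3f}")
print(f"precision = {precision:.3f}")
```

On these numbers the method rarely misses a real factor, but the authors' point stands that many "false alarms" may simply be risks that have not yet featured in a reported incident.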

In my view, while practitioners may not end up using NET-HARMS in its entirety, it does highlight the pervasive nature of emergent behaviour and how it could be at least partly evaluated.

Authors: Dallat, C., Salmon, P. M., & Goode, N. (2018). Identifying risks and emergent risks across sociotechnical systems: the NETworked hazard analysis and risk management system (NET-HARMS). Theoretical Issues in Ergonomics Science, 19(4), 456–482.

Tom Shephard

Managing Principal Consultant | Author | My book reduces the risk of mass casualty events and catastrophic fiscal loss by tackling the least understood root causes.


To what degree are 'emergent risks' the product of ignorance/bad practice? PHA examples:

·         My LOPA experience identified errors in 5-15% of EVERY PHA scenario. Some benign. Others, missed hazards (hidden).

·         Epistemic Accident Theory. PHA gaps occur because a room full of 30-year veterans does not have the ‘all knowing’ knowledge and experience needed to identify every possible hazard.

·         Societal Influencers (SI). Bea (investigated Katrina, others) posits that ignoring SI can increase true risk by several orders of magnitude, e.g. SI effects from regulations, business environment, etc. Apply it to Macondo or the Boeing 737 Max. Does likelihood move from implausible to possible?

·         PHA gaps occur when assessing scenarios keenly affected by humans. From my 14-year effort, hidden design errors that degrade human performance are common, and human behaviour assumptions are frequently wrong.

·         New hazards are introduced when different departments (that don't talk) progress mutually incompatible solutions on the same item.

·         Critical transcription errors in PHA/LOPA reports appear more common than perceived.

·         A missed hazard due to a poor node boundary selection.

Paul Chivers

Risk Advisor | Director


Been a fan of this approach, hopefully an eye-opener to practitioners. Thanks Ben!

I developed a systems thinking risk assessment method for hazardous manual tasks which includes the identification of emergent risks. It is available at HAMSTA.AU

Leonidas Brasileiro

Senior Manager Health, Safety & Environment | EHS | HSE | ESG | Process Safety | Resilience Engineering | Human Factors | Risk Management


Awesome. It really seems interesting. I'm tempted to dive into it. Risk assessment is something I find both important and, most of the time, frustrating. Given the effort it takes to be done well versus its accuracy, it always seems unbalanced to me.

Ben Hutchinson

HSE Leader / PhD Candidate


Tom McDaniel you might find this interesting
