Safety, Security & Risk Strategies: Disconnects and divergent concepts across actors at different organisational levels
Not only do various layers of the same organisation conceive and implement safety, security, risk or resilience differently, but they also tend to hold varying beliefs and practices about how to control visible strategic failures such as 'accidents', mistakes, errors and fiascos.
That is, while regulators may insist upon greater risk constraints, frontline actors remain dependent upon experience, education, knowledge and psychological factors to identify, mitigate and control a wide variety of actions, factors and probabilities.
This variance only compounds as further layers are added, such as teams, management and executive leadership.
As a result, strategy, good or bad, may be perceived as either the saviour or the catalyst for failures and accidents. Accidents may also be manipulated or positioned as an escape clause or 'act of God' (force majeure) that no human could reasonably have foreseen, thereby absolving management, leadership and organisations.
"Instead of “a faulty employee badly supervised”, it was, therefore, more appropriate to speak of “a new strategy with unanticipated and un-managed weaknesses” (Le Coze, 2019)
In other words, accidents within the context of safety, security, risk and resilience may be truly accidental or the product of a process, procedure and strategy. Or both.
While the above concept is derived from 'high-risk' systems and structures, it has application and utility for routine operations and systems.
Notwithstanding, the notion of 'high-reliability organisations' (HRO) requires widespread revision in the wake of a pandemic that disproved or negated many industry assertions of 'high reliability' which, in practice, proved unfounded.
“people and organizations do not always know how far they are from the true limits or the extent to which limits are elastic, relative, or arbitrary. Therefore, progress in general, and exceeding limits in particular, entails ambiguity, risk and uncertainty” (Farjoun and Starbuck, 2007)
As a result, a strategy should not be seen as a panacea for the mitigation of risk, safety, security or resilience errors, oversights or failures. Nor, should the strategy fail, should blame be attributed solely to the actions of one or a few people. While both may prove true upon analysis and investigation, much can be said about organisations, governments and communities that make such rapid-fire determinations without analysis, and about the speed with which people are blamed or punished.
" safety is produced in the context of specific strategies"
(Le Coze, 2019)
Especially where strategy remains the construct of the few or that of one dominant perspective.
“in a world of checks and balances, when there is no real countervailing force to a CEO, individual preferences can dominate” (Finkelstein, 2003)
In sum, beliefs, practices and application of safety, security, risk and resilience vary up and down an organisation's hierarchy.
This is perhaps most obvious or visible in the event of an accident and how blame or accountability is assigned.
Especially when viewed as a 'strategic drift' between what is documented and espoused and what is actually measured or prioritised when examining how any unplanned, negative or dangerous event occurred.
In short, power and influence may blame individuals and never review the system or construct that contributed to events and outcomes, including accidents.
Overall, safety, security, risk and resilience strategy alone is inadequate to create, alter or sustain frontline practices and culture where real hazards, threats, dangers and perils remain routine risks, alongside organisational and business risks that vary in scale and complexity up and down an organisation's structure and formations.
Tony Ridley, MSc CSyP MSyI M.ISRM
Security, Safety, Resilience & Risk Management Sciences
References:
Farjoun, M., & Starbuck, W. (2007). Organizing at and beyond the limits. Organization Studies, 28(4), 541–566.
Finkelstein, S. (2003). Why Smart Executives Fail: And What You Can Learn from Their Mistakes. New York: Portfolio.
Le Coze, J. (2019). Safety as strategy: Mistakes, failures and fiascos in high-risk systems. Safety Science, 116, 259–274.