Concept Guide: Risk Assessment & Management for HSE

Purpose

At Level 7, engineers do not need to be taught that “Risk = Likelihood × Severity.” Instead, this Advanced Reference Sheet explores the limitations, biases, and strategic failures inherent in standard risk management tools.

Goal: To equip Senior Engineers with the intellectual tools to audit and challenge the risk assessments provided to them by consultants or junior staff. You must learn to spot when a mathematical model is masking a critical safety flaw.

Advanced Concept 1: The Ontology of Risk (Uncertainty vs. Ambiguity)

Standard risk management treats risk as a “puzzle” to be solved with data. In complex engineering, we face two distinct types of uncertainty:

  1. Aleatory Uncertainty (Statistical Risk):
    • Randomness inherent in a known system (e.g., component failure rates).
    • Managed by: QRA, redundancy, and spare parts.
  2. Epistemic Uncertainty (Knowledge Gaps):
    • We don’t know what we don’t know (e.g., how a new alloy behaves in 50 years).
    • Managed by: Safety margins, pilot testing, and the “Precautionary Principle.”

Level 7 Insight: Most major project failures occur because managers apply Aleatory tools (like QRA) to Epistemic problems, creating a false sense of precision.
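
To make the distinction concrete, here is a minimal Python sketch (all rates, times, and bounds are invented for illustration). For the aleatory case a single probability is an honest answer; running the same calculation over an epistemic knowledge gap yields a band spanning orders of magnitude, which a single “final number” would conceal.

```python
# A minimal sketch (all rates and times invented) contrasting aleatory and
# epistemic uncertainty. Standard library only.
import math
import random

random.seed(1)
T = 8760.0  # mission time: one year of continuous operation, in hours

# Aleatory: the failure rate is well established from decades of field data.
known_rate = 1e-4  # failures per hour (assumed for illustration)
p_aleatory = 1 - math.exp(-known_rate * T)
print(f"Aleatory P(fail within 1 yr) = {p_aleatory:.3f}")  # one honest number

# Epistemic: the rate itself is a knowledge gap (e.g., a new alloy). Suppose
# all we can honestly claim is that it lies between 1e-6 and 1e-3 per hour.
samples = sorted(
    1 - math.exp(-(10 ** random.uniform(-6, -3)) * T)
    for _ in range(100_000)
)
p5, p50, p95 = (samples[int(len(samples) * q)] for q in (0.05, 0.50, 0.95))
print(f"Epistemic P(fail): 5th pct {p5:.4f} / median {p50:.3f} / 95th pct {p95:.3f}")
# Quoting only the median is the "false sense of precision" trap: the honest
# epistemic answer is a band spanning orders of magnitude, not a point.
```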

Advanced Concept 2: Critiquing Risk Models (The “Black Box” Problem)

The Trap: Consultants often provide impressive Quantitative Risk Assessments (QRAs) filled with complex calculations. Managers accept the “final number” (e.g., a 1 × 10⁻⁶ fatality risk) without questioning the inputs.

Model Risk Checklist:

  • Input Sensitivity: If you change one assumption by 10%, does the safety margin collapse?
  • Independence Assumption: Does the model assume two backup systems are independent? (In reality, a fire might destroy both cables simultaneously: a “Common Mode Failure”. See the sketch after this checklist.)
  • Data Validity: Is the failure rate data from 1980s generic industry tables, or specific to your operating environment?
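
The first two checklist items can be demonstrated in a few lines. The sketch below uses invented figures and a beta-factor treatment of common-cause failure (a standard simplification, not taken from any particular QRA): it shows how the “final number” for two redundant protection layers degrades once common-mode failure is admitted, and how to stress-test an input assumption by ±10%.

```python
# A minimal sketch (all figures assumed). Two redundant protection layers
# each fail on demand with probability p; a fraction beta of those failures
# is common-cause (one shared event, such as a fire, defeats both at once).
def system_pfd(p: float, beta: float) -> float:
    """P(both protection layers fail on demand), with common-cause fraction beta."""
    independent = ((1 - beta) * p) ** 2  # both layers fail for separate reasons
    common_cause = beta * p              # one shared cause defeats both
    return independent + common_cause

p, beta = 1e-3, 0.10  # assumed per-layer failure probability and beta factor

print(f"Naive model (independence assumed): {system_pfd(p, 0.0):.1e}")
print(f"With 10% common-cause fraction    : {system_pfd(p, beta):.1e}")
# The "final number" worsens by roughly 100x once common-mode failure is admitted.

# Input sensitivity: nudge the per-layer assumption by +/-10% and watch the output.
for bump in (0.9, 1.0, 1.1):
    print(f"p = {p * bump:.1e} -> system PFD = {system_pfd(p * bump, beta):.1e}")
```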

Advanced Concept 3: Inherently Safer Design (ISD) vs. Control

The “Hierarchy of Control” is often taught as a list. At Level 7, it is a design philosophy.
The Trevor Kletz Principle: “What you don’t have, can’t leak.”

Standard Safety (Bolt-on) vs. Inherently Safer Design (ISD):

  • Control: Add a cooling jacket to a runaway reactor. → Intensification: Use a smaller reactor so a runaway contains less energy.
  • Control: Add gas detectors and alarms. → Substitution: Use a non-volatile chemical.
  • Control: Write complex procedures. → Simplification: Design the pipework so it cannot be connected wrongly.

Strategic Rule: Any risk assessment that jumps straight to “Engineering Controls” (guards/alarms) without showing that ISD options were considered, and why they were rejected, is defective.

Advanced Concept 4: Cognitive Bias in Risk Estimation

Engineers like to think they are objective; decades of research on judgement and decision-making show otherwise. When estimating risk, your team is subject to:

  1. Optimism Bias: “We’ve done this 100 times before.” (Underestimating the probability of failure).
  2. Confirmation Bias: Seeking data that supports the decision to proceed, while ignoring “weak signals” of danger.
  3. Groupthink: Junior engineers fearing to challenge a Senior Manager’s “gut feeling.”
  4. Sunk Cost Fallacy: “We’ve spent £5M on this design; we can’t change it now just because the risk is higher than expected.”

Operationalization: A Level 7 Risk Review meeting must explicitly appoint a “Devil’s Advocate” to counter these biases.
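
To see what is at stake in a single biased estimate, the sketch below assumes a generic 5×5 risk matrix with invented decision bands (not any particular corporate standard). One band of Optimism Bias on likelihood is enough to move a catastrophic hazard out of the “intolerable” zone: exactly the move the Devil’s Advocate exists to challenge.

```python
# A minimal sketch (matrix bands invented) of how Optimism Bias skews a
# generic 5x5 risk matrix: shaving one band off likelihood quietly changes
# the decision without any change to the real hazard.
def risk_rating(likelihood: int, severity: int) -> str:
    """Classify a 1-5 x 1-5 matrix score into decision bands (illustrative)."""
    score = likelihood * severity
    if score >= 15:
        return "INTOLERABLE - redesign required"
    if score >= 8:
        return "Tolerable only if ALARP is demonstrated"
    return "Broadly acceptable"

severity = 5             # catastrophic consequence, unchanged by opinion
honest_likelihood = 3    # "possible"
biased_likelihood = 2    # "unlikely" - "we've done this 100 times before"

print("Honest scoring :", risk_rating(honest_likelihood, severity))
print("Biased scoring :", risk_rating(biased_likelihood, severity))
# 3x5 = 15 forces a redesign; 2x5 = 10 lets the project proceed. The Devil's
# Advocate's job is to make the team defend that one-band difference.
```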

Advanced Concept 5: Risk Communication as Strategy

The “Risk Thermometer” vs. Public Perception:

  • Technical Risk: “1 in 10,000 chance of death.”
  • Perceived Risk: “Is it voluntary? Is it dreadful? Do I trust the company?”

The Sandman Formula:

  • Risk = Hazard + Outrage

Strategic Application: If you ignore the “Outrage” factor (public fear, union concerns), your technically perfect risk assessment will be rejected by stakeholders. You must manage the perception of risk as carefully as the physics.

Targeted Critical Questions

  1. Critique: Why is a QRA dangerous if used as the sole basis for decision-making?
  2. Analyze: How does “Common Mode Failure” undermine the “Protection Layers” in a Bow-Tie diagram?
  3. Evaluate: Review a recent project in your workplace. Was Inherently Safer Design (ISD) considered, or did the team jump straight to “adding safety devices”?
  4. Reflect: Which cognitive bias is most prevalent in your organization’s management team? How does it skew risk acceptance?

Learner Task: Methodology Audit

Task Overview:

You are not asked to do a risk assessment. You are asked to audit one. Select an existing Risk Assessment or Safety Case (e.g., a QRA, a HAZOP report, or a complex Method Statement) from your workplace.

Your Mission:

Write a Critical Review Paper (1,500 words) titled: “Limitations and Hidden Assumptions in [Project X] Risk Strategy.”

Required Sections:

  1. Epistemic Critique: Identify where the assessment treats “unknowns” as “facts.” (e.g., “Assumed maintenance will happen every 6 months”—is this realistic?).
  2. Bias Check: Identify evidence of Optimism Bias or Groupthink in the risk scoring. (e.g., Are all catastrophic risks conveniently scored as “Rare”?).
  3. ISD Gap Analysis: Could the hazard have been eliminated by design? Why was this not done?
  4. Strategic Recommendation: Advise the Board on whether this risk assessment is robust enough to rely on for a £1M+ investment decision.

Note: This task tests your ability to judge the quality of others’ thinking, a core skill for a Level 7 Technical Director.