Why Risk Management Students Think in Matrices But Act Without Judgement
Risk management graduates can populate a heat map but freeze when asked to make a real escalation call — understanding why reveals a fixable gap in programme design.
Ask a risk management graduate to draw a 5×5 likelihood-consequence matrix and they will do it fluently. Ask them to populate it for a realistic operational scenario and they will do it carefully. Then ask them to act on it — to call an escalation, challenge a senior colleague's risk tolerance, or recommend halting a project because the residual risk exceeds appetite — and the confidence disappears. The tool knowledge is there. The judgement to deploy it consequentially is not. That is the core employability problem in risk management education.
The Matrix Is a Tool, Not a Decision
Risk matrices, heat maps, FMEA templates, and bow-tie diagrams are all decision-support tools. They structure thinking. They do not make decisions. The professional skill that employers consistently report as underdeveloped in new risk hires is the ability to move from structured analysis to confident, defensible action — especially when the action involves challenging something or someone with more organisational authority. That skill is not developed by completing more risk analysis templates. It is developed by repeatedly making consequential choices and observing what happens.
“Employers consistently rate decision-making under uncertainty and the ability to influence without authority as the two most critical gaps in graduate-level risk management hires.”
— Chartered Insurance Institute Graduate Skills Survey, 2024
What 'Thinking in Matrices' Actually Looks Like
The pedagogical pattern that produces matrix-thinking students is well-intentioned but structurally flawed. Students are taught that risk management is a systematic process: identify, assess, treat, monitor. They are assessed on their ability to execute each stage correctly in isolation. The identification stage is examined via a PESTLE analysis. The assessment stage is examined via a populated risk register. The treatment stage is examined via a written evaluation of mitigation options. What is never assessed is the integration of all four stages under time pressure with incomplete information and competitive stakes.
The result is a student who knows the process abstractly but has never run it in conditions that resemble professional practice. When they arrive in a risk role and are asked to manage a live supplier crisis while preparing a board-level risk report, the systematic process they learned fragments under pressure because they have never practised integrating it.
The Role of Consequence in Developing Judgement
Judgement develops through consequence. That is the core insight from research on expertise development across disciplines, from medical education to military training to financial services. The mechanism is simple: a decision produces an outcome, the outcome is observed, the decision-maker updates their internal model of how the world works, and the next decision is better calibrated. Without consequence, without an outcome to observe and learn from, the feedback loop that builds judgement does not operate.
This is why simulation is not a luxury addition to risk education — it is a structural requirement for achieving IRM-level competency outcomes. SPPIN Sim creates consequence by making every risk decision visible in its downstream effects. A team that under-invests in supply chain diversification because the diversification cost feels high will see their resilience score decline across subsequent turns. That visible consequence, experienced in the moment of decision, is the mechanism through which risk judgement develops.
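The consequence loop described above can be made concrete with a toy model. The sketch below is purely illustrative and is not SPPIN Sim's actual scoring logic: it assumes a single hypothetical `resilience` score that erodes each turn, with the erosion rate depending on a made-up diversification investment decision, so that an under-investment choice becomes visible as a steadily declining score.

```python
# Hypothetical sketch of a turn-based consequence loop.
# None of these numbers or names come from SPPIN Sim; they are
# invented to illustrate how a decision surfaces in later turns.

def run_turns(invest_in_diversification: bool, turns: int = 5) -> list[float]:
    """Return the resilience score observed after each turn."""
    resilience = 100.0
    # Diversification spending offsets most of the per-turn erosion
    # caused by supplier concentration (assumed rates).
    erosion = 2.0 if invest_in_diversification else 8.0
    scores = []
    for _ in range(turns):
        resilience = max(0.0, resilience - erosion)
        scores.append(resilience)
    return scores

print(run_turns(invest_in_diversification=False))  # steady visible decline
print(run_turns(invest_in_diversification=True))   # much flatter trajectory
```

The pedagogical point is in the comparison of the two printed trajectories: the team that skipped the investment watches the consequence accumulate turn by turn, which is exactly the feedback a static risk register exercise never provides.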
Redesigning Assessment to Surface Judgement
Programmes that want to close the judgement gap need assessment instruments that can observe decision-making behaviour, not just describe it. A post-simulation reflective report that asks students to analyse the risk decisions they made during a live session — comparing their stated appetite with their revealed behaviour, explaining what triggered deviations, and evaluating the consequences — is a far more valid measure of IRM competency than an essay written about a historical case study.
SPPIN Sim as a Risk Judgement Development Environment
SPPIN Sim's risk simulation module was designed specifically to address the judgement gap. Sessions inject live crisis events derived from real-world news sources, forcing students to apply risk frameworks in conditions of genuine uncertainty. Competing teams create organisational dynamics — peer pressure, competitive stakes, visible leaderboard positioning — that replicate the social context of professional risk decisions. Tutors control the intensity of disruption, the frequency of events, and the transparency of competitor data, allowing the simulation to be calibrated to the experience level and learning objectives of the cohort.
See it in action
Book a free demo and watch the simulation run live with your cohort.