The Biggest Barrier to Digital Learning Isn't Students — It's Educators (And How to Fix It)
Research using ISM/MICMAC analysis identified educator skill gaps as the single highest-driving-power barrier to digital transformation in UK universities. Here is what that means for programme directors — and what to do about it.
When a digital learning initiative fails in a UK university, the post-mortem almost always looks in the wrong direction. Students did not engage. The platform was confusing. The timing was wrong. These explanations are sometimes true, but they are almost never the root cause. The root cause, consistently identified in rigorous causal analysis of digital transformation in higher education, is educator skill gaps. Faculty who are not confident designing and running digital learning experiences cannot create them well enough for students to engage with them meaningfully. The tool is not the problem. The pedagogy is.
What the ISM/MICMAC Analysis Actually Found
ISM (Interpretive Structural Modelling) is a technique for mapping causal relationships between variables in a complex system. MICMAC analysis then plots each variable by its driving power (how much it influences other variables) and its dependence (how much other variables influence it). When applied to digital transformation barriers in UK HE, educator skill gaps consistently appear in the upper-left quadrant: high driving power, low dependence. That means it shapes almost everything else in the system and is relatively unaffected by other factors. Fix educator capability, and multiple downstream problems improve simultaneously. Ignore it, and every other investment in digital learning infrastructure underperforms.
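The mechanics behind that quadrant placement are straightforward to sketch. In MICMAC, driving power is the row sum of a variable's reachability matrix (how many barriers it drives, directly or transitively) and dependence is the column sum (how many barriers drive it). The toy matrix and barrier names below are purely illustrative, not the study's data:

```python
# Illustrative MICMAC classification with toy data (not the study's matrix).
# reachability[i][j] = 1 means barrier i drives barrier j, directly or transitively.
barriers = [
    "educator skill gaps",
    "student engagement",
    "platform usability",
    "assessment alignment",
]
reachability = [
    [1, 1, 1, 1],  # skill gaps drive everything (including themselves, by convention)
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 1],
]

mid = len(barriers) / 2  # simple midpoint threshold for the quadrant split

for i, name in enumerate(barriers):
    driving = sum(reachability[i])                      # row sum: what it influences
    dependence = sum(row[i] for row in reachability)    # column sum: what influences it
    if driving > mid and dependence <= mid:
        quadrant = "driver (upper-left: high driving power, low dependence)"
    elif dependence > mid and driving <= mid:
        quadrant = "dependent"
    elif driving > mid:
        quadrant = "linkage"
    else:
        quadrant = "autonomous"
    print(f"{name}: driving={driving}, dependence={dependence} -> {quadrant}")
```

With this toy matrix, "educator skill gaps" lands in the driver quadrant (driving power 4, dependence 1), which is the pattern the research describes: it moves the rest of the system while little in the system moves it.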
“Educator skill gaps are the number-one barrier to digital transformation in UK universities — the factor with the highest driving power in the entire causal system. Technology investments that bypass this variable consistently underdeliver.”
— ISM/MICMAC analysis, UK HE digital transformation research
The Three Dimensions of the Skill Gap
The educator skill gap is not a single deficit. It has three distinct dimensions that require different responses, and conflating them produces ineffective interventions.
- Technical confidence: knowing how to operate the platform well enough to avoid being visibly flustered in front of students during a live session
- Pedagogical design: knowing how to structure a digital learning activity so that it produces the intended competency outcomes rather than just surface-level engagement
- Assessment integration: knowing how to connect the activity to formal assessment in a way that satisfies quality assurance requirements and produces defensible evidence of learning
Most digital upskilling programmes for educators focus almost exclusively on the first dimension. They run workshops on how to use the LMS, how to record a video lecture, how to set up a quiz. These are useful but insufficient. An educator who can technically operate a simulation platform but has no framework for how to brief students, how to run the debrief, or how to map the experience to assessment rubrics will produce a session that students enjoy but cannot explain the value of — which is a short path to the activity being cut from the programme.
Why Traditional CPD Does Not Solve This
The standard institutional response to skill gaps is continuing professional development — a two-day workshop, an online module, a lunchtime seminar series. These interventions have two structural problems when applied to digital learning skills. First, they are temporally disconnected from practice. An educator who attends a simulation workshop in October and does not teach again until February has lost most of the procedural confidence by the time they need it. Second, they are context-free. Generic digital skills training does not tell a procurement lecturer how to run a CIPS-aligned simulation session on supplier risk assessment in a 90-minute slot with 45 students.
What actually works is embedded support — tools that are so well-designed that the pedagogical scaffolding is built into the platform itself. SPPIN Sim does this by providing ready-made simulation modules for 16 business disciplines, each with its own briefing materials, decision structures, and assessment rubric templates. The tutor does not need to design the experience from scratch. They need to select the right module, configure it for their cohort, and run it. The platform handles the complexity; the educator handles the facilitation.
The Confidence Cascade
One of the more interesting findings in digital transformation research is the speed at which confidence propagates once it is established. An educator who runs one successful simulation session — where students are visibly engaged, the technology works without drama, and the assessment evidence is automatically generated — becomes an advocate. They recommend the tool to colleagues. They present the session at a programme board meeting. They propose it as a standard element of the module. This is the confidence cascade, and it is the most cost-effective form of digital transformation available to a university.
The implication for heads of department is to identify the right first mover — someone with reasonable technical confidence, a cohort that would benefit from active learning, and a module structure that allows a 90-minute simulation slot — and support them to run a first session well. The institutional investment required is minimal. The downstream effect on adoption across the department can be substantial.
What Good Educator Support Looks Like in Practice
The tutor control panel in SPPIN Sim illustrates what genuinely educator-centred design looks like. Tutors can open and close turns, approve or reject AI-generated world events before they go live, release or hold the leaderboard, and monitor team progress in real time — all from a single dashboard. They are never in a position where the simulation runs away from them or produces something pedagogically inappropriate. Professional control always remains in the educator's hands.
This matters for the skill gap problem because one of the most common reasons educators resist adopting new tools is loss of control. A lecture is predictable. A badly designed digital activity is not. When the tool is designed to keep the tutor in command — rather than turning them into a passive observer while software does something unpredictable — the adoption barrier drops significantly.
The Institutional Mandate
Programme directors and heads of school who are serious about digital transformation need to make educator capability a funded priority rather than an expectation. That means ring-fencing time for educators to learn and practise new methods, choosing tools that minimise friction at the point of use, and measuring adoption at the module level rather than just the institutional level. The research is unambiguous: educator skill gaps are not a secondary concern to be addressed after the technology strategy is settled. They are the primary concern. Address them first and the technology strategy becomes far easier to execute.
See SPPIN Sim live — book a free demo
Discover how SPPIN Sim is designed from the ground up for educator confidence — with 16 ready-made modules, full tutor controls, and automated assessment evidence that make the first session as easy as the tenth.