Early Failure Signals: How Systems Reveal Strain Before Collapse
Applied Kinetic Humanism #6
This is one of the system pieces. If you are new here, the more grounded posts are usually a better place to begin.
A system does not fail at the moment it breaks. It fails in the small interval after strain rises and before anyone recognises that the shape has begun to change. Most of this happens quietly. Load accumulates in places nobody is watching. Routines loosen without anyone consciously relaxing them. Early drift spreads through areas of the structure that previously held their form without effort. Outwardly the day still looks organised. The surface stays calm even as the foundation begins to shift.
This piece follows directly from the previous entry on load. If load is the pressure that weakens coherence, then early failure signals are the first signs that the system’s built-in adjustments are no longer enough to keep things steady. They are the indicators that the structure can still function but no longer corrects itself as easily as it once did.
Systems rarely collapse suddenly. They begin by showing a small, recurring set of changes. These changes appear across different fields, from Diane Vaughan’s research on weakening standards to James Reason’s descriptions of latent conditions that build quietly before they matter. The signals are consistent, even if they surface differently depending on the environment.
Loss of Resistance
Healthy systems contain a small amount of resistance. This resistance is not friction. It is the effort that keeps behaviour aligned. Equipment returns to its proper place rather than staying where it was last used. A handover includes the information that prevents tomorrow’s work from starting at a disadvantage. This resistance is what turns coherence from a concept into something the system actively maintains.
When strain rises, that resistance weakens. Vaughan’s work on normalisation shows how protective checks are often the first to soften. Nobody consciously lowers standards. It simply feels easier not to perform the small adjustment. Tasks feel smoother. Minor resets are skipped. Information that once mattered is left unsaid because the next action feels more urgent.
The ease creates a false sense of improvement. It appears because the system has stopped resisting drift. James Reason observed that systems often “make the unsafe normal” through repeated micro-skips in correction. When resistance falls, behaviour follows the path of least effort. That path may feel efficient in the moment, but it also allows early drift to settle in.
Silent Compensation
People compensate long before they complain. Gary Klein’s studies on real-world decision making show how individuals quietly correct weak points to preserve flow. They do not announce this. They simply absorb the gap. Someone does extra hours without mentioning why. Someone reorganises a cluttered area because it is quicker than raising the issue. These actions appear helpful, and often they are, but they hide the early signs of strain.
Silent compensation also appears when tasks begin to settle on the same person. A volunteer checks the donation bin more often because nobody else notices it filling. A member of staff rewrites unclear labels because the system no longer produces clarity on its own. These behaviours keep the day moving, but they disguise the imbalance underneath.
Karl Weick and Kathleen Sutcliffe describe this as the “hidden work of stability.” The system appears normal because someone is picking up what the structure can no longer hold. By the time anyone sees the extra effort, the underlying pattern has already shifted.
Time Distortion
Time is often the first dimension to distort. Daniel Kahneman’s work on cognitive load shows that even small increases in effort narrow attention and slow correction long before people notice the change. Systems show the same behaviour.
A task that always took ten minutes now takes twelve. Queues form earlier in the day. Transitions stretch. The atmosphere feels slightly behind itself. The difference rarely feels alarming. It feels like a minor variation. Most people cannot name when this shift began because the pattern only becomes visible when compared with the rhythm the system used to hold.
When time no longer behaves as expected, drift is already present. The structure is working harder to produce the same outcome.
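To make the scale concrete, here is a toy calculation rather than a measurement: assume a day built from thirty ten-minute tasks with fifteen minutes of slack, and let each task quietly take twelve minutes instead. The numbers are invented; what matters is how small the per-task difference is compared with what it does to the whole day.

```python
# Toy calculation: how a small per-task slowdown compounds across a day.
# All numbers are illustrative assumptions, not measurements.

tasks_per_day = 30
baseline_minutes = 10       # what each task "always took"
strained_minutes = 12       # what it quietly takes now
slack_minutes = 15          # assumed slack built into the schedule

baseline_total = tasks_per_day * baseline_minutes   # 300 minutes
strained_total = tasks_per_day * strained_minutes   # 360 minutes
print(f"Extra time across the day: {strained_total - baseline_total} minutes")

# Queues start forming once the accumulated lag exceeds the slack.
for n in range(1, tasks_per_day + 1):
    lag = n * (strained_minutes - baseline_minutes)
    if lag > slack_minutes:
        print(f"The schedule falls behind after task {n}, with {lag} minutes of lag")
        break
```

Two extra minutes never feels like a signal. An hour of accumulated lag does, but by then it reads as a bad day rather than a pattern.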
Local Drift
Drift begins in small pockets. Donella Meadows noted that when feedback weakens, minor deviations remain uncorrected and begin to accumulate. A storage area that used to stay organised now falls out of order after every shift. A daily check that once kept things aligned is now done irregularly. A routine meeting drifts from its original purpose and never fully returns to it.
These pockets behave independently at first. Over time they begin to influence one another. A backlog in one part of the workflow delays the next stage. A shift in how tasks are logged slows someone else downstream. Per Bak’s work on self-organised criticality describes the same pattern in physical systems: quiet local disturbances accumulate until they force a broader reorganisation.
Local drift becomes system drift when the pockets connect. At that point, restoring coherence requires far more energy than the small correction that would have prevented the spread.
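Meadows’ point can be put in the form of a toy model: a deviation that grows a little each day and is only pulled back when a correction actually happens. The parameters are invented and the sketch claims nothing beyond its own arithmetic, but the shape is clear enough: the daily slippage is the same in both runs, and only the chance of correction changes.

```python
import random

def simulate(days, correction_chance, drift_per_day=1.0, correction_size=1.0, seed=1):
    """Toy model of local drift: a small deviation accumulates each day and is
    only pulled back when a correction actually happens."""
    random.seed(seed)
    deviation = 0.0
    for _ in range(days):
        deviation += random.uniform(0, drift_per_day)            # daily slippage
        if random.random() < correction_chance:
            deviation = max(0.0, deviation - correction_size)    # the small reset
    return deviation

# Same daily slippage in both runs; only the chance of correction changes.
print(f"Corrections still routine:  deviation ≈ {simulate(90, 0.9):.1f}")
print(f"Corrections often skipped:  deviation ≈ {simulate(90, 0.3):.1f}")
```

When corrections happen often enough, the deviation stays bounded. When they are skipped often enough, it simply grows, which is the point at which the pockets start to connect.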
Pattern Softening
Stable systems rely on predictable sequences. Prepare, process, hand over, complete, reset. These sequences act as guardrails that keep behaviour aligned even when the day becomes busy. When load rises and drift gains ground, these sequences soften. Handover steps become optional. The order in which tasks occur varies. Shared expectations lose clarity. Meetings that once set direction become brief recaps of what has already happened.
Weick observed that under rising ambiguity organisations shift from structure to improvisation. Pattern softening is often mistaken for flexibility. It is not. It is the moment where the system stops shaping behaviour and begins accepting whatever behaviour surfaces next. Once the underlying sequence loosens, the day becomes harder to steer back onto its original line because the line is no longer clear.
Diminished Feedback Sensitivity
Attention narrows under strain. Noise rises. Feedback loses resolution. Grace Lindsay’s work on signal processing shows how noise obscures weak signals long before a system realises what is missing. Human environments behave the same way. Signals that once appeared clear become faint or ambiguous.
Quietness usually reveals this. When the environment settles, small imbalances surface quickly. If the system does not reconnect with those signals during quieter intervals, the drift visible in low noise becomes the behaviour that dominates in high noise.
Feedback failure is gradual. It is the slow reduction in a system’s ability to recognise itself.
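The same idea can be sketched loosely in the language of signal detection, rather than taken from Lindsay’s book directly: the imbalance is identical in both conditions and only the background noise changes, yet the uncertainty around the estimate grows until it is larger than the signal itself.

```python
import random
import statistics

def drift_estimate(drift, noise_level, samples=20, seed=1):
    """Estimate a small, steady imbalance from noisy daily readings and
    report how much uncertainty surrounds the estimate."""
    random.seed(seed)
    readings = [drift + random.gauss(0, noise_level) for _ in range(samples)]
    mean = statistics.mean(readings)
    stderr = statistics.stdev(readings) / samples ** 0.5
    return mean, stderr

imbalance = 0.5   # the same small drift in both conditions (an assumed value)
for label, noise in [("Quiet period", 0.5), ("Busy period ", 3.0)]:
    estimate, uncertainty = drift_estimate(imbalance, noise)
    print(f"{label}: estimated drift {estimate:.2f}, uncertainty ±{uncertainty:.2f}")
```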
When Early Failure Becomes Direction
The critical moment in early failure is not collapse. It is normalisation. Shortcuts become routine. Slower timing becomes acceptable. Compensation becomes expected. Standards drift further without being noticed. What began as strain becomes the system’s new baseline.
Once drift becomes direction, recovery requires far more energy than the minor correction that would have prevented the shift. Early failure is not the point where damage occurs. It is the point where the system accepts the behaviour that leads to it.
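That shift can also be put in the form of a toy model in which the system’s sense of “normal” keeps adapting toward whatever it most recently saw. The drift rate and adaptation rate below are invented; the only point is that the gap the system perceives stays small while the baseline itself travels a long way.

```python
def normalisation(days, drift_per_day=0.5, adaptation=0.1):
    """Toy model of normalisation: behaviour drifts steadily, while the
    system's sense of 'normal' keeps adapting toward whatever it just saw."""
    behaviour = 0.0   # how far practice has moved from the original standard
    baseline = 0.0    # what the system currently treats as normal
    for _ in range(days):
        behaviour += drift_per_day                        # steady, unrepaired drift
        baseline += adaptation * (behaviour - baseline)   # 'normal' follows behaviour
    return behaviour, baseline

behaviour, baseline = normalisation(90)
print(f"Actual drift from the original standard: {behaviour:.1f}")
print(f"Where 'normal' now sits:                 {baseline:.1f}")
print(f"Gap the system still perceives:          {behaviour - baseline:.1f}")
```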
Closing Reflection
A system does not reveal its failures through noise. It reveals them through subtle changes that resemble normal variation in pace, attention and effort: the moments where resistance falls, compensation spreads, time stretches and the structure quietly reorganises itself around lower expectations.
Early failure begins the moment the structure forgets how to hold its own shape. The next piece will look at how to restore that shape before drift becomes direction.
Further Reading
Donella Meadows, Thinking in Systems (2008).
James Reason, Human Error (1990).
Diane Vaughan, The Challenger Launch Decision (1996).
Karl E. Weick, Sensemaking in Organizations (1995).
Karl E. Weick and Kathleen M. Sutcliffe, Managing the Unexpected (2001).
Gary Klein, Sources of Power (1998).
Grace W. Lindsay, Models of the Mind (2021).
Per Bak, How Nature Works (1996).
Daniel Kahneman, Thinking, Fast and Slow (2011).
Notes
Header photo by Patrick Perkins on Unsplash

