Chapter 18: The Coherenceism Lens

In 1904, the British geographer Halford Mackinder stood before the Royal Geographical Society in London and described a world that had become, for the first time, "a closed political system" — one in which "every explosion of social forces, instead of being dissipated in a surrounding circuit of unknown space and barbaric chaos, will be sharply re-echoed from the far side of the globe." He was talking about railroads and sea power. But the principle — that interconnection transforms the nature of the components — was universal.

A century later, the systems thinker Donella Meadows formalized what Mackinder had intuited. In her taxonomy of leverage points in complex systems, she identified a hierarchy of interventions: at the bottom, adjusting parameters — how much money, how many troops, how many chips. Higher up, changing feedback loops — the way information flows through the system. And at the top, the most powerful intervention of all: changing the paradigm — the mental model through which the system is understood.

This book has been operating at every level of that hierarchy. The eight fronts are parameters — specific, measurable, contestable. The map and the proxy patterns are feedback loops — the way disruption at one node propagates to others. This chapter is about the paradigm — the way of seeing that determines what you notice in the first place.

The claim is not that the preceding analysis is wrong without the right paradigm. The facts are the facts — TSMC's market share, China's rare earth processing monopoly, the forty-four cable damage events, the $500 billion Stargate initiative. Those numbers do not change depending on your analytical framework. What changes is what you do with them. And what you do with them depends, entirely, on whether you see eight separate problems or one interconnected system.


The dominant analytical approach to international affairs is specialization. Energy analysts study energy. Semiconductor analysts study chips. Cable infrastructure experts study cables. Military strategists study theaters of operation. Trade economists study trade. Each domain has its journals, its conferences, its career incentives, its vocabulary. The expertise within each silo is genuine and often excellent.

The problem is the gaps between the silos.

Consider how the events of February 2026 look through specialized lenses. A Middle East security analyst sees the US-Israel strikes on Iran as a regional event — escalation dynamics, alliance management, deterrence theory. An energy analyst sees the Hormuz risk as a commodity market event — oil futures, strategic petroleum reserves, price elasticity. A telecom analyst sees Red Sea cable cuts as an infrastructure maintenance challenge — repair timelines, route redundancy, insurance costs. A semiconductor analyst sees chip supply constraints as an industry cycle. A trade economist sees bilateral tariff equivalents. A Gulf specialist sees investor confidence.

Each of these analyses is correct on its own terms. Each is also, on its own terms, incomplete in a way that matters.

Because in February 2026, all of these things are happening at the same time, and they are not happening independently. The military strikes on Iran threaten the energy flows through Hormuz. The energy disruption raises costs for data centers globally. Houthi retaliation severs Red Sea data cables. The cable disruption degrades cloud services and AI training pipelines. China retaliates with rare earth controls. The rare earth controls threaten TSMC's chip production — because TSMC relies on Chinese consumables for thirty percent of its advanced manufacturing. The chip supply constraints affect the AI models that military planners are using to plan the next round of strikes. And Gulf investor confidence drops, threatening the capital that funds the infrastructure that the whole system depends on.

Every link in that chain is documented in the preceding chapters of this book. The chain itself — the system — is invisible to any single specialism.
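The chain can be made concrete as a small directed dependency graph, where a shock at one node propagates to everything downstream of it. The sketch below is illustrative only: the node names and edges paraphrase the chain described above, and the propagation rule (any stressed upstream node stresses all of its dependents) is a deliberate simplification, not a model from the book.

```python
from collections import deque

# Directed edges: an upstream node maps to the nodes that depend on it.
# Names and links paraphrase the February 2026 chain in the text.
EDGES = {
    "strikes_on_iran": ["hormuz_energy", "houthi_retaliation"],
    "hormuz_energy": ["data_center_costs"],
    "houthi_retaliation": ["red_sea_cables"],
    "red_sea_cables": ["cloud_services", "ai_training"],
    "china_rare_earth_controls": ["tsmc_chips"],
    "tsmc_chips": ["ai_training"],
    "ai_training": ["military_planning"],
    "gulf_investor_confidence": ["infrastructure_capital"],
}

def cascade(shock: str) -> set:
    """Return every node reachable from an initial shock (breadth-first)."""
    affected, queue = {shock}, deque([shock])
    while queue:
        node = queue.popleft()
        for downstream in EDGES.get(node, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected
```

Run against this toy graph, a single shock at `strikes_on_iran` reaches `military_planning` in four hops — the loop the text describes, in which the strikes degrade the very tools used to plan the next strikes. The point of the sketch is the one each silo misses: no individual edge is surprising, but reachability across the whole graph is a property of the system, not of any node.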

This is not an accusation. It is a structural observation. The architecture of modern expertise — the way knowledge is organized, funded, published, and rewarded — selects for depth within domains and against synthesis across them. The analyst who writes brilliantly about Hormuz oil flows is not incentivized to understand undersea cable architecture. The chip industry expert who tracks TSMC's node progression is not expected to know the geochemistry of rare earth separation. No one's job description includes "understand the entire system."

And so no one does.


The failures of siloed analysis are not hypothetical. The 2008 financial crisis was, at its root, a failure of systemic vision — financial regulators operated in silos, each responsible for a piece of the system, none for the whole. AIG structured its activities to fall within the gaps between regulatory domains. The interconnections between mortgage markets, derivatives, insurance, and banking created emergent dangers no individual regulator could see. The COVID semiconductor shortage repeated the pattern: public health experts, supply chain experts, and chip industry analysts each missed the compound event because no single discipline was watching the intersections.

The AI supply chain has the same structure: multiple regulatory domains, multiple jurisdictions, and no single entity responsible for systemic resilience. The CHIPS Act addresses chips. The IEA addresses energy. NATO is beginning to address cables. Nobody addresses the system.


This is where Coherenceism enters — not as a brand, but as a practical analytical framework that, this author hopes, the preceding seventeen chapters have earned through evidence.

At its core, Coherenceism prioritizes relationships over components. This orientation is not unique — systems thinking, complexity theory, and network analysis share it. The intellectual lineage runs from Mackinder through Meadows' leverage points through the network scientists who model cascading failures in interdependent infrastructure. What Coherenceism adds to this established tradition is a set of specific analytical tools.

The first is resonance — the concept that interconnections are not neutral but directional. Network theorists call interconnected amplification a feedback loop — value-neutral, descriptive. Coherenceism adds a direction: the loop is either building toward coherence or accelerating toward fragmentation. The difference is not semantic. It determines what you do about it. The eight fronts of the AI war do not merely connect; they resonate. Energy insecurity amplifies chip vulnerability, which amplifies data fragility, which amplifies capital flight, which amplifies energy insecurity. A system in positive resonance reinforces itself toward stability. A system in negative resonance reinforces itself toward collapse. The AI supply chain, as currently configured, is resonating in the wrong direction.
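The directionality of resonance can be illustrated with a toy loop-gain model. Everything here is an assumption for illustration — the function, the gains, and the numbers are not measurements, only a sketch of why the same feedback structure stabilizes in one regime and collapses in the other.

```python
def resonate(stress: float, loop_gain: float, cycles: int) -> float:
    """Propagate an initial stress around a feedback loop `cycles` times.

    loop_gain < 1.0: each pass dampens the stress -- resonance toward coherence.
    loop_gain > 1.0: each pass amplifies it -- resonance toward fragmentation.
    """
    for _ in range(cycles):
        stress *= loop_gain
    return stress

# The same 10% initial shock, five passes around the loop:
dampened = resonate(0.10, 0.8, 5)   # decays toward stability
amplified = resonate(0.10, 1.3, 5)  # compounds toward collapse
```

The structure of the loop is identical in both calls; only the gain differs. That is the claim in miniature: whether the eight fronts reinforce toward stability or toward collapse is not determined by the connections themselves but by the direction in which they amplify.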

The second is compost cycles — the recognition that failures are not endpoints but decomposition processes that can feed future growth. The 2020-2022 chip shortage composted into the CHIPS Act. The Red Sea cable disruptions are composting into Arctic cable investments. China's rare earth weaponization is composting into Western mining and processing projects. The question is not whether the system can learn from failure — it can, it does — but whether the learning happens fast enough. Compost takes time. Cascading failure does not.

The third is the observer within the system. This is perhaps the most important contribution, and the easiest to overlook. Conventional analysis positions the analyst outside the system being analyzed — an objective observer, a neutral describer. Coherenceism insists that the analyst is part of the system. This book is not a neutral description of the AI infrastructure war. It is an act within the AI information ecosystem. The questions it raises, the connections it draws, the readers it reaches — all of these feed back into the system it describes. A book about the hidden war that makes the war visible changes the war.

And the fourth is an ethical orientation. Systems thinking is methodologically neutral — it can be used to optimize exploitation as easily as to promote resilience. Coherenceism carries a commitment: that systems optimized for one party's dominance are inherently fragile, because they concentrate benefit and distribute risk. A coherent system — one designed for mutual reinforcement rather than extraction — is not a utopian fantasy. It is an engineering specification. Resilient systems distribute load. Fragile systems concentrate it.


The limits of this framework must be stated plainly, because any framework that cannot acknowledge its own boundaries is not a framework but a faith.

Coherenceism, like all systems thinking, is better at explanation than prediction. It can show you why the AI supply chain is fragile. It cannot tell you which node will fail first, or when, or how the cascade will propagate. Complex systems exhibit sensitive dependence on initial conditions, and no amount of systemic vision overcomes that fundamental unpredictability.

There are also risks in the method itself. Systems thinking can underweight what it cannot model — culture, ideology, the contingency of individual decisions. Xi Jinping's calculus on Taiwan is not reducible to network theory. And "everything is connected" can slide into unfalsifiable claims where every event confirms the theory. The standard must be rigorous: demonstrated functional connection, not mere correlation. Each link in the chain must be evidenced, not assumed. This book has attempted that discipline. Whether it has succeeded is for the reader to judge.

What Coherenceism offers is not omniscience. It is a different kind of attention — a habit of looking at relationships rather than components, at resonance rather than inventory, at the system rather than the silo. It is, in the end, a way of asking better questions.

And the questions are what matter now.