In a recent call to arms, software magnate Marc Andreessen remarked on the widespread inability to respond swiftly, let alone appropriately, to the onset of the pandemic:

[...] the harsh reality is that it all failed — no Western country, or state, or city was prepared — and despite hard work and often extraordinary sacrifice by many people within these institutions. So the problem runs deeper than your favorite political opponent or your home nation.

I propose an explanation: Civilization-level threats, small or large, are dealt with by the "civilizational immune system" at hand. This emergent lattice of societal risk management finds its thousandfold cruxes in human perceptions and understandings of our environment, and, crucially, in what we do about them.

Much like the standard Homo sapiens model, our civilizational immune system is decentralized, composed of many individuals and clumps thereof that possess both the capability and means of action. Its memory stretches back a handful of generations, encompassing whatever the proximal practitioners (leukocyte equivalents) have encountered in their own lives, along with the gestalt metis that was conveyed by their mentors.


During late February and early March of this year, it became clear that many people, intelligent people who would gladly acknowledge the danger and dynamics of infectious disease in the abstract, were ill-equipped to accept that it could happen, for real, to them personally. A communally devastating plague that spreads to, then within, the First World? Gracious, what fanciful paranoia!

Even "experts" (writ large) did not conceive that, say, the United States Surgeon General would instruct Americans to become T-shirt ninjas in order to safely — in many places, lawfully — leave their homes. Yet evidently such an outcome was possible, given that it happened! Ironically, even the overeducated will trust their gut's synthesis of personal experience and manufactured consent over, well, anything else.

Should the early popular reaction be a surprise? When prospects of pandemic were last bandied about in the public sphere, we regular folks were never asked to stay home from work. Nobody asked us to do anything in particular at all! The uproar simply petered out and everyone went on their merry way. (If you got swine flu, my condolences.)

All that anxiety for nothing, and now you demand that we believe it's Real This Time™? Empirical evidence that I can observe directly — such as local authorities ordering a shutdown — or GTFO. (That said, YMMV.)


By now it's cliché to point out that several Asian countries were decisively cautious, making good calls quickly, probably due to recent experience with a rapacious respiratory infection. Huh... thank you, SARS? When faced with exponential growth, time is cheaper the sooner you start buying it!
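A toy calculation makes the point about buying time. All numbers here (seed size, doubling time, intervention days) are invented for illustration, not real epidemiological estimates:

```python
# Sketch: under exponential growth, acting earlier is disproportionately cheap.
# Hypothetical parameters — a 3-day doubling time and a seed of 100 cases.

def cases(day, doubling_time=3.0, seed=100):
    """Uncontrolled case count after `day` days of exponential growth."""
    return seed * 2 ** (day / doubling_time)

# Suppose a lockdown freezes growth wherever it stands.
early = cases(12)  # decisive country: acts on day 12
late = cases(18)   # hesitant country: acts just 6 days (2 doublings) later
print(early, late, late / early)  # 1600.0 6400.0 4.0 — a 4x larger problem
```

Six days of hesitation is invisible on a calendar but, at this doubling time, quadruples the caseload the slower country must manage.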

State capacity and willingness to use it were key, but so was the populace being aware that pathogens like SARS-CoV-2 are 1) a thing, 2) bad enough to be worth averting or avoiding at high cost.

What if it is indeed that simple? What if:

The more time has passed since a given type of tail risk last afflicted a given society, the more vulnerable the society will be to that tail risk. And the more materially devastated it'll be, relative to peers with situation-adapted civilizational immune systems, when that risk actually happens.

This nth-order phenomenon is a byproduct of status quo bias and normalcy bias, which I posit are rational:

The West hasn't dealt with a society-disrupting pandemic in a long while, so our civilizational immune system wasn't prepared for it. The data wasn't in the training set, if you will.

Granted, much of the infrastructure already in place goes unappreciated, yet its presence radically sped up response time. Imagine if we remained ignorant of germ theory, or lacked analytic tools like R0 and CFR versus IFR!
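For readers unfamiliar with the CFR/IFR distinction: the case fatality rate divides deaths by *detected* cases, while the infection fatality rate divides by *all* infections, including the mild and asymptomatic ones that never get tested. A minimal sketch with made-up numbers (not real COVID-19 figures):

```python
# Hypothetical outbreak numbers, chosen only to show why CFR and IFR diverge.
deaths = 50
confirmed_cases = 1_000    # infections that were tested and officially counted
total_infections = 5_000   # includes undetected mild/asymptomatic infections

cfr = deaths / confirmed_cases   # case fatality rate
ifr = deaths / total_infections  # infection fatality rate
print(f"CFR = {cfr:.1%}, IFR = {ifr:.1%}")  # CFR = 5.0%, IFR = 1.0%
```

Early in an outbreak, testing is scarce and skewed toward severe cases, so the observed CFR tends to overstate the true IFR — one reason early severity estimates swing so wildly.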


It is critical for our attitudes to be accurate. Not merely accurate with respect to the moment, but predictive of how we ought to act in subsequent moments. To be more precise: how we will end up concluding we ought to have acted, once everything shakes out. This meta purpose is encoded in our millennia-evolved psychological constructs, optimized for survival and perpetuation — AKA, constituents of the entity we call the mind.

My theory is that humans evolved to dismiss the possibility of whatever they haven't experienced themselves, without any conscious regard to probability. I would argue that our brains block us from explicit evaluation of risk when doing so might otherwise provide a chance to talk ourselves out of an overall protective heuristic.

Should we change? Well, when it comes to optimizing the allocation of scarce resources, I'd bet on evolution in the long run. Let's see you beat that track record 😉


Header photo via NIAID: "Scanning electron micrograph of methicillin-resistant Staphylococcus aureus and a dead human neutrophil."