Wednesday, March 11, 2020


Do we have your attention yet? I ran across the Cambridge Centre for the Study of Existential Risk, which thinks about the tail events that could destroy civilization.

Here is a nice thought to keep you up at night, given how unprepared our governments have revealed themselves to be. It's an old thought, but perhaps one our governments will start to take more seriously:
there is a trade-off in natural pandemics between transmissibility and lethality – if a pathogen kills its host too quickly, the host can’t infect many other people. But due to biotechnological advances, it may soon be possible to engineer pathogens to be more infectious, more fatal, and to have a delayed onset – and so be far more dangerous.
New breakthroughs like the targeted genome editing tool CRISPR-Cas9 are increasing our capabilities; and the cost of DNA sequencing/synthesis and the hurdle of expertise are rapidly decreasing. ...
An engineered pandemic could escape from a lab, or it could be deliberately used as a weapon. During the 20th century several countries had state-run bioweapons programmes, and we know of several non-state groups that have attempted to acquire bioweapons.
Working almost single-handedly, one postdoc was recently able to recreate horsepox (a relative of smallpox, which killed some 300 million people in the 20th century) from scratch in only six months. Capabilities that were once only in the hands of governments will soon be within reach of non-state actors.
A novel pathogen, designed to be deadlier than anything in nature, could severely affect the entire world. As Lord Rees has said, "The global village will have its village idiots, and they'll have global range."
Now think about a terrorist group or a country developing both the virus and the vaccine, which would otherwise take a year to produce. It's like a James Bond movie, except entirely realistic.


  1. The incentive NOT to use this garbage is very strong, given that the perpetrator has a real chance of catching it himself and passing it on to his brethren.

  2. Egads, and the novel Wuhan coronavirus emanates from one of only two cities in China that have a lab capable of weaponizing a virus.

    So, as a test, create a highly infectious, generally benign virus with a delayed onset (COVID-19)? The Communist Party of China would never do that---right?

    Next time, if used in earnest, make the virus much more lethal (after vaccinating domestic population).

    Re coronavirus: Experts say the malady was originally a bat virus, not something perpetrated by commie madmen in Beijing. But of course, a clever virologist would mutate a bat virus; bat viruses are known to hop the species line (SARS was a bat virus, or so the experts say). This tactic would fool forensic virologists.

  3. Remember the movie 12 Monkeys with Bruce Willis? This post reminds me of that plot, where a disillusioned scientist creates a superbug that wipes out most of human civilization while animals live on. Scary stuff that's not just in the movies anymore.

  4. The difficulty with the project of "existential risk mitigation" by governments and academics is precisely the same as the one which pervades, for instance, "systemic risk mitigation" in the financial system.

    In both cases, the impulse of the academic or the "policymaker" is to attempt to divine the specific risks in advance, a task impossible to accomplish, and then to design highly specialized countermeasures against the specific threat they have conceived of. How could people have foreseen, for instance, that the distribution of uranium reserves on the surface of the Earth, or the concentration of carbon dioxide in its atmosphere, would one day become central concerns in the mitigation of existential risks? Are we not, most likely, blundering into another unforeseen risk right now, of which we remain blissfully ignorant?

    The appropriate way to address risk isn't to hope that every potential risk will be identified by the clairvoyance of some academic or bureaucrat; it's to design the structure in question so that it can absorb unexpected risks, to whatever extent possible, without grinding to a halt or collapsing altogether. Narrow banking can do that in finance. There is no similar remedy for bureaucratic behemoths, but surely the answer is not to prepare exceptionally detailed plans of action for highly specific risks that will most likely never come to pass in the way we envisioned.

    The frightening aspect of thought experiments such as the one in the post is that they show clearly how much the structure of our civilization is predicated on not having to cope with unexpected, high-impact risks. Our situation is analogous to LTCM, happily humming along in the late 1990s with confidence in their "models" of risk.

    One can only hope that we won't meet the same rude awakening as they did in the end.

  5. Why get fancy with CRISPR-Cas9 when you could just re-release something like smallpox? We can barely get CRISPR to work reliably under ordinary lab conditions, and you're worried about someone designing a superbug in secret?

  6. Bugs? How about being hit by an asteroid? Giant shifts along the Pacific rim? Yellowstone blowing up?

    There are only two things that could survive such calamities: cockroaches and Keith Richards.

  7. What about MAD? We develop similar strains. If they infect us, we infect them.
    Is the threat of mutually assured destruction enough to prevent these virus attacks?


Comments are welcome. Keep it short, polite, and on topic.

Thanks to a few abusers, I am now moderating comments. I welcome thoughtful disagreement. I will block comments with insulting or abusive language, and I'm also blocking totally inane comments. Try to make some sense. I am much more likely to allow critical comments if you have the honesty and courage to use your real name.