The Anti-Preparedness Paradox
March 15, 2025

Consider two civilizations, Civilization A and Civilization B, with different strategies for mitigating asteroid impacts.
- Civilization A is highly concerned about the possibility of asteroid impacts. They invest heavily in asteroid monitoring and deflection technologies to prevent such impacts from occurring (let's call this a prepared strategy).
- Civilization B is not concerned about asteroid impacts at all. They invest no resources in detecting or mitigating such risks, considering them too unlikely to be worth worrying about (a neglectful strategy).
Let's suppose there is some baseline probability of an asteroid impact in any given year. What will these civilizations observe over time?
- Civilization A periodically detects and deflects incoming asteroids, preventing any impacts from occurring. Therefore, they evaluate their strategy as having been successful.
- Civilization B takes no actions to prevent asteroid impacts. If no impact occurs, they evaluate their strategy as having been successful. If an impact does occur, the civilization is extinguished and can no longer evaluate its strategy.
This leads to a strange sort of "anti-preparedness paradox": failing to prepare for a risk makes the risk appear less likely. The key dynamic is that extinction events remove observers, so the surviving record is biased toward strategies that appear successful. (This is similar to the concept of Anthropic Shadow, but in the context of resource allocation rather than baseline probability assessment.)
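This selection effect is easy to see in a quick Monte Carlo sketch (a minimal illustration; the annual impact probability, time horizon, and trial count below are assumed values, not figures from this post). Conditional on survival, a neglectful civilization's record contains zero impacts by construction, so every surviving observer judges neglect a success, even though only about a (1 - p)^n fraction of neglectful histories survive n years at a per-year impact probability p.

```python
import random

# Illustrative assumptions (not from the post): annual impact probability,
# evaluation horizon, and number of simulated histories.
IMPACT_PROB = 0.01
YEARS = 200
TRIALS = 100_000

def run_history(prepared: bool) -> bool:
    """Simulate one civilization's history; return True if it survives."""
    for _ in range(YEARS):
        if random.random() < IMPACT_PROB:
            if not prepared:
                return False  # unmitigated impact: observers removed
            # Prepared: the asteroid is detected and deflected; no impact occurs.
    return True

for prepared, label in [(True, "A (prepared)"), (False, "B (neglectful)")]:
    survivors = sum(run_history(prepared) for _ in range(TRIALS))
    # By construction, every surviving history contains zero impacts, so
    # 100% of surviving observers rate their own strategy as successful.
    print(f"Civilization {label}: {survivors / TRIALS:.1%} of histories survive, "
          f"and all survivors judge their strategy a success")
```

With the assumed p = 0.01 over 200 years, only about 13% of neglectful histories survive, yet every one of those surviving observers, looking only at their own history, concludes that neglect worked.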
The existence of this paradox implies that we should be highly skeptical of neglectful strategies toward existential risks: such strategies may appear more successful "from the inside" than they really are.
Last updated: March 16, 2025