Six Dimensions of Catastrophic Risk
In 2015, a major C-PET theme will be risk.
Specifically, we will explore the value of risk-focused thinking as a guide to the development of policy and its priorities, and in particular six issues intersecting with science and technology policy in which framing the question in risk terms would shed fresh light and aid decision-making. Since Washington is pre-eminently a rational and long-term-focused community, we expect such an approach to commend itself widely.
Each of these questions is one in which a certain possible outcome would be catastrophic. The catastrophes are of different kinds and of different likelihoods, but none is impossible. The degree to which we can reduce their likelihood, their impact, or both is a matter that should greatly preoccupy our leaders. There is, sadly, little evidence that it does.
In the new year we plan to convene discussions on each of these questions. We hope you will join us.
1. Antibiotics. The Director-General of the World Health Organization has shared her concern that we may be entering a post-antibiotic era, in which our children will once again die of diphtheria.
2. Asteroids. There is a small but real possibility that a profoundly destructive strike could incapacitate or destroy civilization.
3. Robotics. From Ricardo to Keynes, economists have expressed concern that technology could remove the labor factor from much value creation. Larry Summers and Bill Gates have recently added their voices. Here is my San Francisco Chronicle op-ed on the subject.
4. Artificial Intelligence. Elon Musk, who has been suggested as Edison's true successor in the 21st century, has called it a "demon"; Stephen Hawking has expressed a similar view. The prospect of the Singularity is not seen by everyone as wonderful.
5. EMP. An electromagnetic pulse, whether caused by terrorists, an enemy state, or the Sun, could destroy much of our infrastructure.
6. Climate change. This is the sole issue of the six that has attracted serious policy attention, and even it should be recast as a risk issue.
I have proposed in the past that the presidential debates should include one devoted to risk, in which candidates stand before a basic 2x2 risk matrix, assign likelihood and impact to a series of risks, and explain how they plan to mitigate them. In each of these six cases, under a certain scenario the impact would be catastrophic. And none of them is "sci-fi" in character; all are regarded soberly by serious people as possible.
Yours soberly,
Nigel Cameron