Catastrophe
Published by Oxford University Press

ISBN 9780195178135, 9780197562444

Author(s):  
Richard A. Posner

You wouldn’t see the asteroid, even though it was several miles in diameter, because it would be hurtling toward you at 15 to 25 miles a second. At that speed, the column of air between the asteroid and the earth’s surface would be compressed with such force that the column’s temperature would soar to several times that of the sun, incinerating everything in its path. When the asteroid struck, it would penetrate deep into the ground and explode, creating an enormous crater and ejecting burning rocks and dense clouds of soot into the atmosphere, wrapping the globe in a mantle of fiery debris that would raise surface temperatures by as much as 100 degrees Fahrenheit and shut down photosynthesis for years. The shock waves from the collision would precipitate earthquakes and volcanic eruptions, gargantuan tidal waves, and huge forest fires. A quarter of the earth’s human population might be dead within 24 hours of the strike, and the rest soon after.

But there might no longer be an earth for an asteroid to strike. In a high-energy particle accelerator, physicists bent on re-creating conditions at the birth of the universe collide the nuclei of heavy atoms, containing large numbers of protons and neutrons, at speeds near that of light, shattering these particles into their constituent quarks. Because some of these quarks, called strange quarks, are hyperdense, here is what might happen: A shower of strange quarks clumps, forming a tiny bit of strange matter that has a negative electric charge. Because of its charge, the strange matter attracts the nuclei in the vicinity (nuclei have a positive charge), fusing with them to form a larger mass of strange matter that expands exponentially. Within a fraction of a second the earth is compressed to a hyperdense sphere 100 meters in diameter, explodes in the manner of a supernova, and vanishes.
By then, however, the earth might have been made uninhabitable for human beings and most other creatures by abrupt climate changes.


Author(s):  
Richard A. Posner

To deal in a systematic way with the catastrophic risks identified in chapter 1 requires first assessing them and then devising and implementing sensible responses. Assessment involves first of all collecting the technical data necessary to gauge, so far as that may be possible, the probability of particular risks, the purely physical consequences if the risks materialize (questions of value are for later), and the feasibility of various measures for reducing either the risks or the magnitude of the consequences by various amounts. The next step in the assessment stage is to embed the data in a cost-benefit analysis of the alternative responses to the risk. I am not proposing that cost-benefit analysis, at least as it is understood by economists, should be the decision procedure for responding to the catastrophic risks. But it is an indispensable step in rational decision making in this as in other areas of government regulation. Effective responses to most catastrophic risks are likely to be extremely costly, and it would be mad to adopt such responses without an effort to estimate the costs and benefits. No government is going to deploy a system of surveillance and attack for preventing asteroid collisions without a sense of what the system is likely to cost and what the expected benefits (roughly, the costs of asteroid collisions that the system would prevent multiplied by the probabilities of such collisions) are likely to be relative to the costs and benefits both of alternative systems and of doing nothing. The “precautionary principle” (“better safe than sorry”) popular in Europe and among Greens generally is not a satisfactory alternative to cost-benefit analysis, if only because of its sponginess—if it is an alternative at all. In its more tempered versions, the principle is indistinguishable from cost-benefit analysis with risk aversion assumed. Risk aversion, as we know, entails that extra weight be given to the downside of uncertain prospects.
In effect it magnifies certain costs, but it does not thereby overthrow cost-benefit analysis, as some advocates of the precautionary principle may believe.
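The expected-benefit reasoning described above can be sketched numerically. Every figure below is an illustrative assumption chosen for exposition, not an estimate drawn from the text:

```python
# Illustrative cost-benefit sketch for an asteroid-defense system.
# All numbers are placeholder assumptions, not data from the chapter.

annual_collision_probability = 1e-8   # assumed chance of a major strike per year
collision_cost = 1e15                 # assumed dollar cost if a strike occurs
horizon_years = 100                   # assumed planning horizon
system_cost = 2e9                     # assumed cost of the surveillance-and-attack system

# Expected benefit: the cost of the collisions the system would prevent,
# multiplied by their probability, summed over the horizon.
expected_benefit = annual_collision_probability * collision_cost * horizon_years

# A tempered precautionary principle amounts to cost-benefit analysis with
# risk aversion assumed: it magnifies the weight given to the downside.
risk_aversion_weight = 3.0
weighted_benefit = risk_aversion_weight * expected_benefit

print(f"expected benefit:      ${expected_benefit:,.0f}")
print(f"risk-weighted benefit: ${weighted_benefit:,.0f}")
print("deploy" if weighted_benefit > system_cost else "do nothing")
```

Note that under these assumed numbers the risk-neutral expected benefit falls short of the system's cost, while the risk-averse weighting tips the decision the other way, which is exactly the sense in which risk aversion magnifies certain costs without overthrowing cost-benefit analysis.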


Author(s):  
Richard A. Posner

I have said that the risk of catastrophe is growing because science and technology are advancing at breakneck speed. Oddly, this is a source of modest comfort. We do not know what the cumulative risk of disaster is today, but we know that it will be greater several decades from now, so there is time to prepare measures against the truly terrifying dangers that loom ahead. But we must begin. And the formulation and implementation of the necessary measures cannot be left to scientists, as we know. The role of law and the social sciences is crucial. The law, however, is making little contribution to the control of catastrophic risks. Likewise the social sciences, with the partial exception of economics, which has produced a significant scholarly literature on global warming. The legal profession may even be increasing the probability of catastrophe by exaggerating the cost to civil liberties of vigorous responses to threats of terrorism. Improvement in the response to catastrophic risks may require both institutional reforms and changes in specific policies, procedures, and doctrines. The legal system cannot deal effectively with scientifically and technologically difficult questions unless lawyers and judges—not all, but more than at present—are comfortable with such questions. Comfortable not in the sense of knowing the answers to difficult scientific questions or being able to engage in scientific reasoning, but in the sense in which most antitrust lawyers today, few of whom are also economists, are comfortable in dealing with the economic issues that arise in antitrust cases. They know some economics, they work with economists, they understand that economics drives many outcomes of antitrust litigation, and as a result they can administer—not perfectly but satisfactorily—an economically sophisticated system of antitrust law.
Economics, however, although at least quasi-scientific in method and outlook, and increasingly mathematized, is easier for lawyers and judges to get comfortable with than the natural sciences are. Because it plays an important role in many fields of law, economics is taught in law schools, whether in special courses on economic analysis of law or more commonly as a component of substantive law courses, such as torts, antitrust, securities regulation, environmental law, and bankruptcy.


Author(s):  
Richard A. Posner

I have said that the dangers of catastrophe are growing. One reason is the rise of apocalyptic terrorism. Another, however—because many of the catastrophic risks are either created or amplified by science and technology—is the breakneck pace of scientific and technological advance. A clue to that pace is that between 1980 and 2000 the average annual growth rate of scientific and engineering employment in the United States was 4.9 percent, more than four times the overall employment growth rate. Growth in the number of scientific personnel of the other countries appears to have been slower, but still significant, though statistics are incomplete. Of particular significance is the fact that the cost of dangerous technologies, such as those of nuclear and biological warfare, and the level of skill required to employ them are falling, which is placing more of the technologies within reach of small nations, terrorist gangs, and even individual psychopaths. Yet, great as it is, the challenge of managing the catastrophic risks is receiving less attention than is lavished on social issues of far less intrinsic significance, such as race relations, whether homosexual marriage should be permitted, the size of the federal deficit, drug addiction, and child pornography. Not that these are trivial issues. But they do not involve potential extinction events or the modestly less cataclysmic variants of those events. So limited is systematic analysis of the catastrophic risks that there are no estimates of what percentage either of the federal government’s total annual research and development (R & D) expenditures (currently running at about $120 billion), or of its science and technology expenditures (that is, R & D minus the D), which are about half the total R & D budget, are devoted to protection against them. Not that R & D is the only expenditure category relevant to the catastrophic risks. But it is a very important one. 
We do know that federal spending on defense against the danger of terrorism involving chemical, biological, radiological, or nuclear weapons rose from $368 million in 2002 (plus $203 million in a supplemental appropriation) to more than $2 billion in 2003.


Author(s):  
Richard A. Posner

The number of extreme catastrophes that have a more than negligible probability of occurring in this century is alarmingly great, and their variety startling. I want to describe them and in doing so make clear the importance of understanding what science is doing and can do and where it is leading us. I begin with the natural catastrophes and move from there to the man-made ones, which I divide into three groups: scientific accidents, other unintended man-made catastrophes, and intentional catastrophes. The 1918–1919 flu pandemic is a reminder that nature may yet do us in. The disease agent was an unexpectedly lethal variant of the commonplace flu virus. Despite its lethality, it spread far and wide because most of its victims did not immediately fall seriously ill and die, so they were not isolated from the healthy population but instead circulated among the healthy, spreading the disease. No one knows why the 1918–1919 pandemic was so lethal, although it may have been due to a combination of certain features of the virus’s structure with the crowding of troops in the trenches and hospitals on the Western Front (where the pandemic appears to have originated near the end of World War I), facilitating the spread of the disease. The possibility cannot be excluded that an even more lethal flu virus than that of the 1918–1919 pandemic will appear someday and kill many more people. There is still no cure for flu, and vaccines may be ineffective against a new mutant strain—and the flu virus is notable for its high rate of mutations. Another great twentieth-century pandemic, AIDS, which has already killed more than 20 million people, illustrates how important the length of the infectious incubation period is to the spread of a disease. The longer a person is infected and infectious yet either asymptomatic or insufficiently ill to be isolated from the healthy population, the farther the disease will spread before effective measures, such as quarantining, are taken.
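The effect of a long infectious-but-undetected period can be illustrated with a toy generational growth model. The function name and all parameter values below are assumptions made for exposition, not figures from the pandemics discussed above:

```python
# Toy model of early epidemic spread: each infected person infects r0 others
# per serial interval for as long as cases circulate undetected.
# All parameters are illustrative assumptions, not epidemiological data.

def cases_before_isolation(r0: float, serial_interval_days: float,
                           undetected_days: float) -> float:
    """Cumulative infections seeded before cases are recognized and isolated."""
    generations = int(undetected_days / serial_interval_days)
    # Geometric growth: 1 + r0 + r0^2 + ... over the undetected generations.
    return sum(r0 ** g for g in range(generations + 1))

# A short undetected window versus a long one, holding r0 fixed:
print(cases_before_isolation(r0=2.0, serial_interval_days=4, undetected_days=8))    # 7.0
print(cases_before_isolation(r0=2.0, serial_interval_days=4, undetected_days=40))   # 2047.0
```

Even in this crude sketch, lengthening the undetected period fivefold multiplies the seeded cases by nearly three hundred, which is the point the abstract makes about asymptomatic yet infectious carriers.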


Author(s):  
Richard A. Posner

To summarize very briefly: The risks of global catastrophe are greater and more numerous than is commonly supposed, and they are growing, probably rapidly. They are growing for several reasons: the increasing rate of technological advance—for a number of the catastrophic risks are created or exacerbated by science and its technological and industrial applications (including such humble ones as the internal combustion engine); the growth of the world economy and world population (both, in part, moreover, indirect consequences of technological progress); and the rise of apocalyptic global terrorism. And the risks are, to a degree, convergent or mutually reinforcing. For example, global warming contributes to loss of biodiversity, an asteroid collision could precipitate catastrophic global warming and cause mass extinctions, and cyberterrorism could be employed to facilitate terrorist attacks with weapons of mass destruction. Each catastrophic risk, being slight in a probabilistic sense (or seeming slight, because often the probability cannot be estimated even roughly) when the probability is computed over a relatively short time span, such as a year or even a decade, is difficult for people to take seriously. Apart from the psychological difficulty that people have in thinking in terms of probabilities rather than frequencies, frequencies normally provide a better grounding for estimating probabilities than theory does; frequent events generate information that enables probabilities to be confirmed or updated. The fact that there have been both nuclear attacks and, albeit on a very limited scale, bioterrorist attacks—which, however, resemble natural disease episodes, of which the human race has a long experience—has enabled the public to take these particular risks seriously. The general tendency, however, is to ignore the catastrophic risks, both individually and in the aggregate. 
Economic, political, and cultural factors, including the religious beliefs prevalent in the United States, reinforce the effect of cognitive factors (including information costs) in inducing neglect of such risks. The neglect is misguided. The expected costs of even very-low-probability events can be huge if the adverse consequences should the probability materialize are huge, or if the interval over which the probability is estimated is enlarged; the risk of a catastrophic collision with an asteroid is slight in the time span of a year, but not so slight in the time span of a hundred years.
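The closing observation, that a risk slight over a year need not be slight over a century, follows from elementary probability. Assuming independence across years and a purely illustrative annual probability:

```python
# How a slight annual probability compounds over a longer interval.
# The annual probability below is an assumed illustrative value.

annual_p = 1e-4   # assumed yearly chance of a catastrophic asteroid collision

def prob_within(years: int, p: float) -> float:
    """Probability of at least one occurrence in `years` years,
    assuming independent years: 1 - (1 - p)^years."""
    return 1.0 - (1.0 - p) ** years

print(f"within 1 year:    {prob_within(1, annual_p):.6f}")
print(f"within 100 years: {prob_within(100, annual_p):.6f}")
```

Under this assumption the century-scale probability is roughly a hundred times the annual one, so an expected cost that looks negligible on a one-year view can dominate a longer planning horizon.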

