Future Babble Part 7
James Howard Kunstler is an American but he doesn't like much about the United States. A journalist, social critic, and novelist, Kunstler is famous for his firecracker writing style and his bleak views on suburbia, consumerism, and popular culture. "Our practices and habits in place-making the past half-century have resulted in human habitat that is ecologically catastrophic, economically insane, socially toxic, spiritually degrading, and fundamentally unsustainable," he wrote in a typically frenzied burst. "We have built a land of scary places and become a nation of scary people."
Kunstler first heard about the Y2K computer glitch in 1998. It was a revelation. "For five years, I have been flying around the country telling college lecture audiences and conference-goers that our fucked-up everyday environment of strip malls, tract houses, outlet malls, parking lots, and other accessories of the national automobile slum was liable to put us out of business as a civilization," he wrote in an April 1999 essay. "I asserted that the culture growing in this foul medium had gotten so bloated and diseased that it would succumb sooner rather than later to its own idiot inertia. I still believe that today. It is both a conviction and a wish, because to go on in our current mode would be culturally suicidal." Kunstler had no doubt that the slouching beast that is American society would stumble and collapse into the dirt. He likened it to "evolutionary biology, where organisms achieve their largest scale and greatest complexity at the cost of their ability to adapt to changes in their surroundings. They flourish during periods of extraordinary stability and die off when conditions destabilize. The United Parking Lot of America seemed to me to be just this sort of overgrown, overly complex organism." But what exactly would kill the creature? When Kunstler heard about Y2K, he had his answer. "I now see Y2K as the mechanism that will force events to a tipping point much more quickly and surely."
As we saw earlier, Kunstler's Y2K predictions were very specific and very grim. At a minimum, Y2K would cause a crisis similar to the 1973 oil embargo. But that was just the beginning of the possibilities. There could also be "a worldwide deflationary depression," Kunstler wrote. "I will not be surprised if it is as bad in terms of unemployment and hardship as the 1930s." War and dictatorship were distinct possibilities. Kunstler was also clear about the timing. "Y2K will not 'strike' at the midnight hour on 1/1/00," he wrote. "It will unfold fractally as a series of events over the next several years, with accelerating disruptions across the remainder of 1999, a 'spike' of failures around New Year 2000, and a ripple of consequences accumulating, amplifying, and reverberating for months and even years afterward. I expect problems with business and government to be evident by the middle of 1999."
The problems were not evident by the middle of 1999, nor did they appear when 1999 rolled over into 2000. An elevator or two got stuck, but Y2K did not cause a war, a depression, or a recession. It didn't even crash a plane. Just as the failure of a giant sea to cover most of North America demonstrated that Marian Keech's prediction was wrong, the failure of Y2K to cause any notable disaster would seem to be clear disconfirmation of James Howard Kunstler's forecast. But does Kunstler see it that way? In response to an e-mail, Kunstler fired back an 850-word message that can be summarized in one short sentence: I was right.
"What we've been seeing during the past decade might be understood as a kind of 'meta' event, of which the Y2K episode was an early chapter," he began. "Overall, this meta-event has been about systemic socioeconomic collapse as a result of overinvestments in hypercomplexity. . . . We are seeing now, ten years later, a full playing out of these trends in the collapse of banking and capital finance, and consequently in the real economy of ordinary business activity and households." After some more in this vein, Kunstler connected it to the Millennium Bug. "Y2K was, in my opinion, an early apprehension of the dangers of growing hypercomplexity." In the late 1990s, computerization "came on very quickly and it began to dawn on people paying attention that there might be dangerous unintended consequences in turning over control of so many vital activities to complex machines and their algorithms."
But what exactly does this have to do with the rather straightforward fact that what he predicted would happen did not happen? After two paragraphs about Y2K "in the psycho-social realm," Kunstler acknowledged that "as the 'rollover' date approached, I took the position that we were in for a lot of trouble. The trouble, as it turned out, was averted. This is the part of the story usually overlooked by those who mock the Y2K episode. Billions of dollars were spent, and scores of thousands of man-hours were dedicated, to mitigating this problem. . . . There were no 'cascading' failures of the kind that were most feared. Lots of systems did fail, but not a critical mass of the largest and most critical ones. The Y2K incident passed into history as a joke. I don't think it was a joke. I regard it still as a legitimate potential catastrophe that was averted."
There are several striking elements in Kunstler's response. First, he understates his prediction-he called for considerably more than "a lot of trouble"-and overstates the actual damage done by Y2K. This closes some of the gap between the two, easing the cognitive dissonance.
More important is his claim that Y2K was a catastrophe averted by lots of money and hard work. A key fact Kunstler doesn't mention is that throughout 1998 and 1999, there was a steady stream of reports from corporations and governments saying that they were working hard, spending lots of money, and their systems would not fail in the rollover. In his April 1999 essay, Kunstler scoffed at these claims. "A lot of the information released to the public so far has been self-serving and of questionable value-for instance, reports of Y2K readiness released by government agencies whose chief interests might be 1) covering their own asses, and 2) trying to quell public concern that could escalate to panic," he wrote. So Kunstler was either wrong about Y2K or he was wrong about Y2K remediation efforts. In either event, he made a mistake. But looking back, he doesn't see any error.
Another omission in Kunstler's response is considerably more revealing. Anyone even slightly familiar with the Y2K issue knows there is a standard response to the claim that Y2K would have been a catastrophe if vast sums hadn't been spent to fix the problem. It is this: Some corporations and countries-notably Italy, Russia, and South Korea-did not spend vast amounts on Y2K remediation. Their efforts were haphazard at best, and all through 1999 these laggards were criticized for putting themselves and others in jeopardy. But on January 1, 2000, they suffered no worse than anyone else. Korea Telecom did "nothing" about Y2K, noted Cambridge University computer scientist Ross Anderson in a BBC interview. "It took the view that, hey, if it breaks we'll fix it. Then you have British Telecom, which was spending five hundred million pounds on bug-fixing. And they couldn't both be right. They were using the same sort of equipment on the same sort of scale. The fascinating thing that we observed in the end is that the Koreans called it right." Whether or not this is conclusive, it is important evidence and anyone who wants to honestly assess Y2K must deal with it. But Kunstler didn't even acknowledge it. That's willful blindness, which is precisely what a psychologist would expect to see in a highly committed individual suffering the pangs of cognitive dissonance.
Following the failure of his prediction, Kunstler, like Keech before him, became even more convinced of the beliefs that led him to make the prediction. In 2005, he published The Long Emergency, which reads like an expanded version of his Y2K essay, with the one important exception that Y2K itself has vanished. The sole reference comes when Kunstler mocks those who think fears of "peak oil" are "another fantasy brought to us by the same alarmists who said that the Y2K computer bug would bring on the end of the world as we know it." Readers are not told that Kunstler was one of those alarmists.
Whether The Long Emergency is prescient or silly is something that cannot be decided today because the forecasts Kunstler makes extend decades into the future. We can be reasonably sure, however, that no matter what happens, James Howard Kunstler will believe he was right.
ROBERT HEILBRONER.
Anyone who has studied economics, even briefly, has probably come across Robert Heilbroner's The Worldly Philosophers, a study of the lives and ideas of the great economists that has been a staple of economics classes since it was first published in 1953. But Heilbroner, an American economist, was also a prolific social critic with a keen interest in figuring out how the future would unfold, and he laid out his predictions in a series of books and essays spanning decades. One is particularly famous-and particularly useful for present purposes.
Heilbroner wrote An Inquiry into the Human Prospect in 1972 and 1973. The mood was one of "puzzlement and despair," he noted. "There is a question in the air, more sensed than seen, like the invisible approach of a distant storm, a question I would hesitate to ask aloud did I not believe it existed unvoiced in the minds of many: 'Is there hope for man?'" Like a god standing at the crest of Olympus, Heilbroner gazed across history and the globe and rendered his verdict: Not really. "The outlook for man, I believe, is painful, difficult, perhaps desperate, and the hope that can be held out for his future prospects seems to be very slim indeed. Thus, to anticipate the conclusions of our inquiry, the answer as to whether we can conceive of the future other than as a continuation of the darkness, cruelty, and disorder of the past seems to me to be no; and to the question of whether worse impends, yes."
The first of the nightmares afflicting humanity is the population bomb. "The demographic situation of virtually all of Southeast Asia, large portions of Latin America, and parts of Africa portends a grim Malthusian outcome," Heilbroner wrote. And don't look to the Green Revolution for salvation. The technical obstacles to growing more food may be insurmountable, and if they were overcome, it would be a disaster-because it would "enable additional hundreds of millions to reach childbearing age," which would make everything worse. Thus, there are only two possible futures. "One is the descent of large portions of the underdeveloped world into steadily worsening social disorder." There would be starvation, stunted children, declining life expectancies. Nations would descend into near anarchy. The alternative, which Heilbroner thought more likely, is the rise of "iron" governments, "probably of a military-socialist cast," that would hold things together with brute force. These governments would not be so forgiving of the gap in wealth between the world's haves and have-nots, Heilbroner believed, making it likely there would be war between the poor world and the rich. And with nuclear weaponry becoming cheap and easy to develop-even terrorists would have nuclear arsenals-these wars could mean annihilation. But Heilbroner didn't think it would come to that. It was "more plausible" that the dictators of the poor world would engage in nuclear blackmail-threatening to incinerate the rich world if suitably large and regular cash payments were not made.
But that wasn't the worst of it, in Heilbroner's view. A more fundamental threat came from the environment and dwindling resources. Pollution would steadily worsen and demand for oil would soon outstrip supplies. Worse still, the climate would change. Industrial activity generates heat as a by-product, Heilbroner wrote, and so, if industry keeps growing, it will eventually generate so much heat the earth will cook. Industrial activity would have to stop or all human life-perhaps all life-would be consumed. In any event, "one irrefutable conclusion remains. The industrial growth process, so central to the economic and social life of capitalism and Western socialism alike, will be forced to slow down, in all likelihood within a generation or two, and will probably have to give way to decline thereafter."
So is humanity doomed? Not at all, Heilbroner assured his readers. "The human prospect is not a death sentence. It is not an inevitable doomsday to which we are headed, although the risk of enormous catastrophes exists." It's more like "a contingent life sentence-one that will permit the continuance of human society, but only on a basis very different from the present, and probably only after much suffering in the period of transition." Unfortunately, that new basis would be brutal authoritarianism in the United States and other developed countries because only such governments would have the strength to see us through the dark days ahead. These new regimes would feature a government that "blends a 'religious' orientation with a 'military' discipline," Heilbroner wrote. The closest thing to it was China under Mao Zedong. "Such a monastic organization of society may be repugnant to us," Heilbroner wrote with some understatement, "but I suspect it offers the greatest promise of making those enormous transformations needed to reach a new stable socio-economic basis."
So that was the human prospect: Maoist China or extinction.
Even by the standards of the early 1970s, this was grim stuff from a leading intellectual, and it got a lot of attention. But what makes An Inquiry into the Human Prospect stand out today is that the book was republished in 1980 and again in 1991, and both times Heilbroner added commentaries after each that looked at what he had written in light of what had happened as the years passed.
In his first retrospective, written in 1980, Heilbroner begins with what appears to be an admission that he was a little off base. "I am acutely aware that things do not look quite the same today as they did when I wrote this opening chapter," he began. The "atmosphere of siege" has lifted. "The rumblings of a civilizational malaise" are not heard so much. But this doesn't mean Heilbroner was wrong. It means that people are increasingly deluded about humanity's prospects-which are as awful as ever. True, Heilbroner acknowledges, the population problem has changed dramatically thanks to surprising downturns in fertility rates all over the world, but "if the cancer is now spreading less rapidly, it is still spreading." There have been "several serious famines since The Human Prospect first appeared," he notes, and more are sure to come. And while the poor nations have not yet produced authoritarian governments bent on war with the rich world, well, just wait. Nuclear weapons are even cheaper now. And poor nations with oil ("a fast-disappearing resource") could choke the life out of rich countries-although rich countries like the United States and Canada could retaliate with "food power" because "each year the Asian and African countries" import more food to keep their populations from starving.
As for the environmental predicament, Heilbroner felt in 1980 that he was just as right as he had been in 1973. "I see little need to alter the thrust of my original argument, even though, since the first edition appeared, the attention of scientists has been directed at the climate problem from a somewhat different perspective than its long-term heating-up from the release of combustion energy," he wrote. "The emphasis today is on a short-term effect that results from the release of carbon dioxide into the atmosphere as a by-product of combustion. There, the CO2 forms an invisible 'pane' of gas that acts like the glass in a greenhouse, trapping the reflected rays of the sun, and warming the atmosphere just like the air in a greenhouse." Note what Heilbroner has done here: In 1973, he predicted that the heat produced as a by-product of industrial activity would warm the atmosphere, putting the very existence of life in jeopardy. That was one of many climate-change hypotheses making the rounds at the time, but by 1980, most scientists had decided it was wrong. And yet, Heilbroner didn't conclude that his prediction had been even a little wrong. Instead, he acknowledged "a somewhat different perspective"-as he describes the rise of a completely different theory that happens to lead to a similar conclusion. It's as if someone predicted Muslim insurrections would cause the collapse of the Soviet Union and then boasted when the Soviet Union collapsed for reasons that had absolutely nothing to do with Muslims.
Heilbroner's 1991 commentaries are even more interesting because, by then, major changes had happened. Resource prices had fallen and the world was awash in cheap oil. The Green Revolution had produced huge increases in food in the developing world. Equally significant was what had not happened in the almost two decades since An Inquiry into the Human Prospect was first published. There was no war between the poor world and the rich. Nuclear weapons had not proliferated. Authoritarianism had not flourished. In fact, it was clear that freedom, not authoritarianism, was on the rise. In 1973, 46 percent of countries in the world were rated "not free" by Freedom House, the internationally respected NGO; 25 percent were "partly free"; 29 percent were rated "free." By 1980, when Heilbroner wrote his first retrospective, 35 percent were not free, 33 percent were partly free, and 32 percent were free. By 1991, the Iron Curtain had been raised and the percentage of "not free" countries had fallen by half to 23 percent; 35 percent were partly free; free countries were now the largest category at 42 percent. History had delivered exactly the opposite of what Heilbroner had expected.
Still, Heilbroner saw nothing significantly wrong with his 1973 forecast. "Specific predictions, estimates, and measurements have all changed, but only marginally. The basic assessment remains. With them also remains the demanding, uncomfortable, despairing-but not defeatist-prospect for humanity." How could Heilbroner draw that conclusion in 1991? By writing nothing about oil and resources, food supply, the fate of freedom, or other subjects that were critical to the original forecast. Thus he was spared unpleasant encounters with facts that had changed in ways that contradicted his beliefs.
Heilbroner did note the fall of the Soviet bloc, however. He had to. It was 1991, the collapse of the East Bloc had already happened, and the final crumbling of the USSR was taking place as he wrote. This was a problem, because Heilbroner had argued that both capitalist and socialist systems would be strained by environmental pressures, and the latter had a better chance of coping. "That conclusion has been dramatically proved wrong," he wrote, at least "for the form of socialism known as communism." This was admirably forthright, but Heilbroner quickly added that it didn't really change anything. "The great challenge affecting all socioeconomic orders in the twenty-first century remains the approach of ecological danger," he wrote. "Twenty years have not affected that outlook one iota."
Of course, he may still be right about that last statement. But it's hard to escape the conclusion that for Robert Heilbroner the human prospect was grim, and it would remain so no matter what happened.
LORD WILLIAM REES-MOGG.
"We got some things wrong but we got enough right to justify writing them," Lord William Rees-Mogg, the former editor of The Times of London, says about the three thick volumes of prognostication he and James Dale Davidson, an American investment adviser, published between 1987 and 1997. This is British understatement. Looking back in 2009, Rees-Mogg thinks he and Davidson nailed pretty much all the important trends and events of our time. "We got basically the decline and fall of the Soviet Union right. We saw, though we got the timing wrong, the weaknesses which led to the crisis of the economic system. We even got the attack on the Twin Towers right, in that we pointed to the vulnerability of the Twin Towers as symbolic of the vulnerability of modern society." When I ask what he got wrong, Rees-Mogg acknowledges being off on timing occasionally. "We were much too early" in calling the crash of 2008, he says. And the high-tech crash of 2000: "I became very bearish about the American economy, and nervous about it, as early as the mid-nineties."
Yes, the timing. That's always the tricky part. But Rees-Mogg's view of what can and cannot be predicted is modest, he says. "The point of this kind of forecasting is to explore trends that people need to look out for. You obviously can't both foresee the future and foresee the timing of the future. You sometimes get it right. You sometimes get it wrong. But you can help people to understand the process in which they're involved, in which the world is involved, at this time." And in that sense, Rees-Mogg and Davidson did a fine job. "I think that our general view of the world fits reasonably well with what happened."
I'm sure he does believe that, but I suspect few others would agree.
The basic vision of Rees-Mogg and Davidson's first book of predictions, written in 1986, is summed up nicely in the title: Blood in the Streets: Investment Profits in a World Gone Mad. "The best time to buy is when blood is running in the streets," the authors quote Nathan M. Rothschild as saying. Rees-Mogg and Davidson saw lots of blood and buying ahead.
Fundamental trends, particularly technological trends, are driving events, they argue. The world is rapidly approaching "an electronic feudalism, an environment in which cheap and effective high-tech weapons will give increasing numbers of disgruntled groups a veto over almost any activity they do not like." When every thug and terrorist can buy cheap missiles, the power of national military forces will be dramatically weakened. The United States will suffer worst. "The sun is setting now on the American Empire, as it once set on the British Empire. As it does, shadows fall on formerly safe streets everywhere. The political equivalent of youth gangs, petty local leaders, are reaching for their guns. They will make the rules now. Rules that are enemies of progress. Rules that are a way of saying 'nothing may pass this way without my say-so.' Rules that tax or inhibit trade. Rules that usurp and confiscate investment-the way street bullies take whatever they can get away with."
International trade will give way to protectionism, real estate values will plunge, and stock markets will crash. Debt default will sweep the world. Nations will splinter and crumble. How bad will it get? There are obvious parallels with the Great Depression, the authors wrote, but it's much worse than that. "We suspect that the forms of the nation-state would remain, as in Lebanon, as indeed, the form of the old Roman Empire was preserved, like an unburied mummy, throughout the Middle Ages. We could be slowly entering a period as violent and murky as the feudalism of old." And the transformation of the world is already well under way. "Economies, even in Europe and the United States, have begun falling into their foundations, like old houses with rotten beams."
In 1991, Rees-Mogg and Davidson published The Great Reckoning: How the World Will Change in the Depression of the 1990s. Although their earlier predictions had been "considered improbable or even ridiculous," the authors wrote, subsequent events had proved them right. Stock markets plunged in October 1987, mere months after the publication of Blood in the Streets. Communism collapsed in the Soviet Union and Eastern Europe and was on the verge of death in China, just as they had predicted. The "multi-ethnic empire" the Soviets had inherited from Czarist Russia was crumbling right on schedule. And so much else was bang on. "We predicted the increase in protectionism, intensified terrorism, and more." True, humanity hadn't plunged into a global depression, the authors acknowledged. But just wait. The signs were everywhere.
Having established their prophetic prowess, Rees-Mogg and Davidson repeated and expanded the gloomy prognosis of Blood in the Streets. "We expect the 1990s to be a decade of escalating economic and political disorder unparalleled since the 1930s." Economies around the world will experience a "deflationary collapse" and the United States will not be able to bail anyone out because the American economy will itself be crippled. "The simultaneous decline in the relative power of both the United States and the Soviet Union is bound to be fundamentally destabilizing, economically as well as politically." At a time when Third World dictators are getting nuclear weapons and using terrorists to "deliver them by overnight express," the world will be without a leader. "Smaller and smaller groups will gain military effectiveness. Violence will tend to reassert itself as the common condition of life." Disintegration will sweep the globe. "We will see the breakup not just of the Soviet Union . . . but Canada, China, Yugoslavia, Ethiopia, and other countries." East Germany will become democratic in the newly unified Germany but "much of Eastern Europe, like the Soviet Union itself, has a future much like the past of Latin America"-meaning rule by uniformed thugs. Eventually, in the twenty-first century, international order may be restored by the rise of a new global policeman. But it won't be the United States. "If there is to be a new hegemonic power in the world, it is likely to be Japan."
And the same forces of disintegration will be felt within the United States, Rees-Mogg and Davidson warn. Gangs and terrorists will become increasingly powerful as high-tech weapons proliferate. Suburbs will decay and grow dangerous while inner cities turn into war zones. Los Angeles, Chicago, Houston: "Practically every city with a large underclass is at risk." New York will suffer worst. "By the year 2000, New York could be a Gotham City without Batman."
In 1997, Rees-Mogg and Davidson returned with The Sovereign Individual: Mastering the Transition to the Information Age. After the usual list of predictions the earlier books got exactly right, Rees-Mogg and Davidson repeat their familiar themes and forecasts, along with a few new twists. Y2K will be huge. And cyberterrorism looms. But the biggest threat is the millennium itself, the authors say. They see cycles in history, and the year 2000 marks the end of the biggest. "More than 85 years after the day in 1911 when Oswald Spengler was seized with an intuition of a coming war and 'the decline of the West,' we, too, see 'a historical change of phase occurring . . . at the point preordained for it hundreds of years ago.' Like Spengler, we see the impending death of Western civilization, and with it the collapse of the world order that had predominated these past five centuries, ever since Columbus sailed west to open contact with the New World. Yet unlike Spengler we see the birth of a new stage in Western civilization in the coming millennium."
The year 2000 did see the death of many dot-com stocks. But Western civilization? That's a stretch, as is Rees-Mogg's positive assessment of these forecasts.
In the twenty years that followed Blood in the Streets, protectionism did not dominate. Globalization did. Trade barriers fell to historic lows and the economies of many developing countries-notably China, India, and Brazil-grew faster than ever. The Soviet Union and Yugoslavia did break up-Yugoslavia was already falling apart when Rees-Mogg and Davidson put it on the endangered list-but disintegration did not become a global trend. In fact, the European Union expanded rapidly into Eastern Europe, bringing the continent a degree of unity and order it had not enjoyed since the glory days of the Roman Empire.
As for the United States, the general trend was up. The latter half of the 1990s, in particular, saw a roaring economy, soaring stocks, and swelling federal government surpluses. With the greatest of ease, the United States bailed out Mexico and other foreign nations that suffered economic crises. And far from turning into urban wastelands, American cities enjoyed a renaissance; New York, in particular, was transformed for the better. At the same time, American geopolitical power grew so dramatically that, by the mid-1990s, a French minister famously complained that the United States had moved beyond superpowerdom and become a globe-dominating "hyperpower." International security improved as well. Instead of the rising bloodshed Rees-Mogg and Davidson predicted, the long-term rise in democracy continued, interstate wars became increasingly rare, civil wars became less common, and all wars became less deadly. There was also a steep decline in the number of genocides and ma.s.s killings.
The United States and the world took a big hit in 2008, of course. But even in the worst days of that crisis, neither America nor the world looked like the dystopian horror forecast by Rees-Mogg and Davidson. More crucially, Blood in the Streets and the other books made it clear that the disaster they predicted was imminent, not decades off.
As for terrorism, it continued to plague humanity in the years after Rees-Mogg and Davidson published their books, as it always has. But it did not escalate in frequency: The annual number of international terrorist attacks, which had been rising since the 1960s, peaked in 1991 at 450 incidents; it then fell steadily and substantially until 2000, when there were 100 incidents. Nor did terrorism go nuclear or high tech, as Rees-Mogg and Davidson had predicted. The 9/11 attacks were carried out by nineteen men armed with nothing more sophisticated than box cutters and plane tickets.
Rees-Mogg's claim to have predicted the breakup of the Soviet Union is only slightly less tenuous. The pair's whole argument was that technological change was empowering small groups and individuals and that this would fracture nations everywhere, the Soviet Union being only one. As their larger point is clearly wrong, it's not unreasonable to think it was only luck that they called the fall of the Soviet Union correctly. After all, a few hits are to be expected when you publish three thick books stuffed with predictions-a simple truth demonstrated nicely by British astrologers whose collective forecast of what would happen in the 1980s was wrong about almost everything except the assassination of Egyptian president Anwar Sadat.
But even if Rees-Mogg and Davidson are credited with predicting the collapse of the USSR, would an impartial observer agree with Rees-Mogg that his books' "general view of the world fits reasonably well with what happened"? I doubt it. And yet I don't doubt that Rees-Mogg is sincere in his assessment because he very much wants it to be true, and a yearning mind is a creative mind. Unmistakable failures are forgotten. Other memories are altered subtly, so they appear less wrong or almost right. Rationalization and confirmation bias multiply the hits. And time frames are stretched and stretched until finally something that looks even slightly like the prediction appears and the desperate prophet is able to shrug and say, "I was only off on timing."
That's how cognitive dissonance works.
PAUL EHRLICH.
The sentence by which The Population Bomb should be judged is not the famous introduction: "The battle to feed all of humanity is over." Nor is it the frightening elaboration in the second sentence: "In the 1970s the world will undergo famines-hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now." No, the sentence that makes all the difference is the third sentence of the book: "At this late date nothing can prevent a substantial increase in the world death rate, although many lives could be saved through dramatic programs to 'stretch' the carrying capacity of the earth by increasing food production."
Paul Ehrlich's analysis of the world situation in the late 1960s and early 1970s boiled down to something very simple: The birthrate is far higher than the death rate, and as a result the population is soaring. This often happens in nature. Rabbits, mice, deer, and lots of other species multiply rapidly when the environment allows them to. But it always ends badly. The population grows and grows until it exceeds the capacity of the environment to sustain it. Then it is cut down by starvation, disease, and predators. The swelling human population has reached that horrible moment, Ehrlich believed. People will starve. Disease will flourish. Instability and war will grow. Combined, this will produce "a substantial increase in the world death rate."
Of course, there is an alternative to this sort of tragedy. A "birthrate solution," as Ehrlich called it, would see the birthrate lowered to equal the death rate and stop the population from growing. But by 1968, with more than three billion people on the planet, it was too late. The population is already so big, Ehrlich argued, and the margin between global food demand and global food production is so thin, that not even swift and determined action on the birthrate will stop the calamity from striking in the 1970s. It is, as Ehrlich wrote a year later, "the most dramatic crisis Homo sapiens has ever faced."
This means checking on the accuracy of Ehrlich's forecast is remarkably easy. What happened to the death rate? As for the time frame, that's clear too. Ehrlich mentioned "the 1970s" over and over. He also wrote that "the next nine years will probably tell the story." At one point in The Population Bomb, he did, however, allow that the catastrophe could be delayed a little, into the 1980s. And in later editions of The Population Bomb, he changed the second sentence of the book to say that there would be famines in "the 1970s and 1980s." A generous observer might say those two decades are the time frame of his prediction.
So what happened? According to the definitive source-the United Nations' Demographic Yearbook-the world death rate in the period 1965 to 1974 was thirteen per thousand population. In the period 1980 to 1985, it was eleven. In 1985 to 1990, it was ten. So there was no "substantial increase in the world death rate," as Ehrlich predicted. There was instead a substantial decrease. That trend continued, incidentally. In the period 2005 to 2010, the world death rate was nine.
Paul Ehrlich made a precise, measurable prediction. And it failed. In fact, what happened was the opposite of what he said would happen. Does Ehrlich acknowledge this? Not in the least. In two lengthy interviews, Ehrlich admitted making not a single major error in the popular works he published in the late 1960s and early 1970s.
To do this, Ehrlich has to ignore death rates. Not once in our interviews did he mention them, nor do they appear in the many other published interviews I've found. There is, however, a brief reference to death rates in a retrospective he and his wife, Anne, published in 2009: "We are often asked what happened to the famines The Bomb predicted, as if the last four decades were a period of abundant food for all," the Ehrlichs wrote. "But, of course, there were famines, essentially continuously in parts of Africa. Perhaps 300 million people overall have died of hunger and hunger-related diseases since 1968. But the famines were smaller than our reading of the agricultural literature at the time led us to anticipate. What happened? The central factor, of course, was the medium-term success of the 'Green Revolution' in expanding food production at a rate beyond what many, if not most, agricultural experts believed likely. As a result, there wasn't a general rise in the death rate from hunger-although there have been periodic regional rises in South Asia and Africa, and the world may now be on the brink of another major crisis." Forced to confront the blatant gap between his prediction and what happened, Ehrlich downplays what he actually wrote, fudges the evidence, and erases the time frame-suggesting he is about to be proved right any day now.
I asked Ehrlich to summarize his predictive record for me. He responded: "The main things, of course, that I predicted was that there was going to be continued hunger, that we would not solve the hunger problem in the foreseeable future, and since there are more hungry people now than when I made the prediction, that seems like a valid one. I predicted that if we were to keep putting crap in the atmosphere, we were going to get climate change, although at that time we didn't know if there would be heating or cooling. . . . I was too pessimistic about the speed with which the Green Revolution would spread. It did reduce the large-scale famines and more or less transform the whole thing into hunger spread around the world rather than famines as concentrated as I would have expected. I shouldn't say 'as I would have expected' [but] 'as the agricultural scientists I talked to expected' because it's not my area of expertise. I totally missed the problems with the destruction of the tropical forests. I did predict that there were going to be novel diseases, and of course we've had AIDS since, so that was an accurate prediction." Notice that the only flat-out mistake Ehrlich acknowledges is missing the destruction of the rain forests, which happens to be a point that supports and strengthens his worldview-and is therefore, in cognitive dissonance terms, not a mistake at all. Beyond that, he was, by his account, off a little here and there, but only because the information he got from others was wrong. Basically, he was right across the board.
The specific claims Ehrlich makes in his summary are all dubious, as careful examination reveals. To take just one example, Ehrlich did not predict "that there was going to be continued hunger." He predicted there would be massive famines over and above existing levels of hunger, which is why the death rate would rise. So while it may seem impressive that The Population Bomb predicted "hundreds of millions" would starve to death and that "300 million people" have died of hunger since 1968, the comparison is very misleading. For one thing, the book's time frame was ten or twenty years, not forty. For another, Ehrlich upped the predicted death toll to "a billion or more" in the 1974 book The End of Affluence. But most importantly, the "300 million" figure doesn't show an increase in hunger and starvation. It only shows that hunger and starvation weren't eliminated.
The more telling problem with Ehrlich's summary of his record is what's not there. There's the end of affluence, for one. It was a key theme in all his writing. He even published a book with that title. "In our opinion, the last decades of the twentieth century will initiate a worldwide age of scarcity," Ehrlich wrote in 1974. "There will be no more cheap, abundant energy, no more cheap, abundant food, and soon the flow of cheap consumer goods will suffer increasing disruption and rising prices." It would be the most difficult period ever faced by industrial society, Ehrlich predicted. Society might crash.
Another gap in his summary is the fate of foreign countries, including the supposedly inevitable collapse of India, the decline of the United Kingdom into poverty and hunger, and many other dire forecasts. "We are facing, within the next three decades, the disintegration of an unstable world of nation-states infected with growthmania," he wrote in The End of Affluence. Again, he made no mention of any of this. When I asked Ehrlich if he recalled any predictions he had made about particular foreign countries, he said he could not.
Like the experts in Tetlock's experiment who were sure they had forecast the fall of the Soviet Union, hindsight bias and cognitive dissonance-and the distortion, rationalization, and forgetting they produce-have led Paul Ehrlich to the happy conclusion that his predictions were essentially correct. And it is very unlikely anything could convince him otherwise. The famous Simon-Ehrlich bet is proof of that.
In the 1970s, Julian Simon, who died in 1998, was a business professor at the University of Maryland who became tired of the constant gloomy talk from the likes of the Club of Rome and Paul Ehrlich. People are endlessly inventive, Simon argued. If we start to run out of one thing, we'll find more efficient methods of using it or substitute something else. And since human ingenuity is the ultimate source of wealth, the more people there are, the better off we'll be. Ten billion? No problem. Twenty billion? All the better. Simon's beliefs may have been radically unlike Ehrlich's, but he shared with Ehrlich a fondness for sweeping, confident statements about the future. As Ehrlich's fame grew, Julian Simon became the Anti-Ehrlich.
In 1980, Simon attacked the doomsters in the august pages of Science. The journal was deluged with angry letters. Ehrlich was especially livid. He and Simon exchanged furious claims and counterclaims, with each man accusing the other of ignoring his arguments and distorting the evidence. On and on it went until, finally, there came the bet.
Simon believed that human ingenuity would steadily drive the price of resources downward, contrary to the growing scarcity and rising prices expected by Malthusians like Ehrlich. So he offered to bet anyone that the price of any resource over any time would fall. Ehrlich announced he would "accept Simon's astonishing offer before other greedy people jump in." Together with John Holdren and two other scientists, Ehrlich agreed that he and Simon would buy a thousand dollars' worth of five key metals-copper, tin, chrome, nickel, and tungsten-in 1980. They would be sold in 1990. If the sale price, in 1980 dollars, were higher, Ehrlich would win and reap the profit. If the proceeds were lower, Simon would win and Ehrlich would pay him the difference.
Ten years passed, and the metals were worth considerably less in 1990 than they had been in 1980. Simon won. Ehrlich sent a check to his rival.
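The settlement arithmetic of the bet is simple enough to sketch in a few lines. The prices below are invented purely for illustration (the historical 1980 and 1990 figures differ); the point is the mechanism: buy $200 of each metal, sell ten years later, deflate the proceeds to start-year dollars, and pay out the difference.

```python
# Sketch of the Simon-Ehrlich bet settlement. All prices here are
# hypothetical illustrations, NOT the historical market figures.

def settle_bet(start_prices, end_prices, cpi_ratio, stake_per_metal=200.0):
    """Return Ehrlich's payoff: positive if he wins, negative if he pays Simon.

    start_prices / end_prices: nominal dollars per unit for each metal.
    cpi_ratio: end-year price level divided by start-year price level,
    used to express the sale proceeds in start-year dollars.
    """
    payoff = 0.0
    for metal in start_prices:
        units = stake_per_metal / start_prices[metal]   # quantity bought at the start
        sale_nominal = units * end_prices[metal]        # nominal proceeds at the end
        sale_real = sale_nominal / cpi_ratio            # deflated to start-year dollars
        payoff += sale_real - stake_per_metal           # real gain or loss on this metal
    return payoff

# Hypothetical numbers in which every metal gets cheaper in real terms:
start = {"copper": 1.0, "tin": 8.0, "chrome": 3.0, "nickel": 3.5, "tungsten": 9.0}
end   = {"copper": 1.2, "tin": 6.0, "chrome": 3.3, "nickel": 4.0, "tungsten": 8.0}

result = settle_bet(start, end, cpi_ratio=1.5)
print(round(result, 2))  # negative: Ehrlich owes Simon the difference
```

Note that the deflation step is what made the real bet interesting: nominal metal prices could rise and Simon could still win, so long as they rose more slowly than inflation.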
Ehrlich insists to this day that the bet proved nothing. There was a recession "in the first half" of the 1980s that "slowed the growth of demand for industrial metals worldwide," he wrote. "Ironically, a prominent reason for the slower industrial growth was the doubling of world oil prices in 1979. Indeed, the price of oil probably was a factor in the prices of metals in both years, being unprecedentedly high in 1980 and unprecedentedly low in 1990." So Simon wasn't right. He was lucky. Whether there's something to this or not-I don't find it convincing-what's more interesting about this defense is that, in the 1970s, Ehrlich stated repeatedly that oil scarcity had arrived and the price of oil would only go up. So his rationale for dismissing the bet is based on something he predicted would not happen. It's also interesting that in the decade after the bet, oil stayed cheap, but the price of metals, and most other commodities, continued to fall. If Simon and Ehrlich had renewed their bet in 1990, Ehrlich would have lost again.
In 1995, Julian Simon issued another challenge, as he often did. "Any trend pertaining to material human welfare" will improve in future, Simon claimed, and he'd put money on it. Ehrlich and Stephen Schneider, a climatologist at Stanford University, said they'd take Simon's bet based on fifteen indicators, including emissions of sulfur dioxide in Asia, the amount of fertile cropland per person in the world, carbon dioxide levels in the atmosphere, the harvest per person from oceanic fisheries, and so on. Simon refused. He said he'd accept a bet only on direct measures of human welfare, such as life expectancy and purchasing power. It was another demonstration that Simon, like Ehrlich, sometimes used extreme rhetoric that could be embarrassing if someone called him on it.
Another way the two men resembled each other is that they were both bad at predicting the future. In the two editions of Simon's book The Ultimate Resource, published in 1981 and 1996, he wrote the following: "This is a public offer to stake $10,000, in separate transactions of $1,000 or $100 each, on my belief that mineral resources (or food or other commodities) will not rise in price in future years, adjusted for inflation. You choose any mineral or other raw material (including grain and fossil fuels) that is not government controlled, and the date of settlement." This was extremely bold language. Simon wasn't merely saying that, in general, over long periods of time, resources would tend to get cheaper. He said any resource, over any time period. Still, it held up in the 1980s, and again in the 1990s. But in the first decade of the twenty-first century, the price of oil, food, most metals, and other commodities broke with the trend of the previous twenty years and soared upward. That couldn't happen, according to Simon. But it did.
THE FANS.
A final way in which Paul Ehrlich and Julian Simon resemble each other is the devotion of their many admirers. Remember that the expert who makes a prediction isn't the only one committed to it. Particularly when predictions involve big issues, they can become a fundamental part of a person's understanding of reality. They can even inspire life-changing actions. In the 1970s, some Americans who believed Paul Ehrlich's predictions had only one child, or none, as he advised. That's a big commitment based on a strong belief, and one does not simply shrug and walk away from a conviction like that. "Suppose an individual believes something with his whole heart," Leon Festinger wrote a decade before Paul Ehrlich published The Population Bomb. "Suppose further that he has a commitment to this belief, that he has taken irrevocable actions because of it; finally, suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong; what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before."
"Ehrlich has in fact been right about all of his predictions, except in terms of where and when they would occur," wrote one blogger in 2009. Comments like this litter the Internet. Like Ehrlich himself, his fans minimize, evade, and rationalize, avoiding the obvious evidence that Ehrlich's predictions failed. A common dodge is that the alarm raised by The Population Bomb spurred people into action and it was those actions that forestalled the predicted disasters. "If Mr. Ehrlich hadn't written The Population Bomb in 1968," a letter to a Las Vegas newspaper argued in 2010, "would the technology to increase crop yields ever have been developed? Would millions of people all over the world have made the decision to limit the size of their families?" A few minutes with Google would have revealed to Ehrlich's defender that the Green Revolution began with the work of Norman Borlaug and colleagues in 1944, that it was delivering huge increases in agricultural yields years before The Population Bomb was published, and that Ehrlich repeatedly scorned those who claimed the Green Revolution or any other technological fix could forestall the coming famines. He would also have discovered that fertility rates in the developed world had, with the exception of the postwar baby boom, been on a downward trend since the late nineteenth century, while even more dramatic declines in fertility rates happened in a long list of developing countries that have never heard of Paul Ehrlich or his book. But he didn't spend those few minutes with Google because he already had what he wanted-a rationalization that soothes cognitive dissonance the way aspirin quiets a headache.
The determination of fans to deny that their idols make mistakes can sometimes even exceed that of the idols themselves. After James Howard Kunstler sent me his assessment of his Y2K forecast, he posted it to his Web site and was praised by his admirers. "Bravo on your Y2K piece," wrote one. "If anything, you've understated your case," wrote another fan, in what must be the first recorded instance of someone accusing James Howard Kunstler of understatement.
Sometimes it seems absolutely nothing can dismay believers. It may have been more than ninety years since Oswald Spengler wrote The Decline of the West, and the last sixty of those years may bear no resemblance to the future forecast by the glum German, but still, on Wikipedia, one finds the earnest claim that "no significant prediction of Spengler's in The Decline of the West has yet been refuted by events." An assertion that cannot be falsified by any conceivable evidence is nothing more than dogma. It can't be debated. It can't be proven or disproven. It's just something people choose to believe or not for reasons that have nothing to do with fact and logic. And dogma is what predictions become when experts and their followers go to ridiculous lengths to dismiss clear evidence that they failed.
A few years ago, I was standing near Red Square, in Moscow, on the day that used to mark the October Revolution, and I watched as a group of sullen old Communists milled about, taking swigs from brown paper bags. Now and then, someone would shake a bony fist at bored police officers and shout a slogan of some sort. "What are they saying?" I asked my translator. "'The victory of socialism is assured,'" she said.
Right, I thought. Marx was only off on timing. Keep the faith, comrades.
8.
The End Is Nigh.
The Commanding General is well aware the forecasts are no good. However, he needs them for planning purposes.
-KENNETH ARROW, NOBEL LAUREATE ECONOMIST, RECALLING THE RESPONSE HE AND COLLEAGUES RECEIVED DURING THE SECOND WORLD WAR WHEN THEY DEMONSTRATED THAT THE MILITARY'S LONG-TERM WEATHER FORECASTS WERE USELESS

In 1906, when my grandfather was born in that English village, the people around him may have thought he was a lucky baby with a golden future ahead. But it's tempting to think we would have known better.
After all, by 1906, Britain did not dominate all competitors as it had thirty years before. Many nations were rapidly industrializing, including the United States, and Germany was not only challenging Britain for the economic and scientific lead, it threatened to dominate Europe politically and militarily. Even the German navy was growing rapidly, forcing Britain to engage in an expensive shipbuilding race if it hoped to maintain the supremacy of the Royal Navy on which Britain and its far-flung empire depended. There may have been many experts who said there would never be a war between the Great Powers, but there were some who were sure war was coming. And others feared that if there were a war, new technologies-explosives, machine guns, submarines, airplanes-would make it more terrible than any in the past.
Couldn't an intelligent person have put the pieces together and seen the disaster ahead? And if the scope and horror of the First World War had been predicted, surely it would have been obvious that Europe would wane and the two powers on the periphery of the Western world-the United States and Russia-would come to dominate the twentieth century? Some think so. "If the forecasting on technology had been combined with the forecasting on geopolitics, the shattering of Europe might well have been predicted," wrote the political scientist George Friedman in his 2009 book The Next 100 Years. "So, standing at the beginning of the twentieth century, it would have been possible to forecast its general outlines, with discipline and some luck."
And if that was possible at the beginning of the twentieth century, is it not possible to do the same at the beginning of the twenty-first? Friedman insists it is. His book is an attempt at just that.
But what Friedman doesn't tell his readers is that for every one correct prediction at the end of the nineteenth century and beginning of the twentieth (explosives and machine guns will make war deadlier, for example) there were countless others that were dead wrong (the machine gun, the airplane, radio, and the submarine were all said to make war impossible). Without benefit of hindsight, how does one separate the few valuable nuggets from the mountain of fool's gold? Friedman does not say. And assuming it could be done, by what craft are the nuggets forged into a map of the coming century? Friedman is vague on that too. And did anyone actually do what Friedman confidently asserts could have been done? He mentions no one. That may be because, as far as I can determine, no one did. Brilliant people like John Bates Clark tried. But read today, their efforts are downright funny. The clock's still ticking on F. E. Smith's century-long look forward in The World in 2030, but I'm reasonably confident that in 2030, India will not be a loyal daughter of the British Empire, as forecast, and the only people citing Smith's book will be historians interested in the Britain of 1930 and critics of future babble like me. You may call that a prediction if you wish.
But no matter how many times the best and brightest fail, more try. And they never doubt they are right. In his best-selling 2009 book Why Your World Is About to Get a Whole Lot Smaller, economist Jeff Rubin expresses serene confidence that the price of oil is only going up and he tells us what this will mean for the world and how we live-all without the slightest concern for the fact that very smart people have tried and failed to predict the future of oil prices for more than a century. The French intellectual Jacques Attali is a more egregious case. In 2009, almost twenty years after Attali published Millennium-and long after it was clear that essentially nothing predicted in Millennium would come to pass-he published A Brief History of the Future: A Brave and Controversial Look at the Twenty-first Century. George Friedman, too, is undaunted by the failure of past forecasts. Among them is his 1992 book The Coming War with Japan, which concluded that Japan would "once more seek to become an empire of its own, dominating the western Pacific and eastern Asia," and this would lead either to "a long, miserable cold war" with the United States or to a rematch of the Second World War.
Experts who cannot learn from their own stumbles are beyond help, but it is possible for the rest of us-experts and laypeople alike-to adopt a considered, reasonable skepticism about predictions. Pundits "forecast not because they know," wrote economist John Kenneth Galbraith, "but because they are asked." We could stop asking. But if we insist on asking-and we probably will, unfortunately-we must at least think carefully about what we are told. Typically, when an expert presents a prediction in the form of a confident, well-told story that resonates with our values and our sense of how the world works, we don't think to ask whether the expert's past predictions give us any reason to think he's right this time. His story resonates, and that's good enough. The prediction is praised as "plausible" and "compelling." It garners readers, audiences, and sales. The expert who made it is invited here and there to repeat it, and he or she is asked to provide other plausible and compelling insights. Time passes, and, eventually, the accuracy of the prediction is revealed. If it's a miss, it is quietly forgotten and never discussed again. If it's a hit, it is celebrated-even if a little thought would reveal the alleged hit is very likely nothing more than a coincidence. Take H. G. Wells's 1933 prediction of a Second World War, which I mentioned at the beginning of the book. It's often said Wells was startlingly accurate; he was even right when he foresaw the war starting in a dispute between Germany and Poland over the German city of Danzig (which had been cut off from the rest of Germany when Poland was created after the First World War). So here is the father of futurology nailing the biggest event of the twentieth century-and demonstrating that grand-scale prediction is possible. But I'm afraid this is credulity on stilts.
For one thing, the Nazis and other German nationalists were fiercely militaristic and they routinely called for a violent revision of the international order created at the end of the First World War. They were particularly incensed that Danzig had been cut off from Germany. The danger was obvious to anyone who read a newspaper, and in Britain, as historian Richard Overy observed, "guessing when war would come became a morbid parlor-game of the 1930s." Wells was also far from alone in picking 1940 as the start date. So did John Maynard Keynes. So, too, did an English preacher who thought Adolf Hitler was the Antichrist and the war would mark the beginning of the events prophesied in the Book of Revelation. And remember that Wells's prediction is found in The Shape of Things to Come-a book whose depiction of the course and conclusion of the war does not resemble what happened and which goes on to describe a whole century of future history that looks nothing like what unfolded. Thus Wells's dazzling prediction was, in fact, a banal observation combined with a lucky call on timing.
And yet, I must admit, skepticism about prediction is not easy to accept. One big reason is that it feels wrong, thanks to hindsight bias. Remember: Once we know an outcome, we judge the likelihood of that outcome to be higher than we would if we did not know it. We even adjust our memories to suit this bias, and, as Philip Tetlock showed, hindsight bias can transform a judgment of "this is very unlikely to happen" into a memory of "I said it was very likely to happen." This distortion has an especially unfortunate consequence: By giving us the sense that we could have predicted what is now the present, or even that we actually did predict it when we did not, it strongly suggests that we can predict the future. This is an illusion, and yet it seems only logical-which makes it a particularly persuasive illusion.
Hindsight bias is also responsible for an unfortunate misperception that can be heard almost any time someone frets about the future. "Things are uncertain," someone says. "Not like it was in the past." The first part of that statement is accurate; the second is not. The future is always uncertain, whether it is the future we face right now or the future people faced a century ago. But when we look back, hindsight bias causes us to see much less uncertainty than we do today. We remember the 1990s, for example, as the decade of soaring stocks and not much else. It was, as it's often called, a "holiday from history." We don't think of the 1990s as the decade in which we worried that Japan would swallow the United States, that there would be massive casualties in the Gulf War of 1991, that nuclear weapons from the crumbling Soviet Union would fall into the hands of terrorists, that the American government would be bankrupted, that the Asian meltdown of 1997 would lead to a global depression, or that Y2K would cripple infrastructure and leave us eating baked beans from the can in a candlelit basement. None of those things happened and we don't remember ourselves worrying so much that they would. In fact, lots of us did worry-the record is clear. But because of hindsight bias, we don't remember the uncertainty to the extent that we felt it at the time, and so we think that the future we face now is much more uncertain than the future we faced in the past.
The profound perceptual distortion caused by hindsight bias can make almost anyone nostalgic for the good old days. At the end of his life, my grandfather looked ahead at the world I would live in and he saw nothing but uncertainty and darkness. So many things could go so wrong. He worried for me. But his life? It had been sweet. This man who had seen his family's prosperity swept away, who had lived through two world wars, who had raised a young family in the worst depression in history-he was convinced he had lived in the best of times. It was the future that frightened him. This illusion is pervasive and yet it is seldom recognized. In books, articles, blogs, and broadcasts, we call our time the "age of uncertainty," believing that there is something uniquely uncertain about this moment. But the phrase age of uncertainty, which has appeared in The New York Times 5,720 times, made its debut there in 1924, while uncertain times has appeared 2,810 times, starting in 1853. The same illusion inspires frantic hand-waving about the supposedly unprecedented "roaring current of change" we are experiencing. Ecologist William Vogt expressed that sentiment perfectly when he wrote, "We live in a world of such rapid and dynamic change as man, in his hundreds of thousands of years of existence, has never known." But Vogt wrote that in 1948, a year after W. H. Auden published his famous poem The Age of Anxiety. And the phrase roaring current of change comes from Alvin Toffler's massive best seller Future Shock, which was published in 1970. In that book, Toffler warned that the pace of change in modern life was so torrid it was causing a crippling mental condition he dubbed "future shock." To historians, future shock sounded an awful lot like neurasthenia-a diagnosis that was popular for several decades after the American physician George Beard invented it in 1869. Plus ça change, as the French say.
Granted, the level of uncertainty and the pace of change do vary from place to place and time to time, and there may well be good reasons to judge one moment more uncertain or unsettled than another. But our perception of this moment in this place seldom comes from a careful and rational assessment of the facts. It's a permanent condition: We always feel the uncertainty and change we are experiencing now is greater than ever. That feeling makes us hunger for predictions about the future. Particularly rapid social change like that experienced during the 1970s adds to the hunger; shocks like the oil embargo of 1973 or the crash of 2008 multiply it.
And then there's the experience of having a baby. There's nothing a new mother or father wants to know more urgently than what the world will be like for their child. In a 1975 episode of All in the Family-the television show that perfectly captured American life in a difficult decade-Mike and Gloria learn they will have their first child. At first, they're thrilled. But then Mike starts worrying about "world problems." The air and water are "poisoned," the planet is overcrowded, "and there's not enough food to go around." What would they be bringing a child into? He couldn't really answer that question but he wanted to, desperately. And so he did: The future would be awful. He was sure of it. And his moment of joy was lost.
Whether sunny or bleak, convictions about the future satisfy the hunger for certainty. We want to believe. And so we do.
I PREDICT YOU WILL OBJECT.
But still, one may protest, prediction is possible. And indispensable. "When the United States invaded Iraq, it was based on a prediction of what would happen," George Friedman responded when he was asked about claims that prediction is impossible. "When investors placed their money with Bernie Madoff, they were predicting that it was a good investment. When we cross the street we predict that the car standing at the light won't suddenly press the accelerator. All of us, in our daily lives, must make the best prediction available on every level. Prediction is built into our existence."
He's right. Almost everything we do is influenced by an explicit or implicit judgment about the future, whether that future is five seconds or five decades from now. A plan to meet someone for dinner is always based on a long list of assumptions about what will happen, and what won't, between the moment the decision is made and dinnertime. Saving for retirement is based on a prediction that the institutions you entrust with your money will not collapse, that the investments will generate a satisfactory return, and that you will not die before you get your gold watch and clean out your desk. Our collective decisions, too, are based on predictions. For example, if we support raising gas taxes to encourage conservation, that support is based on a prediction: Do this thing and there will be a certain result in the future. So we really do predict all the time. It's an essential a