A Critique of Ray Kurzweil's Predictions
Ray Kurzweil recently announced his year-by-year predictions of the future. Here are just a few samples (full list here):
2020 – Personal computers reach a computing power comparable to the human brain.
2025 – The emergence of mass-market human implants.
2031 – 3D printed human organs used in hospitals at all levels.
2041 – Internet bandwidth will be 500 million times more than today.
2045 – The earth will turn into one giant computer.
2099 – The technological singularity extends to the entire Universe.
I'm not sure exactly what this last item means, how it fits with communication being limited to the speed of light, whether he means the entire visible universe (some 46 billion light years in radius), or the (possibly infinite) universe beyond that, or why other advanced civilizations haven't already triggered this. But I'm sure a mind like his has thought all that through.
Here I would like to point out a more down-to-Earth shortcoming of this genre of utopian technological futurism. It assumes business as usual in terms of scientific and technological progress, and generally fails to account for the very real crises facing humanity in the coming years.
To sober ourselves up from Kurzweil's lofty predictions, let us consider some of these challenges and their potential impact.
At the forefront is climate change. It is happening much faster than most scenarios predicted, and given the potential for runaway climate change once the tundra thaws, we could be witnessing some devastating consequences in coming years: major crop failures and famine, extreme weather events, millions dying of heat stroke, massive migration. These and other potential impacts will send shockwaves through our already vulnerable economic and social systems.
It is assumed that the Internet will remain functional, but as cyber-weapons grow more powerful and cyber-criminals more sophisticated, it is very possible that current attacks will escalate into widespread infrastructure shutdowns. Given how totally dependent we are upon the net for commerce, banking, science, technology and almost every other segment of society, that could be a catastrophe. Indeed, a widespread failure of the electrical grid lasting more than a few days would lead to a breakdown of society from which it would be difficult to recover.
The global economy is shaky, to say the least. Ever-deepening national debts, stock market bubbles and banking crises promote the likelihood of widespread global recessions and possible collapse of currencies. Not the best environment for high-tech venture capitalists.
Terrorism cannot be ignored either. Previous terrorist movements had a concrete goal in mind—reunification of Ireland, Algerian independence—and were open to political settlement. But those fomented by Islamist movements have deeper ideological goals which cannot be satisfied through any talks or mediation. Current approaches to dealing with them only fuel the flames. They are probably here to stay in the medium term, and with their growing resourcefulness, could have unforeseen impacts on the economy and social stability of many nations.
Nuclear war, deliberate or accidental, remains a distinct possibility. So do global epidemics of drug-resistant bacteria and viruses.
These are just a few of the scenarios that could derail the technological dream. The financial and social investment it requires assumes a relatively stable society. If things start falling apart, the progress Kurzweil and others foresee may well begin to splutter.
Some blindly assume that artificial intelligences far surpassing human intelligence will be able to solve our problems, and the steady march of progress will continue unabated. It is possible that advanced AI may help solve some of them, but we cannot count on it, and certainly cannot count on it resolving all of them.
Furthermore, although we may play down the likelihood of any one scenario, the chance of one or another of them happening remains high. If there are, for example, ten independent scenarios each with only a twenty percent chance of happening, then the likelihood of at least one of them occurring is 1 − 0.8^10, or roughly ninety percent. And several of the above, particularly major climate change, have a likelihood much higher than twenty percent.
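The arithmetic behind this estimate can be checked in a few lines of Python. This is only a sketch under a simplifying assumption: it treats the scenarios as independent, whereas real crises (climate change, economic collapse, migration) are often correlated, which would change the numbers.

```python
def prob_at_least_one(p: float, n: int) -> float:
    """Probability that at least one of n independent events occurs,
    given each has probability p: the complement of all n failing."""
    return 1 - (1 - p) ** n

# Ten scenarios, each with a 20% chance: complement of 0.8 ** 10.
print(round(prob_at_least_one(0.2, 10), 3))  # 0.893
```

The same one-line calculation shows why the conclusion is robust: even at a 10% chance per scenario, the probability of at least one occurring is still about 65%.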
Moreover, there is another factor that needs to be taken into account: the stress of accelerating development. Stress can be loosely defined as the inability to adapt to change. Many of us can feel this in our own lives: the promised freedom offered by information technology seems only to have filled our days with more things to take care of, with less time and greater urgency, leading to mounting fatigue and burnout. At the other extreme, climate change can be seen as a consequence of accelerating development—the exponential increase in the use of fossil fuels, producing far more carbon dioxide than the atmosphere can easily dispose of—putting the climate under stress in ways that are becoming all too apparent.
The advances that Kurzweil foresees will undoubtedly continue to accelerate the rate of development. Indeed, that is one of the fundamental tenets of his vision: what he calls the law of accelerating returns. As the rate of development continues to speed up at an ever-dizzying pace, the stress on all the systems involved—personal, social, economic, geo-political, environmental—will rapidly increase. And increasing stress in a system eventually leads to breakdown and collapse.
Accelerating change may not, therefore, be such a beneficial trend after all. It could well bring about our demise. (For more on this, see my essay Blind Spot: The Unforeseen End of Accelerating Change.)
We will, I suspect, see a number of Kurzweil's technological predictions coming true—although perhaps not as speedily as he envisions—but they will almost certainly be occurring in a world that is dealing with the consequences of significant economic, social and environmental disruption. How this will play out I don't know. But it would serve the likes of Kurzweil well to include this level of realism in their predictions.