This is a reprint (with some minor modifications) of a post that appeared in "The Oil Drum" in 2009. It seemed worth reproposing in view of my recent article on the long-term perspectives of photovoltaic energy. Note how, in 2009, I stated that "Moore's law shows no signs of abating". After seven years, that is not true anymore!
From The Oil Drum Europe 2009
by Ugo Bardi
The figure above, from Robert Anson Heinlein's "Pandora's Box" (1952), is perhaps the first graphical representation of the concept that technology is not only progressing but progressing at an exponentially growing rate. Today, this concept sometimes goes under the name of the "technological spike" or the "technological singularity". However, we also see increasing concerns about peak oil and, more generally, about "peak civilization". Will the future be a spike or a peak?
The 1950s and 1960s were perhaps the most optimistic decades in the history of humankind. Nuclear power was soon to provide us with energy "too cheap to meter", space travel promised weekends on the moon for the whole family, and flying cars were supposed to be the future of commuting. At that time, the science fiction writer Robert Anson Heinlein may have been the first to propose that technology was not only progressing but progressing at exponentially growing rates. In his article "Pandora's Box" (Heinlein 1952), he presented the figure reproduced at the beginning of this text. Curve 4, with "progress" going up as an exponential function of time, is the trend that Heinlein enthusiastically proposed.
The same concept has been proposed several times since Heinlein. Robert Solow (1956) interpreted an exponentially growing "residual" in his models of economic growth as technological progress. The concept of the "intelligence explosion" was introduced by I. J. Good in 1965; that of the "technological singularity" was published by Vernor Vinge in 1993, although he had first expressed it in his novel "Marooned in Real Time" (serialized in Analog magazine, May-August 1986). The concept of the "technological spike" was introduced by Damien Broderick in 1997, and that of "accelerating change" by Ray Kurzweil in 2003. In all cases, the growth of technological progress is seen as literally "spiking out" to levels that the human mind can no longer understand. Sometimes the term "technological singularity" is used to describe this event. The people who tend towards this view are sometimes called "extropians" or "transhumanists", and they are highly optimistic about the capability of technology to solve all of our problems.
However, over the years, we seem to have been gradually losing the faith in technology that was common in the 1950s. We are increasingly worried about resource depletion and global warming. Both factors could make it impossible to keep industrial society functioning and could lead to its collapse. These ideas, too, originated in the 1950s, when Marion King Hubbert (1956) first proposed the concept of a production peak for crude oil, later called "peak oil". The idea that resource depletion is a critical factor in the world's economy has been proposed many times, for instance in the series of studies that go under the name of "The Limits to Growth", which first saw the light in 1972. Today, Hubbert's ideas are the basis of what we call the "peak oil movement". The concept is often extrapolated to "peak resources" and to "peak civilization", which could also result from the effects of anthropogenic global warming. The people who follow this line of thought tend to be skeptical about the capability of technology to solve these problems.
So, what will the future be, the spike or the peak? Will the peak destroy civilization, or will the spike take it to heights never experienced before? A first crucial question on this point is whether progress is really moving at exponentially growing rates. The answer seems to be no, at least if we consider technology as a whole. In most fields, we are stuck with technologies developed decades, or even centuries, ago. The performance of motor cars, for instance, is not improving exponentially; otherwise we would expect cars to double their mileage and halve their prices every so often. This is a qualitative observation that we can make by ourselves, but there have been studies that have examined such indicators of progress as the number of patents published every year (Huebner 2005). The result is that the rate of technological innovation is not increasing and may actually be slowing down. As discussed, for instance, by Ayres (2003), there is no factual ground for Solow's 1956 assumption that the growth of the economy is mainly generated by "progress."
Yet, there is at least one field of technology where progress is, indeed, growing exponentially: information technology (IT). The growth of IT can be quantified in various ways. Moore's law is well known: it says that the number of transistors (or gates) on a single chip grows exponentially. The law has held for several decades, and the doubling time of about 24 months shows no signs of abating. Perhaps less well known is the explosive growth of information stored in electronic form. A study by the International Data Corporation (IDC 2008) shows that the number of bits stored increases by a factor of ten every five years. At present, we have a total of approximately 280 exabytes (billions of gigabytes) stored, which corresponds to about 45 gigabytes per person on the planet. The amount of information transmitted over the internet is also rising at an exponential rate: according to Morgan Stanley (2008), we are transmitting more than 12 million terabytes per month. We have no single quantitative measure of how fast information technology as a whole is growing, but from the growth of its many subsectors we can say that it is accelerating.
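A quick back-of-the-envelope check of these figures, sketched in Python (the roughly 6.6 billion world population of 2008 is my assumption, not a number from the IDC study):

import math

stored_bytes = 280e18        # ~280 exabytes stored (IDC 2008)
world_population = 6.6e9     # assumed rough 2008 world population

per_capita_gb = stored_bytes / world_population / 1e9
print(f"storage per person: ~{per_capita_gb:.0f} GB")    # ~42 GB, close to the ~45 GB quoted

# "a factor of ten every five years" implies a doubling time of
# 5 * log10(2), i.e. about 1.5 years, even faster than Moore's law.
doubling_time = 5 * math.log10(2)
print(f"implied doubling time: ~{doubling_time:.1f} years")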
Surely, progress in IT needs plenty of resources and a functioning economy, and both conditions could be at risk in the future. But the demise of civilization is likely to be a slow and complex affair, something that could span most of the 21st century or, at least, its first half. Can we keep progress in IT alive and well for that long? Probably yes; at least, it should be possible to allocate enough energy to keep computers running. From the IDC study cited above, it turns out that we spend about 30 billion dollars per year on energy used by computers and about 55 billion dollars on energy costs for new servers. This estimate doesn't take into account all the energy used in data processing, but it gives us an order of magnitude for the core energy costs of the computing world. Considering that the world oil market alone is worth a few trillion dollars per year (depending on the vagaries of oil prices), we see that we probably need no more than a few percent of the world's energy production for our computers. It is not a negligible amount, but it seems very unlikely that, facing an energy shortage, we would cut back on the vital need we have for IT. Nobody should bet on the survival of SUVs in the coming years, but computers will keep working, and Moore's law could stay alive and well for years at least, perhaps decades.
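To make the order of magnitude explicit, here is a minimal sketch of that comparison, assuming a world oil market of roughly three trillion dollars per year (the exact figure depends on oil prices):

it_energy_spending = 30e9 + 55e9    # computers + new servers, dollars per year (IDC 2008)
oil_market = 3e12                   # assumed "few trillion dollars per year" oil market

share = it_energy_spending / oil_market
print(f"IT energy spending vs. the oil market alone: ~{share:.0%}")   # about 3 percent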
The growing performance of information technology is going to change many things in the world. Eventually, it may lead to levels of "artificial intelligence" (AI) equal or superior to human intelligence. At some point, AI could reach the stage where it is able to improve itself, and that would take it to superhuman, even God-like, levels. Such a superior intelligence is sometimes described as a sort of technological Santa Claus bringing humans an avalanche of gadgetry that buries all depletion problems forever. Here, however, we risk making the same mistake that Heinlein made in 1952 in his "Pandora's Box". At the time, space travel was seen as the main thing going on, and Heinlein confused needs with possibilities, predicting anti-gravity devices and the colonization of planets by the year 2000. This kind of mistake is similar to what Yudkowsky (2007) calls "the giant cheesecake fallacy": if what you are making is a cheesecake, you will assume that a better technology will simply be used to make a bigger cheesecake.
In the present situation, our main problem seems to be energy, and the cheesecake fallacy leads us to believe that we'll soon develop (or that AI will develop for us) a source of abundant and low-cost energy just because we need it. But even super-intelligent computers have to deal with the physical world. Maybe there are ways to create the perfect energy source: safe, low-cost, abundant, and usable by humans for the sake of humans. But we don't know whether that is possible within the physical laws of our universe.
Besides, would a limitless energy source stave off collapse forever? This question was already asked in the first edition of "The Limits to Growth" in 1972, and the results were confirmed in later editions. The simulations show that if you develop a technology that solves the energy problem, population keeps increasing, and collapse is generated by the lack of food and by pollution. So, you'd need more technological breakthroughs: ways of fighting pollution and of producing more food. But, in the long run, how would you cope with the ever-increasing population? Well, new breakthroughs to send people to colonize the solar system and, eventually, the whole galaxy. All that is not physically impossible, but it is an ever-growing, super-giant cheesecake. Is that what we really need?
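A minimal toy sketch of this dynamic in Python; it is emphatically not the World3 model used by the Limits to Growth authors, just a caricature with invented parameters, in which energy is unlimited, food output is capped by land, and pollution accumulates with activity:

# Toy overshoot caricature: energy is free, but food and pollution still bind.
# All numbers are invented for illustration; this is NOT the World3 model.
population = 1.0        # arbitrary units
pollution = 0.0
food_capacity = 2.0     # maximum food output, fixed by land rather than by energy

for year in range(1, 201):
    food_per_capita = food_capacity / population
    # growth slows, then reverses, as food runs short and pollution builds up
    growth_rate = 0.03 * min(1.0, food_per_capita - 0.5) - 0.02 * pollution
    population = max(0.01, population * (1.0 + growth_rate))
    pollution += 0.01 * population
    if year % 40 == 0:
        print(f"year {year:3d}: population {population:.2f}, pollution {pollution:.2f}")

Even in this caricature, removing the energy constraint only shifts the collapse onto food and pollution, which is the qualitative point of the Limits to Growth scenarios.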
In the end, our problem with vanishing resources is not that we don't have enough gadgetry. We have a problem of management. We tend to exploit resources well beyond their capacity to regenerate, that is, beyond sustainability. In addition, we can't control the growth of population. This is what we call "overshoot", and it leads, in the end, to a collapse that has often been catastrophic in the history of humankind. Humans have a short-range vision that leads them to discount the future at a very steep rate (Hagens 2007). It is a result of our evolutionary history: we are excellent hunters and gatherers but very poor planet managers.
So, the real question is whether advanced IT (or AI) can help us manage the resources we have better. And here the answer seems to be negative, at least for the time being. There is no doubt that IT is helping us to be more efficient but, as James Kunstler said in "The Long Emergency", efficiency is the straightest path to hell. Being more efficient is a way of exploiting resources faster, and that may well accelerate the collapse of civilization.
Just think of a simple gadget as an example: a car navigator. When you are using it you are, in effect, taking orders from a computer that is smarter than you at the specific task of navigating the streets. The navigator will make it faster and easier for you to travel by car from point "A" to point "B", but it will have no say on whether it is a good idea to go from A to B in the first place. Besides, if you can save some gasoline in going from A to B by an optimized route, you may decide to use it to go further on, to point C. So, the greater efficiency resulting from the use of the navigator will produce no energy saving. This is just an example of what is called the "Jevons effect" or the "rebound effect", which often thwarts all efforts to improve things by saving energy or being more efficient.
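A tiny numeric sketch of the rebound, with invented numbers (the 10% route saving and the 80% rebound fraction are assumptions for illustration, not measurements):

fuel_per_trip = 5.0     # litres for the old A-to-B route (invented)
route_saving = 0.10     # the optimized route uses 10% less fuel (assumed)
rebound = 0.8           # fraction of the saving spent on extra driving (assumed)

saved = fuel_per_trip * route_saving
net_use = fuel_per_trip - saved + rebound * saved
print(f"saved per trip: {saved:.2f} L, net fuel use: {net_use:.2f} L")
# with a rebound of 100% or more, the efficiency gain vanishes or even backfires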
Yet, it would not be impossible to use IT to fight over-exploitation, and we don't need super-human AI for that. IT can tell us where we are going and act as a "world navigator" for us, telling us how we can go from here to there, supposing that "there" is a good kind of world. The first digital computers were already being used in the 1960s to simulate the whole world system (Forrester 1971). In 1972, the authors of "The Limits to Growth" used their simulations to propose ways to avoid overexploitation and keep the world's economic system on a sustainable path. These simulations could be used as a guide for steering the world's economic system in the right direction and avoiding collapse. But, as we all know, policy makers and opinion leaders alike refused to take these studies seriously (the story of how "The Limits to Growth" was rejected and demonized is told in my post "Cassandra's curse", Bardi 2008). So, we are still racing toward collapse; IT is just helping us to run faster in that direction.
There remains the hope that growing IT capabilities will make a difference in qualitative terms; that AI will become so powerful that it will save us from ourselves. Several years after his "Pandora's Box" article, Heinlein published a novel titled "The Moon is a Harsh Mistress" (1966), in which he described the birth of a human-like computer that helped a group of lunar revolutionaries take over the local government and, eventually, became the hidden and benevolent ruler of the lunar colony. But that, like many predictions, might be another case of the giant cheesecake fallacy: the fact that we need a technology will not necessarily make it appear and, more than that, it may not work the way we think it should. A God-like AI might not necessarily be compassionate and merciful.
In the end, both the spike and the peak are strongly non-linear phenomena, and we know that non-linear phenomena are the most difficult to predict and understand. The only thing that we can say for sure about the future is that it will be interesting. We can only hope that this will not have to be understood in the sense of the old Chinese malediction.
The Author wishes to thank Mr. Damien Broderick for pointing out some missing references in an initial version of this text.
References
Ayres, R., 2003 www.iiasa.ac.at/Research/ECS/IEW2003/Papers/2003P_Ayres.pdf
Bardi, U., 2008 "Cassandra's curse", http://europe.theoildrum.com/node/3551
Broderick, Damien, 1997 "The Spike", Reed ed.
Forrester, J.W., 1971 "World Dynamics". Wright-Allen Press.
Good, I. J., 1965. "Speculations Concerning the First Ultraintelligent Machine." Advances in Computers, Vol. 6.
Hagens, Nate, 2007 "Living for the moment while devaluing the future". http://www.theoildrum.com/node/2592
Heinlein, R.A., 1952. "Pandora's box" The article was published in the February 1952 issue of Galaxy magazine (pp. 13-22) (thanks to Damien Broderick for this information). It doesn't seem to be available on the internet but a detailed review and comments on its predictions can be found at: www.xibalba.demon.co.uk/jbr/heinlein.html. The figure at the beginning of this paper is taken from the Italian translation of the 1966 update of the paper that was published in the "Galassia" magazine.
Hubbert, M. K., 1956, http://www.energybulletin.net/13630.html
Huebner, J., 2005, "A Possible Declining Trend for Worldwide Innovation," Technological Forecasting & Social Change, 72(8):988-995. See also http://accelerating.org/articles/huebnerinnovation.html
IDC 2008, http://www.emc.com/collateral/analyst-reports/diverse-exploding-digital-...
Kurzweil, R., 2003, "The Law of Accelerating Returns", www.kurzweilai.net/articles/art0134.html
Morgan Stanley 2008, http://www.scribd.com/doc/2683604/Internet-trends-2008
Solow, R., 1956, "A Contribution to the Theory of Economic Growth," The Quarterly Journal of Economics, 70: 65-94. Available from www.jstor.com (subscription required).
Vinge, V., 1993, "Technological Singularity", http://www-rohan.sdsu.edu/faculty/vinge/misc/WER2.html
Yudkowsky, E., 2007, "Reasons to Focus on Cognitive Technologies", http://www.acceleratingfuture.com/people-blog/?p=15