
The Singularity Is Near: When Humans Transcend Biology, Part 2



Not surprisingly, the concept of complexity is complex. One concept of complexity is the minimum amount of information required to represent a process. Let's say you have a design for a system (for example, a computer program or a computer-assisted design file for a computer), which can be described by a data file containing one million bits. We could say your design has a complexity of one million bits. But suppose we notice that the one million bits actually consist of a pattern of one thousand bits that is repeated one thousand times. We could note the repetitions, remove the repeated patterns, and express the entire design in just over one thousand bits, thereby reducing the size of the file by a factor of about one thousand.
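This kind of redundancy removal is easy to see with a general-purpose compressor. The sketch below is illustrative only: zlib stands in for the compression techniques discussed in the text, and the repeated pattern is generated at random.

```python
import os
import zlib

# A hypothetical "design file": a 1,000-bit pattern (125 bytes) repeated
# 1,000 times, giving one million bits (125,000 bytes) in total.
pattern = os.urandom(125)   # 125 bytes = 1,000 bits
design = pattern * 1000     # one million bits

compressed = zlib.compress(design, 9)

print(len(design))      # 125000 bytes (one million bits)
print(len(compressed))  # far smaller: the compressor finds the repetition
```

The compressor stores the pattern roughly once plus short back-references, so the file shrinks by orders of magnitude; the ideal thousandfold factor is approached only up to the compressor's own encoding overhead.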

The most popular data-compression techniques use similar methods of finding redundancy within information.3 But after you've compressed a data file in this way, can you be absolutely certain that there are no other rules or methods that might be discovered that would enable you to express the file in even more compact terms? For example, suppose my file was simply "pi" (3.1415...) expressed to one million bits of precision. Most data-compression programs would fail to recognize this sequence and would not compress the million bits at all, since the bits in a binary expression of pi are effectively random and thus have no repeated pattern according to all tests of randomness.

But if we can determine that the file (or a portion of the file) in fact represents pi, we can easily express it (or that portion of it) very compactly as "pi to one million bits of accuracy." Since we can never be sure that we have not overlooked some even more compact representation of an information sequence, any amount of compression sets only an upper bound for the complexity of the information. Murray Gell-Mann provides one definition of complexity along these lines. He defines the "algorithmic information content" (AIC) of a set of information as "the length of the shortest program that will cause a standard universal computer to print out the string of bits and then halt."4 However, Gell-Mann's concept is not fully adequate. If we have a file with random information, it cannot be compressed. That observation is, in fact, a key criterion for determining if a sequence of numbers is truly random. However, if any random sequence will do for a particular design, then this information can be characterized by a simple instruction, such as "put random sequence of numbers here." So the random sequence, whether it's ten bits or one billion bits, does not represent a significant amount of complexity, because it is characterized by a simple instruction. This is the difference between a random sequence and an unpredictable sequence of information that has purpose.
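The incompressibility criterion for randomness can be demonstrated directly. In this sketch, zlib stands in for "most data-compression programs," and operating-system random bytes stand in for the effectively random bits of pi (which are harder to generate):

```python
import os
import zlib

random_bytes = os.urandom(125_000)   # one million effectively random bits
structured = b"0123456789" * 12_500  # the same length, but an obvious repeating pattern

random_compressed = zlib.compress(random_bytes, 9)
structured_compressed = zlib.compress(structured, 9)

print(len(random_compressed))      # typically slightly LARGER than the input: no pattern to find
print(len(structured_compressed))  # a tiny fraction of the input
```

A general-purpose compressor cannot shrink data that passes tests of randomness; it can only add a little framing overhead. Recognizing that the bits are "pi to one million bits" would require knowledge outside the compressor.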

To gain some further insight into the nature of complexity, consider the complexity of a rock. If we were to characterize all of the properties (precise location, angular momentum, spin, velocity, and so on) of every atom in the rock, we would have a vast amount of information. A one-kilogram (2.2-pound) rock has 10^25 atoms which, as I will discuss in the next chapter, can hold up to 10^27 bits of information. That's one hundred million billion times more information than the genetic code of a human (even without compressing the genetic code).5 But for most common purposes, the bulk of this information is largely random and of little consequence. So we can characterize the rock for most purposes with far less information just by specifying its shape and the type of material of which it is made. Thus, it is reasonable to consider the complexity of an ordinary rock to be far less than that of a human even though the rock theoretically contains vast amounts of information.6 One concept of complexity, then, is the minimum amount of meaningful, non-random, but unpredictable information needed to characterize a system or process.
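As a sanity check on the figures above (all values are those claimed in the text, not independent measurements):

```python
# Back-of-envelope check of the rock-versus-genome comparison.
rock_bits = 10**27          # storage capacity claimed for a 1 kg rock's atoms
genome_bits = 6e9 * 2       # ~6 billion base pairs at 2 bits each, uncompressed

ratio = rock_bits / genome_bits
print(f"{ratio:.1e}")       # on the order of 10^17, i.e. one hundred million billion
```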

In Gell-Mann's concept, the AIC of a million-bit random string would be about a million bits long. So I am adding to Gell-Mann's AIC concept the idea of replacing each random string with a simple instruction to "put random bits here."



However, even this is not sufficient. Another issue is raised by strings of arbitrary data, such as names and phone numbers in a phone book, or periodic measurements of radiation levels or temperature. Such data is not random, and data-compression methods will only succeed in reducing it to a small degree. Yet it does not represent complexity as that term is generally understood. It is just data. So we need another simple instruction to "put arbitrary data sequence here."

To summarize my proposed measure of the complexity of a set of information, we first consider its AIC as Gell-Mann has defined it. We then replace each random string with a simple instruction to insert a random string. We then do the same for arbitrary data strings. Now we have a measure of complexity that reasonably matches our intuition.
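A toy sketch of this proposed measure, assuming zlib as the compressor and an arbitrary 4-byte charge for the "put random sequence here" instruction (both choices are mine, not the text's):

```python
import os
import zlib

def toy_complexity(data: bytes, chunk: int = 256) -> int:
    """A toy version of the proposed measure, not a rigorous metric:
    start from compressed length, but charge any chunk that is effectively
    incompressible (i.e., looks random) only a small constant cost, standing
    in for the instruction "put random sequence of numbers here".
    """
    INSTRUCTION_COST = 4  # bytes; arbitrary stand-in for a "put random bits" opcode
    total = 0
    for i in range(0, len(data), chunk):
        piece = data[i:i + chunk]
        compressed = len(zlib.compress(piece, 9))
        if compressed >= len(piece):   # incompressible -> treat as random
            total += INSTRUCTION_COST
        else:                          # meaningful structure -> pay for it
            total += compressed
    return total

print(toy_complexity(os.urandom(4096)))               # small: random chunks cost almost nothing
print(toy_complexity(b"the quick brown fox " * 200))  # larger: real structure must be paid for
```

Under this measure a pure-noise file scores near zero, matching the intuition that a random sequence, whether ten bits or one billion, carries little complexity.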

It is a fair observation that paradigm shifts in an evolutionary process such as biology-and its continuation through technology-each represent an increase in complexity, as I have defined it above. For example, the evolution of DNA allowed for more complex organisms, whose biological information processes could be controlled by the DNA molecule's flexible data storage. The Cambrian explosion provided a stable set of animal body plans (in DNA), so that the evolutionary process could concentrate on more complex cerebral development. In technology, the invention of the computer provided a means for human civilization to store and manipulate ever more complex sets of information. The extensive interconnectedness of the Internet provides for even greater complexity.

"Increasing complexity" on its own is not, however, the ultimate goal or end-product of these evolutionary processes. Evolution results in better answers, not necessarily more complicated ones. Sometimes a superior solution is a simpler one. So let's consider another concept: order. Order is not the same as the opposite of disorder. If disorder represents a random sequence of events, the opposite of disorder should be "not randomness." Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism or the bits in a computer program. "Noise," on the other hand, is a random sequence. Noise is inherently unpredictable but carries no information. Information, however, is also unpredictable. If we can predict future data from past data, that future data stops being information. Thus, neither information nor noise can be compressed (and restored to exactly the same sequence). We might consider a predictably alternating pattern (such as 0101010...) to be orderly, but it carries no information beyond the first couple of bits.

Thus, orderliness does not constitute order, because order requires information. Order is information that fits a purpose. The measure of order is the measure of how well the information fits the purpose. In the evolution of life-forms, the purpose is to survive. In an evolutionary algorithm (a computer program that simulates evolution to solve a problem) applied to, say, designing a jet engine, the purpose is to optimize engine performance, efficiency, and possibly other criteria.7 Measuring order is more difficult than measuring complexity. There are proposed measures of complexity, as I discussed above. For order, we need a measure of "success" that would be tailored to each situation. When we create evolutionary algorithms, the programmer needs to provide such a success measure (called the "utility function"). In the evolutionary process of technology development, we could assign a measure of economic success.
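A minimal evolutionary algorithm makes the role of the utility function concrete. The target profile and mutation scheme below are invented for illustration; the utility function simply scores how close a candidate "design" comes to the target:

```python
import random

# The "design" is a list of numbers; the utility function (higher is better)
# is the negative squared distance from a target profile, a toy stand-in
# for something like jet-engine performance.
TARGET = [3, 1, 4, 1, 5, 9, 2, 6]

def utility(candidate):
    return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

def mutate(candidate):
    child = list(candidate)
    child[random.randrange(len(child))] += random.choice([-1, 1])
    return child

random.seed(0)
population = [[0] * len(TARGET) for _ in range(20)]
for generation in range(500):
    population.sort(key=utility, reverse=True)
    survivors = population[:10]  # selection: keep the designs that best fit the purpose
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=utility)
print(best, utility(best))
```

Change the utility function and the same machinery evolves toward a different purpose; this is the sense in which the success measure, not the mutation mechanics, defines the order being created.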

Simply having more information does not necessarily result in a better fit. Sometimes, a deeper order-a better fit to a purpose-is achieved through simplification rather than further increases in complexity. For example, a new theory that ties together apparently disparate ideas into one broader, more coherent theory reduces complexity but nonetheless may increase the "order for a purpose." (In this case, the purpose is to accurately model observed phenomena.) Indeed, achieving simpler theories is a driving force in science. (As Einstein said, "Make everything as simple as possible, but no simpler.") An important example of this concept is one that represented a key step in the evolution of hominids: the shift in the thumb's pivot point, which allowed more precise manipulation of the environment.8 Primates such as chimpanzees can grasp but they cannot manipulate objects with either a "power grip" or sufficient fine-motor coordination to write or to shape objects. A change in the thumb's pivot point did not significantly increase the complexity of the animal but nonetheless did represent an increase in order, enabling, among other things, the development of technology. Evolution has shown, however, that the general trend toward greater order does typically result in greater complexity.9 Thus improving a solution to a problem-which usually increases but sometimes decreases complexity-increases order. Now we are left with the issue of defining the problem.
Indeed, the key to an evolutionary algorithm (and to biological and technological evolution in general) is exactly this: defining the problem (which includes the utility function). In biological evolution the overall problem has always been to survive. In particular ecological niches this overriding challenge translates into more specific objectives, such as the ability of certain species to survive in extreme environments or to camouflage themselves from predators. As biological evolution moved toward humanoids, the objective itself evolved to the ability to outthink adversaries and to manipulate the environment accordingly.

It may appear that this aspect of the law of accelerating returns contradicts the second law of thermodynamics, which implies that entropy (randomness in a closed system) cannot decrease and, therefore, generally increases.10 However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order. Even a crisis, such as the periodic large asteroids that have crashed into the Earth, although increasing chaos temporarily, ends up increasing-deepening-the order created by biological evolution.

To summarize, evolution increases order, which may or may not increase complexity (but usually does). A primary reason that evolution-of life-forms or of technology-speeds up is that it builds on its own increasing order, with ever more sophisticated means of recording and manipulating information. Innovations created by evolution encourage and enable faster evolution. In the case of the evolution of life-forms, the most notable early example is DNA, which provides a recorded and protected transcription of life's design from which to launch further experiments. In the case of the evolution of technology, ever-improving human methods of recording information have fostered yet further advances in technology. The first computers were designed on paper and assembled by hand. Today, they are designed on computer workstations, with the computers themselves working out many details of the next generation's design, and are then produced in fully automated factories with only limited human intervention.

The evolutionary process of technology improves capacities in an exponential fashion. Innovators seek to improve capabilities by multiples. Innovation is multiplicative, not additive. Technology, like any evolutionary process, builds on itself. This aspect will continue to accelerate when the technology itself takes full control of its own progression in Epoch Five.11 We can summarize the principles of the law of accelerating returns as follows:

Evolution applies positive feedback: the more capable methods resulting from one stage of evolutionary progress are used to create the next stage. As described in the previous chapter, each epoch of evolution has progressed more rapidly by building on the products of the previous stage. Evolution works through indirection: evolution created humans, humans created technology, and humans are now working with increasingly advanced technology to create new generations of technology. By the time of the Singularity, there won't be a distinction between humans and technology. This is not because humans will have become what we think of as machines today, but rather machines will have progressed to be like humans and beyond. Technology will be the metaphorical opposable thumb that enables our next step in evolution. Progress (further increases in order) will then be based on thinking processes that occur at the speed of light rather than in very slow electrochemical reactions. Each stage of evolution builds on the fruits of the last stage, so the rate of progress of an evolutionary process increases at least exponentially over time.

Over time, the "order" of the information embedded in the evolutionary process (the measure of how well the information fits a purpose, which in evolution is survival) increases.

An evolutionary process is not a closed system; evolution draws upon the chaos in the larger system in which it takes place for its options for diversity. Because evolution also builds on its own increasing order, in an evolutionary process order increases exponentially.

A correlate of the above observation is that the "returns" of an evolutionary process (such as the speed, efficiency, cost-effectiveness, or overall "power" of a process) also increase at least exponentially over time. We see this in Moore's Law, in which each new generation of computer chip (which now appears approximately every two years) provides twice as many components per unit cost, each of which operates substantially faster (because of the smaller distances required for the electrons to travel within and between them, among other factors). As I illustrate below, this exponential growth in the power and price-performance of information-based technologies is not limited to computers but is true for essentially all information technologies and includes human knowledge, measured many different ways. It is also important to note that the term "information technology" is encompassing an increasingly broad class of phenomena and will ultimately include the full range of economic activity and cultural endeavor.

In another positive-feedback loop, the more effective a particular evolutionary process becomes-for example, the higher the capacity and cost-effectiveness that computation attains-the greater the amount of resources that are deployed toward the further progress of that process. This results in a second level of exponential growth; that is, the rate of exponential growth-the exponent-itself grows exponentially. For example, as seen in the figure on p. 67, "Moore's Law: The Fifth Paradigm," it took three years to double the price-performance of computation at the beginning of the twentieth century and two years in the middle of the century. It is now doubling about once per year. Not only is each chip doubling in power each year for the same unit cost, but the number of chips being manufactured is also growing exponentially; thus, computer research budgets have grown dramatically over the decades.

Biological evolution is one such evolutionary process. Indeed, it is the quintessential evolutionary process. Because it took place in a completely open system (as opposed to the artificial constraints in an evolutionary algorithm), many levels of the system evolved at the same time. Not only does the information contained in a species' genes progress toward greater order, but the overall system implementing the evolutionary process itself evolves in this way. For example, the number of chromosomes and the sequence of genes on the chromosomes have also evolved over time. As another example, evolution has developed ways to protect genetic information from excessive defects (although a small amount of mutation is allowed, since this is a beneficial mechanism for ongoing evolutionary improvement). One primary means of achieving this is the repetition of genetic information on paired chromosomes. This guarantees that, even if a gene on one chromosome is damaged, its corresponding gene is likely to be correct and effective. Even the unpaired male Y chromosome has devised means of backing up its information by repeating it on the Y chromosome itself.12 Only about 2 percent of the genome codes for proteins.13 The rest of the genetic information has evolved elaborate means to control when and how the protein-coding genes express themselves (produce proteins) in a process we are only beginning to understand. Thus, the process of evolution, such as the allowed rate of mutation, has itself evolved over time.

Technological evolution is another such evolutionary process. Indeed, the emergence of the first technology-creating species resulted in the new evolutionary process of technology, which makes technological evolution an outgrowth of-and a continuation of-biological evolution. Homo sapiens evolved over the course of a few hundred thousand years, and early stages of humanoid-created technology (such as the wheel, fire, and stone tools) progressed barely faster, requiring tens of thousands of years to evolve and be widely deployed. A half millennium ago, the product of a paradigm shift such as the printing press took about a century to be widely deployed. Today, the products of major paradigm shifts, such as cell phones and the World Wide Web, are widely adopted in only a few years' time.

A specific paradigm (a method or approach to solving a problem; for example, shrinking transistors on an integrated circuit as a way to make more powerful computers) generates exponential growth until its potential is exhausted. When this happens, a paradigm shift occurs, which enables exponential growth to continue.
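The "second level of exponential growth" can be illustrated numerically. Using the rough doubling times quoted above (three years early in the twentieth century, two years mid-century, about one year now) over an assumed split of the century:

```python
# Illustrative only: cumulative price-performance growth when the doubling
# time itself shrinks. The spans (30, 40, 30 years) are an assumed split.
def doublings(years_elapsed, doubling_time):
    return years_elapsed / doubling_time

growth = 1.0
for span, doubling_time in [(30, 3.0), (40, 2.0), (30, 1.0)]:
    growth *= 2 ** doublings(span, doubling_time)

print(f"{growth:.3g}")  # prints 1.15e+18: sixty doublings over the century
```

A constant three-year doubling time over the same century would have given only about thirty-three doublings (roughly 10^10), eight orders of magnitude less; the shrinking exponent is what produces the extra acceleration.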

The Life Cycle of a Paradigm. Each paradigm develops in three stages:

1. Slow growth (the early phase of exponential growth)
2. Rapid growth (the late, explosive phase of exponential growth), as seen in the S-curve figure below
3. A leveling off as the particular paradigm matures

The progression of these three stages looks like the letter S, stretched to the right. The S-curve illustration shows how an ongoing exponential trend can be composed of a cascade of S-curves. Each successive S-curve is faster (takes less time on the time, or x, axis) and higher (takes up more room on the performance, or y, axis).

[image]
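The cascade of S-curves can be sketched as a sum of logistic curves, each successive paradigm faster and higher than the last (all parameters below are illustrative):

```python
import math

def logistic(t, ceiling, midpoint, width):
    # One paradigm: an S-curve that saturates at `ceiling`.
    return ceiling / (1 + math.exp(-(t - midpoint) / width))

# Successive paradigms: each is faster (smaller width, i.e. a steeper rise)
# and higher (a larger ceiling) than the one before.
paradigms = [(1.0, 10.0, 2.0), (10.0, 20.0, 1.0), (100.0, 27.0, 0.5)]

totals = [sum(logistic(t, *p) for p in paradigms) for t in range(35)]
for t in range(0, 35, 5):
    print(t, round(totals[t], 2))
```

Each individual curve levels off, but the envelope of the cascade keeps climbing: as one paradigm saturates, the next takes over the growth.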

S-curves are typical of biological growth: replication of a system of relatively fixed complexity (such as an organism of a particular species), operating in a competitive niche and struggling for finite local resources. This often occurs, for example, when a species happens upon a new hospitable environment. Its numbers will grow exponentially for a while before leveling off. The overall exponential growth of an evolutionary process (whether molecular, biological, cultural, or technological) supersedes the limits to growth seen in any particular paradigm (a specific S-curve) as a result of the increasing power and efficiency developed in each successive paradigm. The exponential growth of an evolutionary process, therefore, spans multiple S-curves. The most important contemporary example of this phenomenon is the five paradigms of computation discussed below. The entire progression of evolution seen in the charts on the acceleration of paradigm shift in the previous chapter represents successive S-curves. Each key event, such as writing or printing, represents a new paradigm and a new S-curve.

The evolutionary theory of punctuated equilibrium (PE) describes evolution as progressing through periods of rapid change followed by periods of relative stasis.14 Indeed, the key events on the epochal-event graphs do correspond to renewed periods of exponential increase in order (and, generally, of complexity), followed by slower growth as each paradigm approaches its asymptote (limit of capability). So PE does provide a better evolutionary model than a model that predicts only smooth progression through paradigm shifts.

But the key events in punctuated equilibrium, while giving rise to more rapid change, don't represent instantaneous jumps. For example, the advent of DNA allowed a surge (but not an immediate jump) of evolutionary improvement in organism design and resulting increases in complexity. In recent technological history, the invention of the computer initiated another surge, still ongoing, in the complexity of information that the human-machine civilization is capable of handling. This latter surge will not reach an asymptote until we saturate the matter and energy in our region of the universe with computation, based on physical limits we'll discuss in the section "... on the Intelligent Destiny of the Cosmos" in chapter 6.15 During this third or maturing phase in the life cycle of a paradigm, pressure begins to build for the next paradigm shift. In the case of technology, research dollars are invested to create the next paradigm. We can see this in the extensive research being conducted today toward three-dimensional molecular computing, despite the fact that we still have at least a decade left for the paradigm of shrinking transistors on a flat integrated circuit using photolithography.

Generally, by the time a paradigm approaches its asymptote in price-performance, the next technical paradigm is already working in niche applications. For example, in the 1950s engineers were shrinking vacuum tubes to provide greater price-performance for computers, until the process became no longer feasible. At this point, around 1960, transistors had already achieved a strong niche market in portable radios and were subsequently used to replace vacuum tubes in computers.

The resources underlying the exponential growth of an evolutionary process are relatively unbounded. One such resource is the (ever-growing) order of the evolutionary process itself (since, as I pointed out, the products of an evolutionary process continue to grow in order). Each stage of evolution provides more powerful tools for the next. For example, in biological evolution, the advent of DNA enabled more powerful and faster evolutionary "experiments." Or to take a more recent example, the advent of computer-assisted design tools allows rapid development of the next generation of computers.

The other required resource for continued exponential growth of order is the "chaos" of the environment in which the evolutionary process takes place and which provides the options for further diversity. The chaos provides the variability to permit an evolutionary process to discover more powerful and efficient solutions. In biological evolution, one source of diversity is the mixing and matching of gene combinations through sexual reproduction. Sexual reproduction itself was an evolutionary innovation that accelerated the entire process of biological adaptation and provided for greater diversity of genetic combinations than nonsexual reproduction. Other sources of diversity are mutations and ever-changing environmental conditions. In technological evolution, human ingenuity combined with variable market conditions keeps the process of innovation going.

Fractal Designs. A key question concerning the information content of biological systems is how it is possible for the genome, which contains comparatively little information, to produce a system such as a human, which is vastly more complex than the genetic information that describes it. One way of understanding this is to view the designs of biology as "probabilistic fractals." A deterministic fractal is a design in which a single design element (called the "initiator") is replaced with multiple elements (together called the "generator"). In a second iteration of fractal expansion, each element in the generator itself becomes an initiator and is replaced with the elements of the generator (scaled to the smaller size of the second-generation initiators). This process is repeated many times, with each newly created element of a generator becoming an initiator and being replaced with a new scaled generator. Each new generation of fractal expansion adds apparent complexity but requires no additional design information. A probabilistic fractal adds the element of uncertainty. Whereas a deterministic fractal will look the same every time it is rendered, a probabilistic fractal will look different each time, although with similar characteristics. In a probabilistic fractal, the probability of each generator element being applied is less than 1. In this way, the resulting designs have a more organic appearance. Probabilistic fractals are used in graphics programs to generate realistic-looking images of mountains, clouds, seashores, foliage, and other organic scenes. A key aspect of a probabilistic fractal is that it enables the generation of a great deal of apparent complexity, including extensive varying detail, from a relatively small amount of design information. Biology uses this same principle. Genes supply the design information, but the detail in an organism is vastly greater than the genetic design information.
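A minimal illustration of deterministic versus probabilistic fractal expansion, using a string-rewriting scheme (the initiator "F", the generator "F[F]F", and the branch probability below are all invented for illustration, in the style of L-system plant drawings):

```python
import random

# The generator replaces each initiator element "F" with "F[F]F"
# (the bracketed part suggesting a branch).
GENERATOR = "F[F]F"

def expand(s, iterations, branch_probability=1.0):
    for _ in range(iterations):
        out = []
        for ch in s:
            if ch == "F" and random.random() < branch_probability:
                out.append(GENERATOR)  # apply the generator to this element
            else:
                out.append(ch)         # leave the element as-is
        s = "".join(out)
    return s

random.seed(1)
deterministic = expand("F", 4)                           # identical on every run
probabilistic = expand("F", 4, branch_probability=0.7)   # similar character, different detail each run

print(len(deterministic))  # 161 characters of apparent complexity...
print(len(GENERATOR))      # ...from 5 characters of design information
```

The entire design is the five-character generator plus the replacement rule; each further iteration multiplies the apparent detail without adding any design information, and lowering the branch probability below 1 yields the varied, organic look of a probabilistic fractal.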

Some observers misconstrue the amount of detail in biological systems such as the brain by arguing, for example, that the exact configuration of every microstructure (such as each tubule) in each neuron is precisely designed and must be exactly the way it is for the system to function. In order to understand how a biological system such as the brain works, however, we need to understand its design principles, which are far simpler (that is, contain far less information) than the extremely detailed structures that the genetic information generates through these iterative, fractal-like processes. There are only eight hundred million bytes of information in the entire human genome, and only about thirty to one hundred million bytes after data compression is applied. This is about one hundred million times less information than is represented by all of the interneuronal connections and neurotransmitter concentration patterns in a fully formed human brain.
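The compressed-genome figure rests on the same redundancy-finding idea introduced earlier: a generic compressor can factor out repeated patterns without losing any detail. A toy illustration (the "ACGT" pattern and the sizes here are arbitrary, not real genome data):

```python
import zlib

pattern = b"ACGT" * 250          # a 1,000-byte repeating "design element"
data = pattern * 1000            # a million bytes built from that one element

compressed = zlib.compress(data, level=9)
print(len(data), "->", len(compressed), "bytes")

# the compression is lossless: the full detail is recoverable
assert zlib.decompress(compressed) == data
```

The highly redundant input shrinks by orders of magnitude, yet decompression reconstructs every byte, which is the sense in which a small design can specify a large structure.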

Consider how the principles of the law of accelerating returns apply to the epochs we discussed in the first chapter. The combination of amino acids into proteins and of nucleic acids into strings of RNA established the basic paradigm of biology. Strings of RNA (and later DNA) that self-replicated (Epoch Two) provided a digital method to record the results of evolutionary experiments. Later on, the evolution of a species that combined rational thought (Epoch Three) with an opposable appendage (the thumb) caused a fundamental paradigm s.h.i.+ft from biology to technology (Epoch Four). The upcoming primary paradigm s.h.i.+ft will be from biological thinking to a hybrid combining biological and nonbiological thinking (Epoch Five), which will include "biologically inspired" processes resulting from the reverse engineering of biological brains.

If we examine the timing of these epochs, we see that they have been part of a continuously accelerating process. The evolution of life-forms required billions of years for its first steps (primitive cells, DNA), and then progress accelerated. During the Cambrian explosion, major paradigm shifts took only tens of millions of years. Later, humanoids developed over a period of millions of years, and Homo sapiens over a period of only hundreds of thousands of years. With the advent of a technology-creating species the exponential pace became too fast for evolution through DNA-guided protein synthesis, and evolution moved on to human-created technology. This does not imply that biological (genetic) evolution is not continuing, just that it is no longer leading the pace in terms of improving order (or of the effectiveness and efficiency of computation).16

Farsighted Evolution. There are many ramifications of the increasing order and complexity that have resulted from biological evolution and its continuation through technology. Consider the boundaries of observation. Early biological life could observe local events several millimeters away, using chemical gradients. When sighted animals evolved, they were able to observe events that were miles away. With the invention of the telescope, humans could see other galaxies millions of light-years away. Conversely, using microscopes, they could also see cellular-size structures. Today humans armed with contemporary technology can see to the edge of the observable universe, a distance of more than thirteen billion light-years, and down to quantum-scale subatomic particles.

Consider the duration of observation. Single-cell animals could remember events for seconds, based on chemical reactions. Animals with brains could remember events for days. Primates with culture could pa.s.s down information through several generations. Early human civilizations with oral histories were able to preserve stories for hundreds of years. With the advent of written language the permanence extended to thousands of years.

As one of many examples of the acceleration of the technology paradigm-shift rate, it took about a half century for the late-nineteenth-century invention of the telephone to reach significant levels of usage (see the figure below).17 In comparison, the late-twentieth-century adoption of the cell phone took only a decade.18 Overall we see a smooth acceleration in the adoption rates of communication technologies over the past century.19 As discussed in the previous chapter, the overall rate of adopting new paradigms, which parallels the rate of technological progress, is currently doubling every decade. That is, the time to adopt new paradigms is going down by half each decade. At this rate, technological progress in the twenty-first century will be equivalent (in the linear view) to two hundred centuries of progress (at the rate of progress in 2000).20, 21

The S-Curve of a Technology as Expressed in its Life Cycle

A machine is as distinctively and brilliantly and expressively human as a violin sonata or a theorem in Euclid.-GREGORY VLASTOS

It is a far cry from the monkish calligrapher, working in his cell in silence, to the brisk "click, click" of the modern writing machine, which in a quarter of a century has revolutionized and reformed business.-SCIENTIFIC AMERICAN, 1905

No communication technology has ever disappeared but instead becomes increasingly less important as the technological horizon widens.-ARTHUR C. CLARKE

I always keep a stack of books on my desk that I leaf through when I run out of ideas, feel restless, or otherwise need a shot of inspiration. Picking up a fat volume that I recently acquired, I consider the bookmaker's craft: 470 finely printed pages organized into 16-page signatures, all of which are sewn together with white thread and glued onto a gray canvas cord. The hard linen-bound covers, stamped with gold letters, are connected to the signature block by delicately embossed end sheets. This is a technology that was perfected many decades ago. Books constitute such an integral element of our society-both reflecting and shaping its culture-that it is hard to imagine life without them. But the printed book, like any other technology, will not live forever.

The Life Cycle of a Technology

We can identify seven distinct stages in the life cycle of a technology.

1. During the precursor stage, the prerequisites of a technology exist, and dreamers may contemplate these elements coming together. We do not, however, regard dreaming to be the same as inventing, even if the dreams are written down. Leonardo da Vinci drew convincing pictures of airplanes and automobiles, but he is not considered to have invented either.

2. The next stage, one highly celebrated in our culture, is invention, a very brief stage, similar in some respects to the process of birth after an extended period of labor. Here the inventor blends curiosity, scientific skills, determination, and usually a measure of showmanship to combine methods in a new way and brings a new technology to life.

3. The next stage is development, during which the invention is protected and supported by doting guardians (who may include the original inventor). Often this stage is more crucial than invention and may involve additional creation that can have greater significance than the invention itself. Many tinkerers had constructed finely hand-tuned horseless carriages, but it was Henry Ford's innovation of mass production that enabled the automobile to take root and flourish.

4. The fourth stage is maturity. Although continuing to evolve, the technology now has a life of its own and has become an established part of the community. It may become so intertwined in the fabric of life that it appears to many observers that it will last forever. This creates an interesting drama when the next stage arrives, which I call the stage of the false pretenders.

5. Here an upstart threatens to eclipse the older technology. Its enthusiasts prematurely predict victory. While providing some distinct benefits, the newer technology is found on reflection to be lacking some key element of functionality or quality. When it indeed fails to dislodge the established order, the technology conservatives take this as evidence that the original approach will indeed live forever.

6. This is usually a short-lived victory for the aging technology. Shortly thereafter, another new technology typically does succeed in relegating the original technology to the stage of obsolescence. In this part of the life cycle, the technology lives out its senior years in gradual decline, its original purpose and functionality now subsumed by a more spry competitor.

7. In this stage, which may comprise 5 to 10 percent of a technology's life cycle, it finally yields to antiquity (as did the horse and buggy, the harpsichord, the vinyl record, and the manual typewriter).

In the mid-nineteenth century there were several precursors to the phonograph, including Leon Scott de Martinville's phonautograph, a device that recorded sound vibrations as a printed pattern. It was Thomas Edison, however, who brought all of the elements together and invented the first device that could both record and reproduce sound in 1877. Further refinements were necessary for the phonograph to become commercially viable. It became a fully mature technology in 1949 when Columbia introduced the 33-rpm long-playing recording (LP) and RCA introduced the 45-rpm disc. The false pretender was the ca.s.sette tape, introduced in the 1960s and popularized during the 1970s. Early enthusiasts predicted that its small size and ability to be rerecorded would make the relatively bulky and scratchable record obsolete.

Despite these benefits, cassettes lacked random access and were prone to their own forms of distortion and lack of fidelity. The compact disc (CD) delivered the mortal blow. With the CD providing both random access and a level of quality close to the limits of the human auditory system, the phonograph quickly entered the stage of obsolescence. Although still produced, the technology that Edison gave birth to almost 130 years ago has now reached antiquity.

Consider the piano, an area of technology that I have been personally involved with replicating. In the early eighteenth century Bartolomeo Cristofori was seeking a way to provide a touch response to the then-popular harpsichord so that the volume of the notes would vary with the intensity of the touch of the performer. Called gravicembalo col piano e forte ("harpsichord with soft and loud"), his invention was not an immediate success. Further refinements, including Stein's Viennese action and Zumpe's English action, helped to establish the "piano" as the preeminent keyboard instrument. It reached maturity with the development of the complete cast-iron frame, patented in 1825 by Alpheus Babcock, and has seen only subtle refinements since then. The false pretender was the electric piano of the early 1980s. It offered substantially greater functionality. Compared to the single (piano) sound of the acoustic piano, the electronic variant offered dozens of instrument sounds, sequencers that allowed the user to play an entire orchestra at once, automated accompaniment, educational programs to teach keyboard skills, and many other features. The only feature it was missing was a good-quality piano sound.

This crucial flaw and the resulting failure of the first generation of electronic pianos led to the widespread conclusion that the piano would never be replaced by electronics. But the "victory" of the acoustic piano will not be permanent. With their far greater range of features and price-performance, digital pianos already exceed sales of acoustic pianos in homes. Many observers feel that the quality of the "piano" sound on digital pianos now equals or exceeds that of the upright acoustic piano. With the exception of concert and luxury grand pianos (a small part of the market), the sale of acoustic pianos is in decline.

From Goat Skins to Downloads

So where in the technology life cycle is the book? Among its precursors were Mesopotamian clay tablets and Egyptian papyrus scrolls. In the second century B.C., the Ptolemies of Egypt created a great library of scrolls at Alexandria and outlawed the export of papyrus to discourage competition.

What were perhaps the first books were created by Eumenes II, ruler of ancient Greek Pergamum, using pages of vellum made from the skins of goats and sheep, which were sewn together between wooden covers. This technique enabled Eumenes to compile a library equal to that of Alexandria. Around the same time, the Chinese had also developed a crude form of book made from bamboo strips.

The development and maturation of books has involved three great advances. Printing, first experimented with by the Chinese in the eighth century A.D. using raised wood blocks, allowed books to be reproduced in much larger quantities, expanding their audience beyond government and religious leaders. Of even greater significance was the advent of movable type, which the Chinese and Koreans experimented with by the eleventh century, but the complexity of Asian characters prevented these early attempts from being fully successful. Johannes Gutenberg, working in the fifteenth century, benefited from the relative simplicity of the Roman character set. He produced his Bible, the first large-scale work printed entirely with movable type, in 1455.

While there has been a continual stream of evolutionary improvements in the mechanical and electromechanical process of printing, the technology of bookmaking did not see another qualitative leap until the availability of computer typesetting, which did away with movable type about two decades ago. Typography is now regarded as a part of digital image processing.

With books a fully mature technology, the false pretenders arrived about twenty years ago with the first wave of "electronic books." As is usually the case, these false pretenders offered dramatic qualitative and quantitative benefits. CD-ROM- or flash memory-based electronic books can provide the equivalent of thousands of books with powerful computer-based search and knowledge navigation features. With Web- or CD-ROM- and DVD-based encyclopedias, I can perform rapid word searches using extensive logic rules, something that is just not possible with the thirty-three-volume "book" version I possess. Electronic books can provide pictures that are animated and that respond to our input. Pages are not necessarily ordered sequentially but can be explored along more intuitive connections.

As with the phonograph record and the piano, this first generation of false pretenders was (and still is) missing an essential quality of the original, which in this case is the superb visual characteristics of paper and ink. Paper does not flicker, whereas the typical computer screen displays sixty or more fields per second. This is a problem because of an evolutionary adaptation of the primate visual system. We are able to see only a very small portion of the visual field with high resolution. This portion, imaged by the fovea in the retina, is focused on an area about the size of a single word at twenty-two inches away. Outside of the fovea, we have very little resolution but exquisite sensitivity to changes in brightness, an ability that allowed our primate forebears to quickly detect a predator that might be attacking. The constant flicker of a video graphics array (VGA) computer screen is detected by our eyes as motion and causes constant movement of the fovea. This substantially slows down reading speeds, which is one reason that reading on a screen is less pleasant than reading a printed book. This particular issue has been solved with flat-panel displays, which do not flicker.

Other crucial issues include contrast-a good-quality book has an ink-to-paper contrast of about 120:1; typical screens are perhaps half of that-and resolution. Print and illustrations in a book represent a resolution of about 600 to 1000 dots per inch (dpi), while computer screens are about one tenth of that.

The size and weight of computerized devices are approaching those of books, but the devices are still heavier than a paperback book. Paper books also do not run out of battery power.

Most important, there is the matter of the available software, by which I mean the enormous installed base of printed books. Fifty thousand new print books are published each year in the United States, and millions of books are already in circulation. There are major efforts under way to scan and digitize print materials, but it will be a long time before the electronic databases have a comparable wealth of material. The biggest obstacle here is the understandable hesitation of publishers to make the electronic versions of their books available, given the devastating effect that illegal file sharing has had on the music-recording industry.

Solutions are emerging to each of these limitations. New, inexpensive display technologies have contrast, resolution, lack of flicker, and viewing angle comparable to high-quality paper documents. Fuel-cell power for portable electronics is being introduced, which will keep electronic devices powered for hundreds of hours between fuel-cartridge changes. Portable electronic devices are already comparable to the size and weight of a book. The primary issue is going to be finding secure means of making electronic information available. This is a fundamental concern for every level of our economy. Everything-including physical products, once nanotechnology-based manufacturing becomes a reality in about twenty years-is becoming information.

Moore's Law and Beyond

Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh 1.5 tons.-POPULAR MECHANICS, 1949

Computer Science is no more about computers than astronomy is about telescopes.-E. W. DIJKSTRA

Before considering further the implications of the Singularity, let's examine the wide range of technologies that are subject to the law of accelerating returns. The exponential trend that has gained the greatest public recognition has become known as Moore's Law. In the mid-1970s, Gordon Moore, a leading inventor of integrated circuits and later chairman of Intel, observed that we could squeeze twice as many transistors onto an integrated circuit every twenty-four months (in the mid-1960s, he had estimated twelve months). Given that the electrons would consequently have less distance to travel, circuits would also run faster, providing an additional boost to overall computational power. The result is exponential growth in the price-performance of computation. This doubling rate-about twelve months-is much faster than the doubling rate for paradigm shift that I spoke about earlier, which is about ten years. Typically, we find that the doubling time for different measures-price-performance, bandwidth, capacity-of the capability of information technology is about one year.

The primary driving force of Moore's Law is a reduction of semiconductor feature sizes, which shrink by half every 5.4 years in each dimension. (See the figure below.) Since chips are functionally two-dimensional, this means doubling the number of elements per square millimeter every 2.7 years.22 The following charts combine historical data with the semiconductor-industry road map (International Technology Roadmap for Semiconductors [ITRS] from Sematech), which projects through 2018.
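The step from a 5.4-year linear halving to a 2.7-year density doubling is just inverse-square scaling. A quick sketch of the arithmetic (illustrative only):

```python
def density_multiple(years, linear_halving_years=5.4):
    """Elements per unit area relative to today, given that linear
    feature size halves every `linear_halving_years` years."""
    feature_scale = 0.5 ** (years / linear_halving_years)
    return 1 / feature_scale ** 2   # density scales as 1 / feature_size**2

# one linear halving (5.4 years) quadruples density,
# so density doubles in half that time: 2.7 years
print(round(density_multiple(2.7), 6))   # 2.0
print(round(density_multiple(5.4), 6))   # 4.0
```

Because a chip is functionally two-dimensional, each halving of the linear feature size yields two doublings of element count per unit area, which is where the 2.7-year figure comes from.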

The cost of DRAM (dynamic random access memory) per square millimeter has also been coming down. The doubling time for bits of DRAM per dollar has been only 1.5 years.23 A similar trend can be seen with transistors. You could buy one transistor for a dollar in 1968; in 2002 a dollar purchased about ten million transistors. Since DRAM is a specialized field that has seen its own innovation, the halving time for average transistor price is slightly slower than for DRAM, about 1.6 years (see the figure below).24

This remarkably smooth acceleration in price-performance of semiconductors has progressed through a series of stages of process technologies (defined by feature sizes) at ever smaller dimensions. The key feature size is now dipping below one hundred nanometers, which is considered the threshold of "nanotechnology."25 Unlike Gertrude Stein's rose, it is not the case that a transistor is a transistor is a transistor. As they have become smaller and less expensive, transistors have also become faster by a factor of about one thousand over the course of the past thirty years (see the figure below)-again, because the electrons have less distance to travel.26 If we combine the exponential trends toward less-expensive transistors and faster cycle times, we find a halving time of only 1.1 years in the cost per transistor cycle (see the figure below).27 The cost per transistor cycle is a more accurate overall measure of price-performance because it takes into account both speed and capacity. But the cost per transistor cycle still does not take into account innovation at higher levels of design (such as microprocessor design) that improves computational efficiency.

The number of transistors in Intel processors has doubled every two years (see the figure below). Several other factors have boosted price-performance, including clock speed, reduction in cost per microprocessor, and processor design innovations.28 Processor performance in MIPS has doubled every 1.8 years per processor (see the figure below). Again, note that the cost per processor has also declined through this period.29 If I examine my own four-plus decades of experience in this industry, I can compare the MIT computer I used as a student in the late 1960s to a recent notebook. In 1967 I had access to a multimillion-dollar IBM 7094 with 32K (36-bit) words of memory and a quarter of a MIPS processor speed. In 2004 I used a $2,000 personal computer with a half-billion bytes of RAM and a processor speed of about 2,000 MIPS. The MIT computer was about one thousand times more expensive, so the ratio of cost per MIPS is about eight million to one.

My recent computer provides 2,000 MIPS of processing at a cost that is about 2^24 times lower than that of the computer I used in 1967. That's 24 doublings in 37 years, or about 18.5 months per doubling. If we factor in the increased value of the approximately 2,000 times greater RAM, vast increases in disk storage, and the more powerful instruction set of my circa 2004 computer, as well as vast improvements in communication speeds, more powerful software, and other factors, the doubling time comes down even further.
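The 18.5-month figure follows directly from the numbers cited: a 2^24-fold improvement in cost per MIPS over 37 years.

```python
import math

cost_improvement = 2 ** 24                  # price-performance gain, 1967 -> 2004
years = 37

doublings = math.log2(cost_improvement)     # 24.0 doublings
months_per_doubling = years * 12 / doublings
print(months_per_doubling)                  # 18.5
```

Folding in the additional gains in RAM, storage, and software would raise the effective improvement ratio and push this doubling time lower still.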

Despite this massive deflation in the cost of information technologies, demand has more than kept up. The number of bits shipped has doubled every 1.1 years, faster than the halving time in cost per bit, which is 1.5 years.30 As a result, the semiconductor industry enjoyed 18 percent annual growth in total revenue from 1958 to 2002.31 The entire information-technology (IT) industry has grown from 4.2 percent of the gross domestic product in 1977 to 8.2 percent in 1998.32 IT has become increasingly influential in all economic sectors. The share of value contributed by information technology for most categories of products and services is rapidly increasing. Even common manufactured products such as tables and chairs have an information content, represented by their computerized designs and the programming of the inventory-procurement systems and automated-fabrication systems used in their assembly.

[image]

Moore's Law: Self-Fulfilling Prophecy?

Some observers have stated that Moore's Law is nothing more than a self-fulfilling prophecy: industry participants anticipate where they need to be at a particular time in the future, and organize their research and development accordingly. The industry's own written road map is a good example of this.34 However, the exponential trends in information technology are far broader than those covered by Moore's Law. We see the same type of trends in essentially every technology or measurement that deals with information. This includes many technologies in which a perception of accelerating price-performance does not exist or has not previously been articulated (see below). Even within computing itself, the growth in capability per unit cost is much broader than what Moore's Law alone would predict.

The Fifth Paradigm35

Moore's Law is actually not the first paradigm in computational systems. You can see this if you plot the price-performance-measured by instructions per second per thousand constant dollars-of forty-nine famous computational systems and computers spanning the twentieth century (see the figure below).

As the figure demonstrates, there were actually four different paradigms-electromechanical, relays, vacuum tubes, and discrete transistors-that showed exponential growth in the price-performance of computing long before integrated circuits were even invented. And Moore's paradigm won't be the last. When Moore's Law reaches the end of its S-curve, now expected before 2020, the exponential growth will continue with three-dimensional molecular computing, which will constitute the sixth paradigm.

Fractal Dimensions and the Brain

Note that the use of the third dimension in computing systems is not an either-or choice but a continuum between two and three dimensions. In terms of biological intelligence, the human cortex is actually rather flat, with only six thin layers that are elaborately folded, an architecture that greatly increases the surface area. The folding is one way to use the third dimension.
