29. Data from Intel Corporation. See also Gordon Moore, "No Exponential Is Forever ... but We Can Delay 'Forever,'" presented at the International Solid State Circuits Conference (ISSCC), February 10, 2003, ftp://download.intel.com/research/silicon/Gordon_Moore_ISSCC_021003.pdf.
30. Steve Cullen, "Semiconductor Industry Outlook," InStat/MDR, report no. IN0401550SI, April 2004, http://www.instat.com/abstract.asp?id=68&SKU=IN0401550SI.
31. World Semiconductor Trade Statistics, http://wsts.www5.kcom.at.
32. Bureau of Economic Analysis, U.S. Department of Commerce, http://www.bea.gov/bea/dn/home/gdp.htm.
33. See notes 22-24 and 26-30.
34. International Technology Roadmap for Semiconductors, 2002 update, International Sematech.
35. "25 Years of Computer History," http://www.compros.com/timeline.html; Linley Gwennap, "Birth of a Chip," BYTE (December 1996), http://www.byte.com/art/9612/sec6/art2.htm; "The CDC 6000 Series Computer," http://www.moorecad.com/standardpascal/cdc6400.html; "A Chronology of Computer History," http://www.cyberstreet.com/hcs/museum/chron.htm; Mark Brader, "A Chronology of Digital Computing Machines (to 1952)," http://www.davros.org/misc/chronology.html; Karl Kempf, "Electronic Computers Within the Ordnance Corps," November 1961, http://ftp.arl.mil/~mike/comphist/61ordnance/index.html; Ken Polsson, "Chronology of Personal Computers," http://www.islandnet.com/~kpolsson/comphist; "The History of Computing at Los Alamos," http://bang.lanl.gov/video/sunedu/computer/comphist.html (requires password); the Machine Room, http://www.machine-room.org; Mind Machine Web Museum, http://www.userwww.sfsu.edu/~hl/mmm.html; Hans Moravec, computer data, http://www.frc.ri.cmu.edu/~hpm/book97/ch3/processor.list; "PC Magazine Online: Fifteen Years of PC Magazine," http://www.pcmag.com/article2/0,1759,23390,00.asp; Stan Augarten, Bit by Bit: An Illustrated History of Computers (New York: Ticknor and Fields, 1984); Institute of Electrical and Electronics Engineers (IEEE), Annals of the History of the Computer 9.2 (1987): 150-53 and 16.3 (1994): 20; Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, Mass.: Harvard University Press, 1988); Rene Moreau, The Computer Comes of Age (Cambridge, Mass.: MIT Press, 1984).
36. The plots in this chapter labeled "Logarithmic Plot" are technically semilogarithmic plots in that one axis (time) is on a linear scale and the other axis is on a logarithmic scale. However, I am calling these plots "logarithmic plots" for simplicity.
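For readers who want to reproduce this convention, here is a minimal sketch (not from the book; the data points are invented solely to illustrate the plotting style) of a semilogarithmic plot in Python with matplotlib: a linear time axis and a logarithmic value axis, so that exponential growth appears as a straight line.

```python
# A semilogarithmic ("logarithmic") plot: linear time axis, logarithmic value axis.
# The data points are hypothetical, used only to illustrate the convention.
import matplotlib.pyplot as plt

years = [1970, 1980, 1990, 2000]
mips_per_1000_dollars = [1e-4, 1e-2, 1e0, 1e2]   # hypothetical values

plt.plot(years, mips_per_1000_dollars, marker="o")
plt.yscale("log")                 # only the y-axis is logarithmic (semilog)
plt.xlabel("Year (linear scale)")
plt.ylabel("MIPS per $1,000 (log scale)")
plt.title("Exponential growth appears as a straight line")
plt.show()
```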
37. See the appendix, "The Law of Accelerating Returns Revisited," which provides a mathematical derivation of why there are two levels of exponential growth (that is, exponential growth over time in which the rate of the exponential growth, the exponent, is itself growing exponentially over time) in computational power as measured by MIPS per unit cost.
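As a purely illustrative sketch, not the appendix's derivation (the functional form and the constants below are my own assumptions), "two levels of exponential growth" can be pictured as an exponent that itself grows exponentially with time:

```python
# Illustrative only: growth in which the exponent itself grows exponentially.
# The constants a, b, c are arbitrary placeholders, not values from the appendix.
import math

a, b, c = 1.0, 0.05, 0.03

def single_exponential(t):
    return a * math.exp(b * t)

def double_exponential(t):
    # the effective growth rate b * exp(c * t) increases exponentially with t
    return a * math.exp(b * math.exp(c * t) * t)

for t in (0, 20, 40, 60):
    print(t, round(single_exponential(t), 2), round(double_exponential(t), 2))
```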
38. Hans Moravec, "When Will Computer Hardware Match the Human Brain?" Journal of Evolution and Technology 1 (1998), http://www.jetpress.org/volume1/moravec.pdf.
39. See note 35 above.
40. Achieving the first MIPS per $1,000 took from 1900 to 1990. We're now doubling the number of MIPS per $1,000 in about 400 days. Because current price-performance is about 2,000 MIPS per $1,000, we are adding price-performance at the rate of 5 MIPS per day, or 1 MIPS about every 5 hours.
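The arithmetic in this note can be checked directly from its own round figures; the sketch below introduces no new data.

```python
# Checking the note's figures against each other (no new data).
current_mips_per_1000 = 2000   # MIPS per $1,000 today, per the note
doubling_time_days = 400       # doubling time for MIPS per $1,000, per the note

# Doubling adds another ~2,000 MIPS per $1,000 over the next 400 days.
mips_added_per_day = current_mips_per_1000 / doubling_time_days
hours_per_mips = 24 / mips_added_per_day

print(mips_added_per_day)      # 5.0 MIPS per day
print(hours_per_mips)          # 4.8, i.e., roughly one MIPS every 5 hours
```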
41. "IBM Details Blue Gene Supercomputer," CNET News, May 8, 2003, http://news.com.com/2100-1008_3-1000421.html.
42. See Alfred North Whitehead, An Introduction to Mathematics (London: Williams and Norgate, 1911), which he wrote at the same time he and Bertrand Russell were working on their seminal three-volume Principia Mathematica.
43. While originally projected to take fifteen years, "the Human Genome Project was finished two and a half years ahead of time and, at $2.7 billion in FY 1991 dollars, significantly under original spending projections": http://www.ornl.gov/sci/techresources/Human_Genome/project/50yr/press4_2003.shtml.
44. Human Genome Project Information, http://www.ornl.gov/sci/techresources/Human_Genome/project/privatesector.shtml; Stanford Genome Technology Center, http://sequence-www.stanford.edu/group/techdev/auto.html; National Human Genome Research Institute, http://www.genome.gov; Tabitha Powledge, "How Many Genomes Are Enough?" Scientist, November 17, 2003, http://www.biomedcentral.com/news/20031117/07.
45. Data from National Center for Biotechnology Information, "GenBank Statistics," revised May 4, 2004, http://www.ncbi.nlm.nih.gov/Genbank/genbankstats.html.
46. Severe acute respiratory syndrome (SARS) was sequenced within thirty-one days of the virus being identified by the British Columbia Cancer Agency and the American Centers for Disease Control. The sequencing from the two centers differed by only ten base pairs out of twenty-nine thousand. This work identified SARS as a coronavirus. Dr. Julie Gerberding, director of the CDC, called the quick sequencing "a scientific achievement that I don't think has been paralleled in our history." See K. Philipkoski, "SARS Gene Sequence Unveiled," Wired News, April 15, 2003, http://www.wired.com/news/medtech/0,1286,58481,00.html?tw=wn_story_related.
In contrast, the efforts to sequence HIV began in the 1980s. HIV 1 and HIV 2 were completely sequenced in 2003 and 2002 respectively. National Center for Biotechnology Information, http://www.ncbi.nlm.nih.gov/genomes/framik.cgi?db=genome&gi=12171; HIV Sequence Database maintained by the Los Alamos National Laboratory, http://www.hiv.lanl.gov/content/hiv-db/HTML/outline.html.
47. Mark Brader, "A Chronology of Digital Computing Machines (to 1952)," http://www.davros.org/misc/chronology.html; Richard E. Matick, Computer Storage Systems and Technology (New York: John Wiley and Sons, 1977); University of Cambridge Computer Laboratory, EDSAC99, http://www.cl.cam.ac.uk/UoCCL/misc/EDSAC99/statistics.html; Mary Bellis, "Inventors of the Modern Computer: The History of the UNIVAC Computer-J. Presper Eckert and John Mauchly," http://inventors.about.com/library/weekly/aa062398.htm; "Initial Date of Operation of Computing Systems in the USA (1950-1958)," compiled from 1968 OECD data, http://members.iinet.net.au/~dgreen/timeline.html; Douglas Jones, "Frequently Asked Questions about the DEC PDP-8 computer," ftp://rtfm.mit.edu/pub/usenet/alt.sys.pdp8/PDP-8_Frequently_Asked_Questions_%28posted_every_other_month%29; Programmed Data Processor-1 Handbook, Digital Equipment Corporation (1960-1963), http://www.dbit.com/greeng3/pdp1/pdp1.html#INTRODUCTION; John Walker, "Typical UNIVAC 1108 Prices: 1968," http://www.fourmilab.ch/documents/univac/config1108.html; Jack Harper, "LISP 1.5 for the Univac 1100 Mainframe," http://www.frobenius.com/univac.htm; Wikipedia, "Data General Nova," http://www.answers.com/topic/data-general-nova; Darren Brewer, "Chronology of Personal Computers 1972-1974," http://uk.geocities.com/magoos_universe/comp1972.htm; www.pricewatch.com; http://www.jc-news.com/parse.cgi?news/pricewatch/raw/pw-010702; http://www.jc-news.com/parse.cgi?news/pricewatch/raw/pw-020624; http://www.pricewatch.com (11/17/04); http://sharkyextreme.com/guides/WMPG/article.php/10706_2227191_2; Byte advertisements, September 1975-March 1998; PC Computing advertisements, March 1977-April 2000.
48. Seagate, "Products," http://www.seagate.com/cda/products/discsales/index; Byte advertisements, 1977-1998; PC Computing advertisements, March 1999; Editors of Time-Life Books, Understanding Computers: Memory and Storage, rev. ed. (New York: Warner Books, 1990); "Historical Notes about the Cost of Hard Drive Storage Space," http://www.alts.net/ns1625/winchest.html; "IBM 305 RAMAC Computer with Disk Drive," http://www.cedmagic.com/history/ibm-305-ramac.html; John C. McCallum, "Disk Drive Prices (1955-2004)," http://www.jcmit.com/diskprice.htm.
49. James DeRose, The Wireless Data Handbook (St. Johnsbury, Vt.: Quantrum, 1996); First Mile Wireless, http://www.firstmilewireless.com/; J.B. Miles, "Wireless LANs," Government Computer News 18.28 (April 30, 1999), http://www.gcn.com/vol18_no28/guide/514-1.html; Wireless Week (April 14, 1997), http://www.wirelessweek.com/toc/4%2F14%2F1997; Office of Technology Assessment, "Wireless Technologies and the National Information Infrastructure," September 1995, http://infoventures.com/emf/federal/ota/ota95-tc.html; Signal Lake, "Broadband Wireless Network Economics Update," January 14, 2003, http://www.signallake.com/publications/broadbandupdate.pdf; BridgeWave Communications communication, http://www.bridgewave.com/050604.htm.
50. Internet Software Consortium (http://www.isc.org), ISC Domain Survey: Number of Internet Hosts, http://www.isc.org/ds/host-count-history.html.
51. Ibid.
52. Average traffic on Internet backbones in the U.S. during December of each year is used to estimate traffic for the year. A. M. Odlyzko, "Internet Traffic Growth: Sources and Implications," Optical Transmission Systems and Equipment for WDM Networking II, B. B. Dingel, W. Weiershausen, A. K. Dutta, and K.-I. Sato, eds., Proc. SPIE (The International Society for Optical Engineering) 5247 (2003): 1-15, http://www.dtc.umn.edu/~odlyzko/doc/oft.internet.growth.pdf; data for 2003-2004 values: e-mail correspondence with A. M. Odlyzko.
53. Dave Kristula, "The History of the Internet" (March 1997, update August 2001), http://www.davesite.com/webstation/net-history.shtml; Robert Zakon, "Hobbes' Internet Timeline v8.0," http://www.zakon.org/robert/internet/timeline; Converge Network Digest, December 5, 2002, http://www.convergedigest.com/Daily/daily.asp?vn=v9n229&fecha=December%2005,%202002; V. Cerf, "Cerf's Up," 2004, http://global.mci.com/de/resources/cerfs_up/.
54. H. C. Nathanson et al., "The Resonant Gate Transistor," IEEE Transactions on Electron Devices 14.3 (March 1967): 117-33; Larry J. Hornbeck, "128 x 128 Deformable Mirror Device," IEEE Transactions on Electron Devices 30.5 (April 1983): 539-43; J. Storrs Hall, "Nanocomputers and Reversible Logic," Nanotechnology 5 (July 1994): 157-67; V. V. Aristov et al., "A New Approach to Fabrication of Nanostructures," Nanotechnology 6 (April 1995): 35-39; C. Montemagno et al., "Constructing Biological Motor Powered Nanomechanical Devices," Nanotechnology 10 (1999): 225-31, http://www.foresight.org/Conferences/MNT6/Papers/Montemagno/; Celeste Biever, "Tiny 'Elevator' Most Complex Nanomachine Yet," NewScientist.com News Service, March 18, 2004, http://www.newscientist.com/article.ns?id=dn4794.
55. ETC Group, "From Genomes to Atoms: The Big Down," p. 39, http://www.etcgroup.org/documents/TheBigDown.pdf.
56. Ibid., p. 41.
57. Although it is not possible to determine precisely the information content in the genome, because of the repeated base pairs it is clearly much less than the total uncompressed data. Here are two approaches to estimating the compressed information content of the genome, both of which demonstrate that a range of thirty to one hundred million bytes is conservatively high.
1. In terms of the uncompressed data, there are three billion DNA rungs in the human genetic code, each coding two bits (since there are four possibilities for each DNA base pair). Thus, the human genome is about 800 million bytes uncompressed. The noncoding DNA used to be called "junk DNA," but it is now clear that it plays an important role in gene expression. However, it is very inefficiently coded. For one thing, there are massive redundancies (for example, the sequence called "ALU" is repeated hundreds of thousands of times), which compression algorithms can take advantage of.
With the recent explosion of genetic data banks, there is a great deal of interest in compressing genetic data. Recent work on applying standard data compression algorithms to genetic data indicates that reducing the data by 90 percent (for bit-perfect compression) is feasible: Hisahiko Sato et al., "DNA Data Compression in the Post Genome Era," Genome Informatics 12 (2001): 512-14, http://www.jsbi.org/journal/GIW01/GIW01P130.pdf.
Thus we can compress the genome to about 80 million bytes without loss of information (meaning we can perfectly reconstruct the full 800-million-byte uncompressed genome).
Now consider that more than 98 percent of the genome does not code for proteins. Even after standard data compression (which eliminates redundancies and uses a dictionary lookup for common sequences), the algorithmic content of the noncoding regions appears to be rather low, meaning that it is likely that we could code an algorithm that would perform the same function with fewer bits. However, since we are still early in the process of reverse engineering the genome, we cannot make a reliable estimate of this further decrease based on a functionally equivalent algorithm. I am using, therefore, a range of 30 to 100 million bytes of compressed information in the genome. The top part of this range assumes only data compression and no algorithmic simplification.
Only a portion (although the majority) of this information characterizes the design of the brain.
2. Another line of reasoning is as follows. Though the human genome contains around 3 billion bases, only a small percentage, as mentioned above, codes for proteins. By current estimates, there are 26,000 genes that code for proteins. If we assume those genes average 3,000 bases of useful data, those equal only approximately 78 million bases. A base of DNA requires only two bits, so this translates to about 20 million bytes (78 million bases divided by four). In the protein-coding sequence of a gene, each "word" (codon) of three DNA bases translates into one amino acid. There are, therefore, 4³ (64) possible codon codes, each consisting of three DNA bases. There are, however, only 20 amino acids used plus a stop codon (null amino acid) out of the 64. The rest of the 4³ codes are used as synonyms of the 21 useful ones. Whereas 6 bits are required to code for 64 possible combinations, only about 4.4 (log₂ 21) bits are required to code for 21 possibilities, a savings of 1.6 out of 6 bits (about 27 percent), bringing us down to about 15 million bytes. In addition, some standard compression based on repeating sequences is feasible here, although much less compression is possible on this protein-coding portion of the DNA than in the so-called junk DNA, which has massive redundancies. This will bring the figure probably below 12 million bytes. However, now we have to add information for the noncoding portion of the DNA that controls gene expression. Although this portion of the DNA comprises the bulk of the genome, it appears to have a low level of information content and is replete with massive redundancies. Estimating that it matches the approximately 12 million bytes of protein-coding DNA, we again come to approximately 24 million bytes. From this perspective, an estimate of 30 to 100 million bytes is conservatively high. (The arithmetic behind both estimates is spelled out in the short sketch following this note.)
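The following sketch simply restates the arithmetic of the two estimates above, using the note's own round figures; it introduces no new data.

```python
# Restating the arithmetic of the two estimates above (all inputs are
# the note's own round figures).
import math

# Estimate 1: uncompressed size, then straightforward lossless compression.
base_pairs = 3e9                         # ~3 billion DNA rungs
bits_uncompressed = base_pairs * 2       # 2 bits per base pair
bytes_uncompressed = bits_uncompressed / 8
print(bytes_uncompressed / 1e6)          # ~750 million bytes ("about 800 million")
print(bytes_uncompressed * 0.10 / 1e6)   # ~75 million bytes after ~90 percent compression

# Estimate 2: protein-coding content only.
genes = 26_000
bases_per_gene = 3_000
coding_bases = genes * bases_per_gene    # ~78 million bases
coding_bytes = coding_bases / 4          # 2 bits per base = 4 bases per byte, ~20 MB
bits_per_codon_symbol = math.log2(21)    # ~4.4 bits for 20 amino acids plus a stop codon
savings = 1 - bits_per_codon_symbol / 6  # ~27 percent saved relative to 6 bits per codon
print(coding_bytes / 1e6, round(savings, 2))
print(coding_bytes * (1 - savings) / 1e6)  # ~15 million bytes before further compression
```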
58. Continuous values can be represented by floating-point numbers to any desired degree of accuracy. A floating-point number consists of two sequences of bits. One "exponent" sequence represents a power of 2. The "base" sequence represents a fraction of 1. By increasing the number of bits in the base, any desired degree of accuracy can be achieved.
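A minimal illustration of this idea (a sketch, not from the book): Python's math.frexp splits a value into a fraction in [0.5, 1) and a power of 2, and quantizing that fraction to more and more bits shows the accuracy improving. The bit widths chosen are arbitrary.

```python
# value == fraction * 2**exponent; the error shrinks as the fraction
# is kept to more bits. The bit widths are arbitrary illustration values.
import math

value = math.pi
fraction, exponent = math.frexp(value)        # 0.5 <= fraction < 1
print(fraction, exponent)                     # ~0.7853981633974483, 2

for bits in (8, 16, 32, 52):
    quantized = round(fraction * 2 ** bits) / 2 ** bits
    approx = math.ldexp(quantized, exponent)  # quantized * 2**exponent
    print(bits, approx, abs(approx - value))
```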
59. Stephen Wolfram, A New Kind of Science (Champaign, Ill.: Wolfram Media, 2002).
60. Early work on a digital theory of physics was also presented by Frederick W. Kantor, Information Mechanics (New York: John Wiley and Sons, 1977). Links to several of Kantor's papers can be found at http://w3.execnet.com/kantor/pm00.htm (1997); http://w3.execnet.com/kantor/1b2p.htm (1989); and http://w3.execnet.com/kantor/ipoim.htm (1982). Also see http://www.kx.com/listbox/k/msg05621.html.
61. Konrad Zuse, "Rechnender Raum," Elektronische Datenverarbeitung, 1967, vol. 8, pp. 336-44. Konrad Zuse's book on a cellular automaton-based universe was published two years later: Rechnender Raum, Schriften zur Datenverarbeitung (Braunschweig, Germany: Friedrich Vieweg & Sohn, 1969). English translation: Calculating Space, MIT Technical Translation AZT-70-164-GEMIT, February 1970, MIT Project MAC, Cambridge, MA 02139.
62. Edward Fredkin quoted in Robert Wright, "Did the Universe Just Happen?" Atlantic Monthly, April 1988, 29-44, http://digitalphysics.org/Publications/Wri88a/html.
63. Ibid.
64. Many of Fredkin's results come from studying his own model of computation, which explicitly reflects a number of fundamental principles of physics. See the classic article Edward Fredkin and Tommaso Toffoli, "Conservative Logic," International Journal of Theoretical Physics 21.3-4 (1982): 219-53, http://www.digitalphilosophy.org/download_documents/ConservativeLogic.pdf. Also, a set of concerns about the physics of computation analytically similar to those of Fredkin's may be found in Norman Margolus, "Physics and Computation," Ph.D. thesis, MIT/LCS/TR-415, MIT Laboratory for Computer Science, 1988.
65. I discussed Norbert Wiener and Ed Fredkin's view of information as the fundamental building block for physics and other levels of reality in my 1990 book, The Age of Intelligent Machines.
The complexity of casting all of physics in terms of computational transformations proved to be an immensely challenging project, but Fredkin has continued his efforts. Wolfram has devoted a considerable portion of his work over the past decade to this notion, apparently with only limited communication with some of the others in the physics community who are also pursuing the idea. Wolfram's stated goal "is not to present a specific ultimate model for physics," but in his "Note for Physicists" (which essentially equates to a grand challenge), Wolfram describes the "features that [he] believe[s] such a model will have" (A New Kind of Science, pp. 1043-65, http://www.wolframscience.com/nksonline/page-1043c-text).
In The Age of Intelligent Machines, I discuss "the question of whether the ultimate nature of reality is analog or digital" and point out that "as we delve deeper and deeper into both natural and artificial processes, we find the nature of the process often alternates between analog and digital representations of information." As an illustration, I discussed sound. In our brains, music is represented as the digital firing of neurons in the cochlea, representing different frequency bands. In the air and in the wires leading to loudspeakers, it is an analog phenomenon. The representation of sound on a compact disc is digital, which is interpreted by digital circuits. But the digital circuits consist of thresholded transistors, which are analog amplifiers. As amplifiers, the transistors manipulate individual electrons, which can be counted and are, therefore, digital, but at a deeper level electrons are subject to analog quantum-field equations. At a yet deeper level, Fredkin and now Wolfram are theorizing a digital (computational) basis to these continuous equations.
It should be further noted that if someone actually does succeed in establishing such a digital theory of physics, we would then be tempted to examine what sorts of deeper mechanisms are actually implementing the computations and links of the cellular automata. Perhaps underlying the cellular automata that run the universe are yet more basic analog phenomena, which, like transistors, are subject to thresholds that enable them to perform digital transactions. Thus, establishing a digital basis for physics will not settle the philosophical debate as to whether reality is ultimately digital or analog. Nonetheless, establishing a viable computational model of physics would be a major accomplishment.
So how likely is this? We can easily establish an existence proof that a digital model of physics is feasible, in that continuous equations can always be expressed to any desired level of accuracy in the form of discrete transformations on discrete changes in value. That is, after all, the basis for the fundamental theorem of calculus. However, expressing continuous formulas in this way is an inherent complication and would violate Einstein's dictum to express things "as simply as possible, but no simpler." So the real question is whether we can express the basic relationships that we are aware of in more elegant terms, using cellular-automata algorithms. One test of a new theory of physics is whether it is capable of making verifiable predictions.
In at least one important way, that might be a difficult challenge for a cellular automata-based theory, because lack of predictability is one of the fundamental features of cellular automata.
Wolfram starts by describing the universe as a large network of nodes. The nodes do not exist in "space," but rather space, as we perceive it, is an illusion created by the smooth transition of phenomena through the network of nodes. One can easily imagine building such a network to represent "naive" (Newtonian) physics by simply building a three-dimensional network to any desired degree of granularity. Phenomena such as "particles" and "waves" that appear to move through space would be represented by "cellular gliders," which are patterns that are advanced through the network for each cycle of computation. Fans of the game Life (which is based on cellular automata) will recognize the common phenomenon of gliders and the diversity of patterns that can move smoothly through a cellular-automaton network (a small coded example of a Life glider follows this note). The speed of light, then, is the result of the clock speed of the celestial computer, since gliders can advance only one cell per computational cycle.
Einstein's general relativity, which describes gravity as perturbations in space itself, as if our three-dimensional world were curved in some unseen fourth dimension, is also straightforward to represent in this scheme. We can imagine a four-dimensional network and can represent apparent curvatures in space in the same way that one represents normal curvatures in three-dimensional space. Alternatively, the network can become denser in certain regions to represent the equivalent of such curvature.
A cellular-automata conception proves useful in explaining the apparent increase in entropy (disorder) that is implied by the second law of thermodynamics. We have to assume that the cellular-automata rule underlying the universe is a class 4 rule (see main text); otherwise the universe would be a dull place indeed. Wolfram's primary observation that a class 4 cellular automaton quickly produces apparent randomness (despite its determinate process) is consistent with the tendency toward randomness that we see in Brownian motion and that is implied by the second law.
Special relativity is more difficult. There is an easy mapping from the Newtonian model to the cellular network. But the Newtonian model breaks down in special relativity. In the Newtonian world, if a train is going eighty miles per hour and you drive along it on a parallel road at sixty miles per hour, the train will appear to pull away from you at twenty miles per hour. But in the world of special relativity, if you leave Earth at three quarters of the speed of light, light will still appear to you to move away from you at the full speed of light. In accordance with this apparently paradoxical perspective, both the size and subjective passage of time for two observers will vary depending on their relative speed. Thus, our fixed mapping of space and nodes becomes considerably more complex. Essentially, each observer needs his or her own network.
However, in considering special relativity, we can essentially apply the same conversion to our "Newtonian" network as we do to Newtonian space. However, it is not clear that we are achieving greater simplicity in representing special relativity in this way.
A cellular-node representation of reality may have its greatest benefit in understanding some aspects of the phenomenon of quantum mechanics. It could provide an explanation for the apparent randomness that we find in quantum phenomena. Consider, for example, the sudden and apparently random creation of particle-antiparticle pairs. The randomness could be the same sort of randomness that we see in class 4 cellular automata. Although predetermined, the behavior of class 4 automata cannot be anticipated (other than by running the cellular automata) and is effectively random.
This is not a new view. It's equivalent to the "hidden variables" formulation of quantum mechanics, which states that there are some variables that we cannot otherwise access that control what appears to be random behavior that we can observe. The hidden-variables conception of quantum mechanics is not inconsistent with the formulas for quantum mechanics. It is possible but is not popular with quantum physicists because it requires a large number of assumptions to work out in a very particular way. However, I do not view this as a good argument against it. The existence of our universe is itself very unlikely and requires many assumptions to all work out in a very precise way. Yet here we are.
A bigger question is, How could a hidden-variables theory be tested? If based on cellular-automata-like processes, the hidden variables would be inherently unpredictable, even if deterministic. We would have to find some other way to "unhide" the hidden variables.
Wolfram's network conception of the universe provides a potential perspective on the phenomenon of quantum entanglement and the collapse of the wave function. The collapse of the wave function, which renders apparently ambiguous properties of a particle (for example, its location) retroactively determined, can be viewed from the cellular-network perspective as the interaction of the observed phenomenon with the observer itself. As observers, we are not outside the network but exist inside it. We know from cellular mechanics that two entities cannot interact without both being changed, which suggests a basis for wave-function collapse.
Wolfram writes, "If the universe is a network, then it can in a sense easily contain threads that continue to connect particles even when the particles get far apart in terms of ordinary space." This could provide an explanation for recent dramatic experiments showing nonlocality of action in which two "quantum entangled" particles appear to continue to act in concert with each other even though separated by large distances. Einstein called this "spooky action at a distance" and rejected it, although recent experiments appear to confirm it.
Some phenomena fit more neatly into this cellular automata-network conception than others. Some of the suggestions appear elegant, but as Wolfram's "Note for Physicists" makes clear, the task of translating all of physics into a consistent cellular-automata-based system is daunting indeed.
Extending his discussion to philosophy, Wolfram "explains" the apparent phenomenon of free will as decisions that are determined but unpredictable.
Since there is no way to predict the outcome of a cellular process without actually running the process, and since no simulator could possibly run faster than the universe itself, there is therefore no way to reliably predict human decisions. So even though our decisions are determined, there is no way to preidentify what they will be. However, this is not a fully satisfactory examination of the concept. This observation concerning the lack of predictability can be made for the outcome of most physical processes, such as where a piece of dust will fall on the ground. This view thereby equates human free will with the random descent of a piece of dust. Indeed, that appears to be Wolfram's view when he states that the process in the human brain is "computationally equivalent" to those taking place in processes such as fluid turbulence.
Some of the phenomena in nature (for example, clouds, coastlines) are characterized by repetitive simple processes such as cellular automata and fractals, but intelligent patterns (such as the human brain) require an evolutionary process (or alternatively, the reverse engineering of the results of such a process). Intelligence is the inspired product of evolution and is also, in my view, the most powerful "force" in the world, ultimately transcending the powers of mindless natural forces.
In summary, Wolfram's sweeping and ambitious treatise paints a compelling but ultimately overstated and incomplete picture. Wolfram joins a growing community of voices that maintain that patterns of information, rather than matter and energy, represent the more fundamental building blocks of reality. Wolfram has added to our knowledge of how patterns of information create the world we experience, and I look forward to a period of collaboration between Wolfram and his colleagues so that we can build a more robust vision of the ubiquitous role of algorithms in the world.
The lack of predictability of class 4 cellular automata underlies at least some of the apparent complexity of biological systems and does represent one of the important biological paradigms that we can seek to emulate in our technology. It does not explain all of biology. It remains at least possible, however, that such methods can explain all of physics. If Wolfram, or anyone else for that matter, succeeds in formulating physics in terms of cellular-automata operations and their patterns, Wolfram's book will have earned its title. In any event, I believe the book to be an important work of ontology.
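As an aside not drawn from the book, the "gliders" mentioned in this note are easy to see in a few lines of code. The sketch below implements Conway's Game of Life (a class 4 cellular automaton) and steps the standard five-cell glider, which reappears one cell farther along the diagonal every four generations.

```python
# Conway's Game of Life and the standard five-cell glider; the glider
# reappears one cell farther along the diagonal every four generations.
from collections import Counter

def life_step(cells):
    """One generation of Life; cells is a set of live (x, y) coordinates."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for x, y in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation with exactly 3 live neighbors,
    # or with 2 live neighbors if it is already alive.
    return {c for c, n in neighbor_counts.items() if n == 3 or (n == 2 and c in cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = set(glider)
for _ in range(8):
    cells = life_step(cells)
print(sorted(cells))   # the same glider shape, two cells farther along the diagonal
```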
66. Rule 110 states that a cell becomes white if its previous color was, and its two neighbors are, all black or all white, or if its previous color was white and the two neighbors are black and white, respectively; otherwise, the cell becomes black.
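A minimal sketch of this rule in code (black = 1, white = 0; the fixed white boundary and grid size are arbitrary choices for display) produces the familiar rule 110 pattern from a single black cell:

```python
# Rule 110 exactly as described in this note (black = 1, white = 0).
def rule110_step(cells):
    n = len(cells)
    new = [0] * n
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        center = cells[i]
        right = cells[i + 1] if i < n - 1 else 0
        # Becomes white if all three are black, all three are white,
        # or the center was white with a black left neighbor and white right neighbor.
        if (left, center, right) in ((1, 1, 1), (0, 0, 0), (1, 0, 0)):
            new[i] = 0
        else:
            new[i] = 1
    return new

# Seed a single black cell at the right edge and print a few generations.
row = [0] * 31 + [1]
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = rule110_step(row)
```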
67. Wolfram, New Kind of Science, p. 4, http://www.wolframscience.com/nksonline/page-4-text.
68. Note that certain interpretations of quantum mechanics imply that the world is not based on deterministic rules and that there is an inherent quantum randomness to every interaction at the (small) quantum scale of physical reality.
69. As discussed in note 57 above, the uncompressed genome has about six billion bits of information (order of magnitude = 10¹⁰ bits), and the compressed genome is about 30 to 100 million bytes. Some of this design information applies, of course, to other organs. Even assuming all of 100 million bytes applies to the brain, we get a conservatively high figure of 10⁹ bits for the design of the brain in the genome. In chapter 3, I discuss an estimate for "human memory on the level of individual interneuronal connections," including "the connection patterns and neurotransmitter concentrations," of 10¹⁸ (billion billion) bits in a mature brain. This is about a billion (10⁹) times more information than that in the genome which describes the brain's design. This increase comes about from the self-organization of the brain as it interacts with the person's environment.
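A quick order-of-magnitude check of the ratios quoted in this note, using only the note's own figures:

```python
# Order-of-magnitude check using the note's own figures.
uncompressed_genome_bits = 6e9                     # ~six billion bits, order 10^10
compressed_genome_bytes = 100e6                    # upper bound: 100 million bytes
genome_design_bits = compressed_genome_bytes * 8   # ~8 x 10^8, i.e., order 10^9 bits
brain_memory_bits = 1e18                           # chapter 3 estimate

print(genome_design_bits)                          # 8e8
print(brain_memory_bits / genome_design_bits)      # ~1.25e9: roughly a billionfold increase
```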
70. See the sections "Disdisorder" and "The Law of Increasing Entropy Versus the Growth of Order" in my book The Age of Spiritual Machines: When Computers Exceed Human Intelligence (New York: Viking, 1999), pp. 30-33.
71. A universal computer can accept as input the definition of any other computer and then simulate that other computer. This does not address the speed of simulation, which might be relatively slow.
72. C. Geoffrey Woods, "Crossing the Midline," Science 304.5676 (June 4, 2004): 1455-56; Stephen Matthews, "Early Programming of the Hypothalamo-Pituitary-Adrenal Axis," Trends in Endocrinology and Metabolism 13.9 (November 1, 2002): 373-80; Justin Crowley and Lawrence Katz, "Early Development of Ocular Dominance Columns," Science 290.5495 (November 17, 2000): 1321-24; Anna Penn et al., "Competition in Retinogeniculate Patterning Driven by Spontaneous Activity," Science 279.5359 (March 27, 1998): 2108-12.
73. The seven commands of a Turing machine are: (1) Read Tape, (2) Move Tape Left, (3) Move Tape Right, (4) Write 0 on the Tape, (5) Write 1 on the Tape, (6) Jump to Another Command, and (7) Halt.
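A toy interpreter for these seven commands is sketched below. The note does not spell out how Read interacts with Jump, so the semantics chosen here (the jump is taken only when the last value read was a 1) and the sample program are my own assumptions for illustration.

```python
# A toy interpreter for the seven commands; the conditional-jump semantics
# and the sample program are illustrative assumptions, not details from the note.
from collections import defaultdict

def run(program, tape=None, max_steps=1000):
    tape = defaultdict(int, tape or {})   # unbounded tape of 0s
    head, pc, last_read = 0, 0, 0
    for _ in range(max_steps):
        op, arg = program[pc]
        if op == "READ":
            last_read = tape[head]        # (1) read tape
        elif op == "LEFT":
            head -= 1                     # (2) move tape left
        elif op == "RIGHT":
            head += 1                     # (3) move tape right
        elif op == "WRITE0":
            tape[head] = 0                # (4) write 0 on the tape
        elif op == "WRITE1":
            tape[head] = 1                # (5) write 1 on the tape
        elif op == "JUMP" and last_read == 1:
            pc = arg                      # (6) jump to another command
            continue
        elif op == "HALT":
            break                         # (7) halt
        pc += 1
    return tape

# Sample program: scan right past a block of 1s, then append one more 1.
program = [
    ("READ", None),    # 0: read the current cell
    ("JUMP", 4),       # 1: if it held a 1, keep scanning
    ("WRITE1", None),  # 2: otherwise write a 1 here
    ("HALT", None),    # 3: and stop
    ("RIGHT", None),   # 4: move right one cell
    ("JUMP", 0),       # 5: loop (last read was a 1, so this jump is taken)
]

print(sorted(run(program, tape={0: 1, 1: 1}).items()))  # [(0, 1), (1, 1), (2, 1)]
```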
74. In what is perhaps the most impressive analysis in his book, Wolfram shows how a Turing machine with only two states and five possible colors can be a universal Turing machine. For forty years, we've thought that a universal Turing machine had to be more complex than this. Also impressive is Wolfram's demonstration that rule 110 is capable of universal computation, given the right software. Of course, universal computation by itself cannot perform useful tasks without appropriate software.
75. The "nor" gate transforms two inputs into one output. The output of "nor" is true if and only if neither A nor B is true.
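As a short illustration (the constructions below are standard logic identities, not taken from the book), NOT, OR, and AND can each be built from the "nor" gate alone:

```python
# The "nor" gate, plus the standard constructions of NOT, OR, and AND from it.
def nor(a, b):
    return not (a or b)       # true only when neither input is true

def not_(a):
    return nor(a, a)

def or_(a, b):
    return not_(nor(a, b))

def and_(a, b):
    return nor(not_(a), not_(b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "nor:", nor(a, b), "or:", or_(a, b), "and:", and_(a, b))
```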
76. See the section "A nor B: The Basis of Intelligence?" in The Age of Intelligent Machines (Cambridge, Mass.: MIT Press, 1990), pp. 152-57, http://www.KurzweilAI.net/meme/frame.html?m=12.
77. United Nations Economic and Social Commission for Asia and the Pacific, "Regional Road Map Towards an Information Society in Asia and the Pacific," ST/ESCAP/2283, http://www.unescap.org/publications/detail.asp?id=771; Economic and Social Commission for Western Asia, "Regional Profile of the Information Society in Western Asia," October 8, 2003, http://www.escwa.org.lb/information/publications/ictd/docs/ictd-03-11-e.pdf; John Enger, "Asia in the Global Information Economy: The Rise of Region-States, The Role of Telecommunications," presentation at the International Conference on Satellite and Cable Television in Chinese and Asian Regions, Communication Arts Research Institute, Fu Jen Catholic University, June 4-6, 1996.
78. See "The 3 by 5 Initiative," Fact Sheet 274, December 2003, http://www.who.int/mediacentre/factsheets/2003/fs274/en/print.html.
79. Technology investments accounted for 76 percent of 1998 venture-capital investments ($10.1 billion) (PricewaterhouseCoopers news release, "Venture Capital Investments Rise 24 Percent and Set Record at $14.7 Billion, PricewaterhouseCoopers Finds," February 16, 1999). In 1999, technology-based companies cornered 90 percent of venture-capital investments ($32 billion) (PricewaterhouseCoopers news release, "Venture Funding Explosion Continues: Annual and Quarterly Investment Records Smashed, According to PricewaterhouseCoopers Money Tree National Survey," February 14, 2000). Venture-capital levels certainly dropped during the high-tech recession; but in just the second quarter of 2003, software companies alone attracted close to $1 billion (PricewaterhouseCoopers news release, "Venture Capital Investments Stabilize in Q2 2003," July 29, 2003). In 1974, forty-two firms in all U.S. manufacturing industries received a total of $26.4 million in venture-capital disbursements (in 1974 dollars, or $81 million in 1992 dollars). Samuel Kortum and Josh Lerner, "Assessing the Contribution of Venture Capital to Innovation," RAND Journal of Economics 31.4 (Winter 2000): 674-92, http://econ.bu.edu/kortum/rje_Winter'00_Kortum.pdf. As Paul Gompers and Josh Lerner say, "Inflows to venture capital funds have expanded from virtually zero in the mid-1970s....": Gompers and Lerner, The Venture Capital Cycle (Cambridge, Mass.: MIT Press, 1999). See also Paul Gompers, "Venture Capital," in B. Espen Eckbo, ed., Handbook of Corporate Finance: Empirical Corporate Finance, in the Handbooks in Finance series (Holland: Elsevier, forthcoming), chapter 11, 2005, http://mba.tuck.dartmouth.edu/pages/faculty/espen.eckbo/PDFs/Handbookpdf/CH11-VentureCapital.pdf.
80. An account of how "new economy" technologies are making important transformations to "old economy" industries: Jonathan Rauch, "The New Old Economy: Oil, Computers, and the Reinvention of the Earth," Atlantic Monthly, January 3, 2001.
81. U.S. Department of Commerce, Bureau of Economic Analysis (http://www.bea.doc.gov); use the following site and select Table 1.1.6: http://www.bea.doc.gov/bea/dn/nipaweb/SelectTable.asp?Selected=N.
82. U.S. Department of Commerce, Bureau of Economic Analysis, http://www.bea.doc.gov. Data for 1920-1999: Population Estimates Program, Population Division, U.S. Census Bureau, "Historical National Population Estimates: July 1, 1900 to July 1, 1999," http://www.census.gov/popest/archives/1990s/popclockest.txt; data for 2000-2004: http://www.census.gov/popest/states/tables/NST-EST2004-01.pdf.
83. "The Global Economy: From Recovery to Expansion," results from Global Economic Prospects 2005: Trade, Regionalism and Prosperity (World Bank, 2004), http://globaloutlook.worldbank.org/globaloutlook/outside/globalgrowth.aspx; "World Bank: 2004 Economic Growth Lifts Millions from Poverty," Voice of America News, http://www.voanews.com/english/2004-11-17-voa41.cfm.
84. Mark Bils and Peter Klenow, "The Acceleration in Variety Growth," American Economic Review 91.2 (May 2001): 274–80, http://www.klenow.com/Acceleration.pdf.
85. See notes 84, 86, and 87.
86. U.S. Department of Labor, Bureau of Labor Statistics, news report, June 3, 2004. You can generate productivity reports at http://www.bls.gov/bls/productivity.htm.
87. Bureau of Labor Statistics, Major Sector Multifactor Productivity Index, Manufacturing Sector: Output per Hour All Persons (1996 = 100), http://data.bls.gov/PDQ/outside.jsp?survey=mp (requires JavaScript: select "Manufacturing," "Output Per Hour All Persons," and starting year 1949), or http://data.bls.gov/cgi-bin/srgate (use series "MPU300001," "AllYears," and Format 2).
88. George M. Scalise, Semiconductor Industry Association, in "Luncheon Address: The Industry Perspective on Semiconductors," Productivity and Cyclicality in Semiconductors: Trends, Implications, and Questions-Report of a Symposium (2004) (National Academies Press, 2004), p. 40, http://www.nap.edu/openbook/0309092744/html/index.html.
89. Data from Kurzweil Applied Intelligence, now part of ScanSoft (formerly Kurzweil Computer Products).
90. eMarketer, "E-Business in 2003: How the Internet Is Transforming Companies, Industries, and the Economy-a Review in Numbers," February 2003; "US B2C E-Commerce to Top $90 Billion in 2003," April 30, 2003, http://www.emarketer.com/Article.aspx?1002207; and "Worldwide B2B E-Commerce to Surpass $1 Trillion By Year's End," March 19, 2003, http://www.emarketer.com/Article.aspx?1002125.
91. The patents used in this chart are, as described by the U.S. Patent and Trademark Office, "patents for inventions," also known as "utility" patents. U.S. Patent and Trademark Office, Table of Annual U.S. Patent Activity, http://www.uspto.gov/web/offices/ac/ido/oeip/taf/h_counts.htm.
92. The doubling time for IT's share of the economy is twenty-three years. U.S. Department of Commerce, Economics and Statistics Administration, "The Emerging Digital Economy," figure 2, http://www.technology.gov/digeconomy/emerging.htm.
93. The doubling time for U.S. education expenditures per capita is twenty-three years. National Center for Education Statistics, Digest of Education Statistics, 2002, http://nces.ed.gov/pubs2003/digest02/tables/dt030.asp.
94. The United Nations estimated that the total global equity market capitalization in 2000 was thirty-seven trillion dollars. United Nations, "Global Finance Profile," Report of the High-Level Panel on Financing for Development, June 2001, http://www.un.org/reports/financing/profile.htm.
If our perception of future growth rates were to increase by as little as 2 percent per year, compounded, relative to current expectations, and if we discount those future values back to the present at an annual rate of 6 percent, then the increased present value resulting from just twenty years of this compounded and discounted additional growth should roughly triple present values. As the subsequent dialogue points out, this analysis does not take into consideration the likely increase in the discount rate that would result from such a perception of increased future growth.
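To illustrate the mechanism in this note (though not the exact multiple), here is a minimal discounted-cash-flow sketch in Python. It compares the present value of a twenty-year stream of cash flows under a baseline growth expectation and under an expectation raised by two percentage points, discounting at 6 percent. The 3 percent baseline growth rate and the simple finite-horizon model are assumptions of mine, not figures from the note, which does not spell out the valuation model behind its tripling estimate; the sketch only shows how higher perceived growth raises present value.

    def present_value(first_cash_flow=1.0, growth=0.03, discount=0.06, years=20):
        """Present value of a finite stream of annually growing cash flows.

        A simple discounted-cash-flow sketch; the 3% baseline growth rate is an
        assumed placeholder, not a figure from the original note.
        """
        return sum(first_cash_flow * (1 + growth) ** t / (1 + discount) ** t
                   for t in range(1, years + 1))

    if __name__ == "__main__":
        baseline = present_value(growth=0.03)
        optimistic = present_value(growth=0.05)  # perception raised by 2 percentage points
        print(f"baseline PV:   {baseline:.2f}")
        print(f"optimistic PV: {optimistic:.2f}")
        print(f"ratio:         {optimistic / baseline:.2f}")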
Chapter Three: Achieving the Computational Capacity of the Human Brain
1. Gordon E. Moore, "Cramming More Components onto Integrated Circuits," Electronics 38.8 (April 19, 1965): 114–17, ftp://download.intel.com/research/silicon/moorespaper.pdf.
2. Moore's initial projection in this 1965 paper was that the number of components would double every year. In 1975 he revised this to every two years. However, price-performance more than doubles every two years, because smaller components also run faster (the electronics have less distance to travel). So overall price-performance (the cost of each transistor cycle) has been halving roughly every thirteen months.
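To make the arithmetic in this note concrete, the short Python sketch below (not from the original text) combines an assumed transistor-count doubling time with an assumed speed gain per generation to get an overall doubling time for transistor cycles per dollar. The speed-gain figure of roughly 1.8x per two-year density doubling is an illustrative assumption chosen so that the combined price-performance doubling time comes out near thirteen months, as the note states.

    import math

    def combined_doubling_months(density_doubling_months=24.0,
                                 speed_gain_per_density_doubling=1.8):
        """Doubling time (in months) of transistor cycles per dollar.

        Assumptions (illustrative, not from the original note):
          - transistor count per dollar doubles every `density_doubling_months`
          - clock speed improves by `speed_gain_per_density_doubling` over that
            same period, because smaller transistors switch faster
        """
        # Growth factor in (transistors x cycles) per dollar over one density doubling
        growth_per_period = 2.0 * speed_gain_per_density_doubling
        # Time for that combined quantity to double
        return density_doubling_months * math.log(2) / math.log(growth_per_period)

    if __name__ == "__main__":
        print(round(combined_doubling_months(), 1))  # about 13 months under these assumptions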
3. Paolo Gargini, quoted in Ann Steffora Mutschler, "Moore's Law Here to Stay," ElectronicsWeekly.com, July 14, 2004, http://www.electronicsweekly.co.uk/articles/article.asp?liArticleID=36829. See also Tom Krazit, "Intel Prepares for Next 20 Years of Chip Making," Computerworld, October 25, 2004, http://www.computerworld.com/hardwaretopics/hardware/story/0,10801,96917,00.html.
4. Michael Kanellos, "'High-rise' Chips Sneak on Market," CNET News.com, July 13, 2004, http://zdnet.com.com/2100-1103-5267738.html.
5. Benjamin Fulford, "Chipmakers Are Running Out of Room: The Answer Might Lie in 3-D," Forbes.com, July 22, 2002, http://www.forbes.com/forbes/2002/0722/173_print.html.
6. NTT news release, "Three-Dimensional Nanofabrication Using Electron Beam Lithography," February 2, 2004, http://www.ntt.co.jp/news/news04e/0402/040202.html.
7. Laszlo Forro and Christian Schonenberger, "Carbon Nanotubes, Materials for the Future,"