Complexity - A Guided Tour Part 2

But at a value between R = 3.4 and R = 3.5, an abrupt change occurs. Given any value of x0, the system will eventually reach an oscillation among four distinct values instead of two. For example, if we set R = 3.49 and x0 = 0.2, we see the results in figure 2.9.

Indeed, the values of x fairly quickly reach an oscillation among four different values (which happen to be approximately 0.872, 0.389, 0.829, and 0.494, if you're interested). That is, at some R between 3.4 and 3.5, the period of the final oscillation has abruptly doubled from 2 to 4.

Somewhere between R = 3.54 and R = 3.55 the period abruptly doubles again, jumping to 8. Somewhere between 3.564 and 3.565 the period jumps to 16. Somewhere between 3.5687 and 3.5688 the period jumps to 32. The period doubles again and again after smaller and smaller increases in R until, in short order, the period becomes effectively infinite, at an R value of approximately 3.569946. Before this point, the behavior of the logistic map was roughly predictable. If you gave me the value for R, I could tell you the ultimate long-term behavior from any starting point x0: fixed points are reached when R is less than about 3.1, period-two oscillations are reached when R is between 3.1 and 3.4, and so on.
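To make this period doubling concrete, here is a minimal Python sketch (mine, not from the book) that iterates the logistic map introduced earlier in the chapter, xt+1 = R xt(1 - xt), discards a long transient, and reports the cycle of values the trajectory settles into. The particular R values, starting point, and tolerance are choices made for this sketch only; the R values are picked to land in successive period-doubling windows.

```python
# Minimal sketch (not from the book): iterate the logistic map
# x_{t+1} = R * x_t * (1 - x_t), discard the transient, and report the
# cycle of values the trajectory settles into.

def final_cycle(R, x0=0.2, transient=100_000, max_period=64, tol=1e-6):
    """Return the list of values the trajectory cycles among after transients die out."""
    x = x0
    for _ in range(transient):          # let the trajectory settle onto its attractor
        x = R * x * (1 - x)
    cycle = [x]
    for _ in range(max_period):
        x = R * x * (1 - x)
        if abs(x - cycle[0]) < tol:     # returned to the first value: one full period
            return cycle
        cycle.append(x)
    return cycle                        # no short cycle found (chaotic or period > max_period)

# R values chosen for this sketch to land in successive period-doubling windows.
for R in (2.9, 3.2, 3.49, 3.55, 3.566):
    cycle = final_cycle(R)
    print(f"R = {R}: period {len(cycle)} ->", [round(v, 3) for v in cycle])
```

Running it should show period 1 at R = 2.9, period 2 at R = 3.2, period 4 at R = 3.49 (with values close to the four listed above), and periods 8 and 16 in the narrower windows beyond.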

When R is approximately 3.569946, the values of x no longer settle into an oscillation; rather, they become chaotic. Here's what this means. Let's call the series of values x0, x1, x2, and so on the trajectory of x. At values of R that yield chaos, two trajectories starting from very similar values of x0, rather than converging to the same fixed point or oscillation, will instead progressively diverge from each other. At R = 3.569946 this divergence occurs very slowly, but we can see a more dramatic sensitive dependence on x0 if we set R = 4.0. First I set x0 = 0.2 and iterated the logistic map to obtain a trajectory. Then I restarted with a new x0, increased slightly by putting a 1 in the tenth decimal place, x0 = 0.2000000001, and iterated the map again to obtain a second trajectory. In figure 2.10 the first trajectory is the dark curve with black circles, and the second trajectory is the light line with open circles.

FIGURE 2.10. Two trajectories of the logistic map for R = 4.0: x0 = 0.2 and x0 = 0.2000000001.



The two trajectories start off very close to one another (so close that the first, solid-line trajectory blocks our view of the second, dashed-line trajectory), but after 30 or so iterations they start to diverge significantly, and soon after there is no correlation between them. This is what is meant by "sensitive dependence on initial conditions."
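The divergence in figure 2.10 is easy to reproduce. The sketch below (not from the book; the printing interval is my own choice) iterates the two trajectories described above, x0 = 0.2 and x0 = 0.2000000001, at R = 4.0 and prints their separation. By around t = 30 the separation is of order one and the two columns no longer resemble each other.

```python
# Minimal sketch (not from the book) of the experiment behind figure 2.10:
# two logistic-map trajectories at R = 4.0 whose starting points differ
# only in the tenth decimal place.

R = 4.0
x, y = 0.2, 0.2000000001      # the two nearby initial conditions from the text

for t in range(1, 61):
    x = R * x * (1 - x)
    y = R * y * (1 - y)
    if t % 5 == 0:            # print every fifth step to keep the output short
        print(f"t = {t:2d}   x = {x:.6f}   y = {y:.6f}   |x - y| = {abs(x - y):.2e}")
```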

So far we have seen three different classes of final behavior (attractors): fixed-point, periodic, and chaotic. (Chaotic attractors are also sometimes called "strange attractors.") Type of attractor is one way in which dynamical systems theory characterizes the behavior of a system.

Let's pause a minute to consider how remarkable the chaotic behavior really is. The logistic map is an extremely simple equation and is completely deterministic: every xt maps onto one and only one value of xt+1. And yet the chaotic trajectories obtained from this map, at certain values of R, look very random-enough so that the logistic map has been used as a basis for generating pseudo-random numbers on a computer. Thus apparent randomness can arise from very simple deterministic systems.
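As an illustration of that last point, one simple way to turn a chaotic trajectory into pseudo-random-looking bits is to threshold each value at 0.5. This toy sketch is mine, not a description of any actual generator, and it is certainly not suitable for cryptography.

```python
# Toy sketch only (not the book's, and not any real generator's scheme):
# thresholding a chaotic logistic-map trajectory at 0.5 yields a stream of
# pseudo-random-looking bits.

def logistic_bits(n, x0=0.2, R=4.0):
    x, bits = x0, []
    for _ in range(n):
        x = R * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

print("".join(str(b) for b in logistic_bits(64)))
```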

Moreover, for the values of R that produce chaos, if there is any uncertainty in the initial condition x0, there exists a time beyond which the future value cannot be predicted. This was demonstrated above with R = 4. If we don't know the value of the tenth and higher decimal places of x0-a quite likely limitation for many experimental observations-then by t = 30 or so the value of xt is unpredictable. For any value of R that yields chaos, uncertainty in any decimal place of x0, however far out in the decimal expansion, will result in unpredictability at some value of t.

Robert May, the mathematical biologist, summed up these rather surprising properties, echoing Poincaré: The fact that the simple and deterministic equation (1) [i.e., the logistic map] can possess dynamical trajectories which look like some sort of random noise has disturbing practical implications. It means, for example, that apparently erratic fluctuations in the census data for an animal population need not necessarily betoken either the vagaries of an unpredictable environment or sampling errors: they may simply derive from a rigidly deterministic population growth relationship such as equation (1).... Alternatively, it may be observed that in the chaotic regime arbitrarily close initial conditions can lead to trajectories which, after a sufficiently long time, diverge widely. This means that, even if we have a simple model in which all the parameters are determined exactly, long-term prediction is nevertheless impossible.

In short, the presence of chaos in a system implies that perfect prediction à la Laplace is impossible not only in practice but also in principle, since we can never know x0 to infinitely many decimal places. This is a profound negative result that, along with quantum mechanics, helped wipe out the optimistic nineteenth-century view of a clockwork Newtonian universe that ticked along its predictable path.

But is there a more positive lesson to be learned from studies of the logistic map? Can it help the goal of dynamical systems theory, which attempts to discover general principles concerning systems that change over time? In fact, deeper studies of the logistic map and related maps have resulted in an equally surprising and profound positive result-the discovery of universal characteristics of chaotic systems.

Universals in Chaos.

The term chaos, as used to describe dynamical systems with sensitive dependence on initial conditions, was first coined by physicists T. Y. Li and James Yorke. The term seems apt: the colloquial sense of the word "chaos" implies randomness and unpredictability, qualities we have seen in the chaotic version of the logistic map. However, unlike colloquial chaos, there turns out to be substantial order in mathematical chaos in the form of so-called universal features that are common to a wide range of chaotic systems.

THE FIRST UNIVERSAL FEATURE: THE PERIOD-DOUBLING ROUTE TO CHAOS.

In the mathematical explorations we performed above, we saw that as R was increased from 2.0 to 4.0, iterating the logistic map for a given value of R first yielded a fixed point, then a period-two oscillation, then period four, then eight, and so on, until chaos was reached. In dynamical systems theory, each of these abrupt period doublings is called a bifurcation. This succession of bifurcations culminating in chaos has been called the "period doubling route to chaos."

These bifurcations are often summarized in a so-called bifurcation diagram that plots the attractor the system ends up in as a function of the value of a "control parameter" such as R. Figure 2.11 gives such a bifurcation diagram for the logistic map. The horizontal axis gives R. For each value of R, the final (attractor) values of x are plotted. For example, for R = 2.9, x reaches a fixed-point attractor of x = 0.655. At R = 3.0, x reaches a period-two attractor. This can be seen as the first branch point in the diagram, when the fixed-point attractors give way to the period-two attractors. For R somewhere between 3.4 and 3.5, the diagram shows a bifurcation to a period-four attractor, and so on, with further period doublings, until the onset of chaos at R approximately equal to 3.569946.

FIGURE 2.11. Bifurcation diagram for the logistic map, with attractor plotted as a function of R.
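A diagram like figure 2.11 can be generated with a few lines of code. The sketch below (not from the book) uses NumPy and Matplotlib; for each R it iterates many steps to let transients die out and then plots the values the trajectory keeps visiting. The ranges and iteration counts are arbitrary choices for this sketch.

```python
# Minimal sketch (not from the book) of how a bifurcation diagram like
# figure 2.11 can be produced with NumPy and Matplotlib.

import numpy as np
import matplotlib.pyplot as plt

r_values = np.linspace(2.9, 4.0, 2000)       # one column of the diagram per R value
x = np.full_like(r_values, 0.2)              # every trajectory starts at x0 = 0.2

for _ in range(1000):                        # discard transients
    x = r_values * x * (1 - x)

samples = []
for _ in range(200):                         # record the attractor values
    x = r_values * x * (1 - x)
    samples.append(x.copy())

plt.plot(np.tile(r_values, len(samples)), np.concatenate(samples), ",k", alpha=0.25)
plt.xlabel("R")
plt.ylabel("attractor values of x")
plt.show()
```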

The period-doubling route to chaos has a rich history. Period-doubling bifurcations had been observed in mathematical equations as early as the 1920s, and a similar cascade of bifurcations was described by P. J. Myrberg, a Finnish mathematician, in the 1950s. Nicholas Metropolis, Myron Stein, and Paul Stein, working at Los Alamos National Laboratory, showed that not just the logistic map but any map whose graph is parabola-shaped will follow a similar period-doubling route. Here, "parabola-shaped" means that the plot of the map has just one hump-in mathematical terms, it is "unimodal."

THE SECOND UNIVERSAL FEATURE: FEIGENBAUM'S CONSTANT.

The discovery that gave the period-doubling route its renowned place among mathematical universals was made in the 1970s by the physicist Mitchell Feigenbaum. Feigenbaum, using only a programmable desktop calculator, made a list of the R values at which the period-doubling bifurcations occur (where ≈ means "approximately equal to"):

R1 ≈ 3.0
R2 ≈ 3.44949
R3 ≈ 3.54409
R4 ≈ 3.564407
R5 ≈ 3.568759
R6 ≈ 3.569692
R7 ≈ 3.569891
R8 ≈ 3.569934
R∞ ≈ 3.569946

Here, R1 corresponds to period 2¹ (= 2), R2 corresponds to period 2² (= 4), and in general, Rn corresponds to period 2ⁿ. The symbol ∞ ("infinity") is used to denote the onset of chaos-a trajectory with an infinite period.

Feigenbaum noticed that as the period increases, the R values get closer and closer together. This means that for each bifurcation, R has to be increased less than it had before to get to the next bifurcation. You can see this in the bifurcation diagram of figure 2.11: as R increases, the bifurcations get closer and closer together. Using these numbers, Feigenbaum measured the rate at which the bifurcations get closer and closer; that is, the rate at which the R values converge. He discovered that the rate is (approximately) the constant value 4.6692016. What this means is that as R increases, each new period doubling occurs about 4.6692016 times faster than the previous one.
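You can check this convergence rate directly from the bifurcation values listed above: Feigenbaum's constant is the limiting ratio of successive gaps, (Rn - Rn-1) / (Rn+1 - Rn). Here is a small sketch of that calculation (mine, not from the book):

```python
# Minimal sketch (not from the book): estimate Feigenbaum's constant from the
# bifurcation values listed above by taking ratios of successive gaps.

R = [3.0, 3.44949, 3.54409, 3.564407, 3.568759, 3.569692, 3.569891, 3.569934]  # R1 .. R8

for n in range(1, len(R) - 1):
    ratio = (R[n] - R[n - 1]) / (R[n + 1] - R[n])
    print(f"(R{n + 1} - R{n}) / (R{n + 2} - R{n + 1}) = {ratio:.3f}")
```

The ratios hover around 4.67; the later ones wobble a little because the listed R values are given to only six or seven digits.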

This fact was interesting but not earth-shaking. Things started to get a lot more interesting when Feigenbaum looked at some other maps-the logistic map is just one of many that have been studied. As I mentioned above, a few years before Feigenbaum made these calculations, his colleagues at Los Alamos, Metropolis, Stein, and Stein, had shown that any unimodal map will follow a similar period-doubling cascade. Feigenbaum's next step was to calculate the rate of convergence for some other unimodal maps. He started with the so-called sine map, an equation similar to the logistic map but which uses the trigonometric sine function.

Feigenbaum repeated the steps I sketched above: he calculated the values of R at the period-doubling bifurcations in the sine map, and then calculated the rate at which these values converged. He found that the rate of convergence was 4.6692016.

Feigenbaum was amazed. The rate was the same. He tried it for other unimodal maps. It was still the same. No one, including Feigenbaum, had expected this at all. But once the discovery had been made, Feigenbaum went on to develop a mathematical theory that explained why the common value of 4.6692016, now called Feigenbaum's constant, is universal-which here means the same for all unimodal maps. The theory used a sophisticated mathematical technique called renormalization that had been developed originally in the area of quantum field theory and later imported to another field of physics: the study of phase transitions and other "critical phenomena." Feigenbaum adapted it for dynamical systems theory, and it has become a cornerstone in the understanding of chaos.
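You can repeat Feigenbaum's comparison, at least qualitatively, with a few lines of code. The sketch below uses one common form of the sine map, xt+1 = r sin(pi xt) with r between 0 and 1; the specific r values were found by trial for this sketch and are not taken from the book. It shows the same progression from a fixed point to period two to period four; pushing r higher continues the doublings, and estimating the bifurcation values and taking ratios of successive gaps, as in the earlier sketch, again gives numbers approaching 4.669.

```python
# Minimal sketch (not from the book), using one common form of the sine map,
# x_{t+1} = r * sin(pi * x_t) with 0 < r <= 1, to show that a different
# unimodal map goes through the same kind of period-doubling cascade.
# The r values below were found by trial for this sketch.

import math

def sine_map_cycle(r, x0=0.4, transient=100_000, max_period=64, tol=1e-6):
    x = x0
    for _ in range(transient):              # discard the transient
        x = r * math.sin(math.pi * x)
    cycle = [x]
    for _ in range(max_period):
        x = r * math.sin(math.pi * x)
        if abs(x - cycle[0]) < tol:         # back to the start: one full period
            return cycle
        cycle.append(x)
    return cycle

for r in (0.60, 0.78, 0.84):                # fixed point, period two, period four
    cycle = sine_map_cycle(r)
    print(f"r = {r}: period {len(cycle)} ->", [round(v, 3) for v in cycle])
```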

It turned out that this is not just a mathematical curiosity. In the years since Feigenbaum's discovery, his theory has been verified in several laboratory experiments on physical dynamical systems, including fluid flow, electronic circuits, lasers, and chemical reactions. Period-doubling cascades have been observed in these systems, and values of Feigenbaum's constant have been calculated in steps similar to those we saw above. It is often quite difficult to obtain accurate measurements of, say, the quantity that corresponds to R in such experiments, but even so, the values of Feigenbaum's constant found by the experimenters agree, within the margin of error, with Feigenbaum's value of approximately 4.6692016. This is impressive, since Feigenbaum's theory, which yields this number, involves only abstract math, no physics. As Feigenbaum's colleague Leo Kadanoff said, this is "the best thing that can happen to a scientist, realizing that something that's happened in his or her mind exactly corresponds to something that happens in nature."

Mitchell Feigenbaum (AIP Emilio Segrè Visual Archives, Physics Today Collection).

Large-scale systems such as the weather are, as yet, too hard to experiment with directly, so no one has directly observed period doubling or chaos in their behavior. However, certain computer models of weather have displayed the period-doubling route to chaos, as have computer models of electrical power systems, the heart, solar variability, and many other systems.

There is one more remarkable fact to mention about this story. As with many important scientific discoveries, Feigenbaum's discovery was also made, independently and at almost the same time, by another research team. This team consisted of the French scientists Pierre Coullet and Charles Tresser, who also used the technique of renormalization to study the period-doubling cascade and discovered the universality of 4.6692016 for unimodal maps. Feigenbaum may actually have been the first to make the discovery, and he was also able to disseminate the result more widely and clearly among the international scientific community, which is why he has received most of the credit for this work. However, in many technical papers, the theory is referred to as the "Feigenbaum-Coullet-Tresser theory" and Feigenbaum's constant as the "Feigenbaum-Coullet-Tresser constant." In the course of this book I point out several other examples of independent, simultaneous discoveries using ideas that are "in the air" at a given time.

Revolutionary Ideas from Chaos.

The discovery and understanding of chaos, as illustrated in this chapter, has produced a rethinking of many core tenets of science. Here I summarize some of these new ideas, which few nineteenth-century scientists would have believed.

Seemingly random behavior can emerge from deterministic systems, with no external source of randomness.

The behavior of some simple, deterministic systems can be impossible, even in principle, to predict in the long term, due to sensitive dependence on initial conditions.

Although the detailed behavior of a chaotic system cannot be predicted, there is some "order in chaos" seen in universal properties common to large sets of chaotic systems, such as the period-doubling route to chaos and Feigenbaum's constant. Thus even though "prediction becomes impossible" at the detailed level, there are some higher-level aspects of chaotic systems that are indeed predictable.

In summary, changing, hard-to-predict macroscopic behavior is a hallmark of complex systems. Dynamical systems theory provides a mathematical vocabulary for characterizing such behavior in terms of bifurcations, attractors, and universal properties of the ways systems can change. This vocabulary is used extensively by complex systems researchers.

The logistic map is a simplified model of population growth, but the detailed study of it and similar model systems resulted in a major revamping of the scientific understanding of order, randomness, and predictability. This illustrates the power of idea models-models that are simple enough to study via mathematics or computers but that nonetheless capture fundamental properties of natural complex systems. Idea models play a central role in this book, as they do in the sciences of complex systems.

Characterizing the dynamics of a complex system is only one step in understanding it. We also need to understand how these dynamics are used in living systems to process information and adapt to changing environments. The next three chapters give some background on these subjects, and later in the book we see how ideas from dynamics are being combined with ideas from information theory, computation, and evolution.

CHAPTER 3.

Information.

The law that entropy increases-the Second Law of Thermodynamics-holds, I think, the supreme position among the laws of Nature... [I]f your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

-Sir Arthur Eddington, The Nature of the Physical World.

COMPLEX SYSTEMS ARE OFTEN said to be "self-organizing": consider, for example, the strong, structured bridges made by army ants; the synchronous flashing of fireflies; the mutually sustaining markets of an economy; and the development of specialized organs by stem cells-all are examples of self-organization. Order is created out of disorder, upending the usual turn of events in which order decays and disorder (or entropy) wins out.

A complete account of how such entropy-defying self-organization takes place is the holy grail of complex systems science. But before this can be tackled, we need to understand what is meant by "order" and "disorder" and how people have thought about measuring such abstract qualities.

Many complex systems scientists use the concept of information to characterize and measure order and disorder, complexity and simplicity. The immunologist Irun Cohen states that "complex systems sense, store, and deploy more information than do simple systems." The economist Eric Beinhocker writes that "evolution can perform its tricks not just in the 'substrate' of DNA but in any system that has the right information processing and information storage characteristics." The physicist Murray Gell-Mann said of complex adaptive systems that "Although they differ widely in their physical attributes, they resemble one another in the way they handle information. That common feature is perhaps the best starting point for exploring how they operate."

But just what is meant by "information"?

What Is Information?

You see the word "information" all over the place these days: the "information revolution," the "information age," "information technology" (often simply "IT"), the "information superhighway," and so forth. "Information" is used colloquially to refer to any medium that presents knowledge or facts: newspapers, books, my mother on the phone gossiping about relatives, and, most prominently these days, the Internet. More technically, it is used to describe a vast array of phenomena ranging from the fiber-optic transmissions that constitute signals from one computer to another on the Internet to the tiny molecules that neurons use to communicate with one another in the brain.

The different examples of complex systems I described in chapter 1 are all centrally concerned with the communication and processing of information in various forms. Since the beginning of the computer age, computer scientists have thought of information transmission and computation as something that takes place not only in electronic circuits but also in living systems.

In order to understand the information and computation in these systems, the first step, of course, is to have a precise definition of what is meant by the terms information and computation. These terms have been mathematically defined only in the twentieth century. Unexpectedly, it all began with a late nineteenth-century puzzle in physics involving a very smart "demon" who seemed to get a lot done without expending any energy. This little puzzle got many physicists quite worried that one of their fundamental laws might be wrong. How did the concept of information save the day? Before getting there, we need a little bit of background on the physics notions of energy, work, and entropy.

Energy, Work, and Entropy.

The scientific study of information really begins with the science of thermodynamics, which describes energy and its interactions with matter. Physicists of the nineteenth century considered the universe to consist of two different types of entities: matter (e.g., solids, liquids, and vapors) and energy (e.g., heat, light, and sound).

Energy is roughly defined as a system's potential to "do work," which correlates well with our intuitive notion of energy, especially in this age of high-energy workaholics. The origin of the term is the Greek word energia, which literally means "to work." However, physicists give a specific meaning to the "work" done by an object: the amount of force applied to the object multiplied by the distance traveled by the object in the direction that force was applied.

For example, suppose your car breaks down on a flat road and you have to push it for a quarter of a mile to the nearest gas station. In physics terms, the amount of work that you expend is the amount of force with which you push the car multiplied by the distance to the gas station. In pushing the car, you transform energy stored in your body into the kinetic energy (i.e., movement) of the car, and the amount of energy that is transformed is equal to the amount of work that is done plus whatever energy is converted to heat, say, by the friction of the wheels on the road, or by your own body warming up. This so-called heat loss is measured by a quantity called entropy. Entropy is a measure of the energy that cannot be converted into additional work. The term "entropy" comes from another Greek word-"trope"-meaning "turning into" or "transformation."
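As a purely illustrative calculation (the numbers are invented for this example, not taken from the book): if you push the car with a steady force of 400 newtons over a quarter mile, roughly 400 meters, the work done is force times distance, 400 N × 400 m = 160,000 joules. The chemical energy your body expends is larger than that 160,000 joules, and the difference, lost as heat, is what entropy keeps track of: energy that can no longer be converted into work.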

By the end of the nineteenth century two fundamental laws concerning energy had been discovered, the so-called laws of thermodynamics. These laws apply to "isolated systems"-ones that do not exchange energy with any outside entity.

First law: Energy is conserved. The total amount of energy in the universe is constant. Energy can be transformed from one form to another, such as the transformation of stored body energy to kinetic energy of a pushed car plus the heat generated by this action. However, energy can never be created or destroyed. Thus it is said to be "conserved."

Second law: Entropy always increases until it reaches a maximum value. The total entropy of a system will always increase until it reaches its maximum possible value; it will never decrease on its own unless an outside agent works to decrease it.

As you've probably noticed, a room does not clean itself up, and Cheerios spilled on the floor, left to their own devices, will never find their way back into the cereal box. Someone or something has to do work to turn disorder into order.

Furthermore, transformations of energy, such as the car-pus.h.i.+ng example above, will always produce some heat that cannot be put to work. This is why, for example, no one has found a way to take the heat generated by the back of your refrigerator and use it to produce new power for cooling the inside of the refrigerator so that it will be able to power itself. This explains why the proverbial "perpetual motion machine" is a myth.

The second law of thermodynamics is said to define the "arrow of time," in that it proves there are processes that cannot be reversed in time (e.g., heat spontaneously returning to your refrigerator and converting to electrical energy to cool the inside). The "future" is defined as the direction of time in which entropy increases. Interestingly, the second law is the only fundamental law of physics that distinguishes between past and future. All other laws are reversible in time. For example, consider filming an interaction between elementary particles such as electrons, and then showing this movie to a physicist. Now run the movie backward, and ask the physicist which version was the "real" version. The physicist won't be able to guess, since the forward and backward interactions both obey the laws of physics. This is what reversible means. In contrast, if you make an infrared film of heat being produced by your refrigerator, and show it forward and backward, any physicist will identify the forward direction as "correct" since it obeys the second law, whereas the backward version does not. This is what irreversible means. Why is the second law different from all other physical laws? This is a profound question. As the physicist Tony Rothman points out, "Why the second law should distinguish between past and future while all the other laws of nature do not is perhaps the greatest mystery in physics."

Maxwell's Demon.

The British physicist James Clerk Maxwell is most famous for his discovery of what are now called Maxwell's Equations: compact expressions of Maxwell's theory that unified electricity and magnetism. During his lifetime, he was one of the world's most highly regarded scientists, and today would be on any top fifty list of all-time greats of science.

In his 1871 book, Theory of Heat, Maxwell posed a puzzle under the heading "Limitation of the Second Law of Thermodynamics." Maxwell proposed a box that is divided into two halves by a wall with a hinged door, as illustrated in figure 3.1. The door is controlled by a "demon," a very small being who measures the velocity of air molecules as they whiz past him. He opens the door to let the fast ones go from the right side to the left side, and closes it when slow ones approach it from the right. Likewise, he opens the door for slow molecules moving from left to right and closes it when fast molecules approach it from the left. After some time, the box will be well organized, with all the fast molecules on the left and all the slow ones on the right. Thus entropy will have been decreased.

FIGURE 3.1. Top: James Clerk Maxwell, 1831–1879 (AIP Emilio Segrè Visual Archives) Bottom: Maxwell's Demon, who opens the door for fast (white) particles moving to the left and for slow (black) particles moving to the right.
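The demon's sorting rule is simple enough to simulate. The sketch below is a toy model of my own, not Maxwell's or the book's: molecule speeds are just random numbers, "fast" simply means above an arbitrary threshold, and a random molecule approaches the door at each step. After many steps the average speed on the left is well above the average on the right, i.e., the left side has become hotter and the right side colder, exactly the outcome described above.

```python
# Toy simulation (mine, not Maxwell's or the book's) of the demon's sorting rule.
# Molecule "speeds" are random numbers in [0, 1); the demon lets fast molecules
# (speed above an arbitrary threshold) pass from right to left and slow ones
# pass from left to right.

import random

random.seed(1)
left = [random.random() for _ in range(500)]      # speeds of molecules in the left half
right = [random.random() for _ in range(500)]     # speeds of molecules in the right half
threshold = 0.5                                   # the demon's notion of "fast"

for _ in range(20_000):
    side = random.choice(("left", "right"))       # a random molecule approaches the door
    if side == "right" and right:
        i = random.randrange(len(right))
        if right[i] > threshold:                  # fast: allowed to cross right -> left
            left.append(right.pop(i))
    elif side == "left" and left:
        i = random.randrange(len(left))
        if left[i] <= threshold:                  # slow: allowed to cross left -> right
            right.append(left.pop(i))

print(f"average speed, left side:  {sum(left) / len(left):.3f}")
print(f"average speed, right side: {sum(right) / len(right):.3f}")
```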

According to the second law, work has to be done to decrease entropy. What work has been done by the demon? To be sure, he has opened and closed the door many times. However, Maxwell assumed that a massless and frictionless "slide" could be used as a door by the demon, so that opening and closing it would require negligible work, which we can ignore. (Feasible designs for such a door have been proposed.) Has any other work been done by the demon?

Maxwell's answer was no: "the hot system [the left side] has gotten hotter and the cold [right side] colder and yet no work has been done, only the intelligence of a very observant and neat-fingered being has been employed."

How did entropy decrease with little or no work being done? Doesn't this directly violate the second law of thermodynamics? Maxwell's demon puzzled many of the great minds of the late nineteenth and early twentieth centuries. Maxwell's own answer to his puzzle was that the second law (the increase of entropy over time) is not really a law at all, but rather a statistical effect that holds for large collections of molecules, like the objects we encounter in day-to-day life, but does not necessarily hold at the scale of individual molecules.

However, many physicists of his day and long after vehemently disagreed. They believed that the second law has to remain inviolate; instead there must be something fishy about the demon. For entropy to decrease, work must actually have been done in some subtle, nonapparent way.

Many people tried to resolve the paradox, but no one was able to offer a satisfactory solution for nearly sixty years. In 1929, a breakthrough came: the great Hungarian physicist Leo Szilard (pronounced "ziLARD") proposed that it is the "intelligence" of the demon, or more precisely, the act of obtaining information through measurement, that constitutes the missing work.

Szilard was the first to make a link between entropy and information, a link that later became the foundation of information theory and a key idea in complex systems. In a famous paper entitled "On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings," Szilard argued that the measurement process, in which the demon acquires a single "bit" of information (i.e., the information as to whether an approaching molecule is a slow one or a fast one), requires energy and must produce at least as much entropy as is decreased by the sorting of that molecule into the left or right side of the box. Thus the entire system, comprising the box, the molecules, and the demon, obeys the second law of thermodynamics.

In coming up with his solution, Szilard was perhaps the first to define the notion of a bit of information-the information obtained from the answer to a yes/no (or, in the demon's case, "fast/slow") question.
