Welcome • Thanks: George Heilborn; Weinberg College of Arts and Sciences; Northwestern Institute on Complex Systems; Chicago Council on Science and Technology; and the Adler Planetarium • A new format: the first Heilborn Symposium • Committee: J. Ketterson, A. Motter, A. de Gouvea & A. Freeman • Six lectures focusing on complexity and chaos • Particle and planetary orbits, social and biological structures • Videotaped: heilbornsymposium.northwestern.edu
Heilborn Symposium on Complexity and Chaos, January 6–8, 2010 • Morning talks at 11 AM • Afternoon talks at 4 PM • Reception
Complexity
Murray Gell-Mann
Distinguished Fellow, Santa Fe Institute
For Northwestern University
It would take a great many concepts or quantities to capture all our intuitive notions of the meaning of complexity (or its opposite, simplicity). But in most ordinary conversation and in much scientific discourse we mean what I call effective complexity.
Roughly, the effective complexity of an entity refers to the length of a very concise description of its regularities (not the features treated as random or incidental). Complexity does not mean randomness.
We can think of the effective complexity of an entity as the minimum description length (MDL) of its regularities.
Examples: a complex business firm, a complex piece of legislation, a complex novel.
The distinction between the regular and the random is often context-dependent or even subjective. Example: music and static on the radio as signal and noise, but some of that “noise” yields radio astronomy!
Neckties: Pattern or stains?
Coarse graining: the concept is generalized to refer to concentration on certain variables rather than others.
In many problems the practical distinction between the regular and the random depends on a “judge,” not necessarily human or even alive, that defines what is important and what is unimportant.
Scientific vs. Poetic Regularities
Fluctuations around fundamentals in markets were described by some economists as a random walk. They labeled as cranks the various “technicians” and “chartists” who claimed to make probabilistic predictions of future fluctuations from past fluctuations. Perhaps most of them were cranks, but today’s “quants” have made lots of money doing that very thing. The fluctuations contain useful information.
Nature exhibits regularities, non-random phenomena. As Newton remarked, “It is the business of natural philosophy to find them out.” Natural philosophy, of course, is what we now call science.
From Entity to Bit String
A bit string is a string of zeros and ones. The entity can be represented by a bit string if we specify:
1. The level of detail (or "coarse graining") at which it is being described,
2. The language for describing it,
3. The knowledge and understanding of the world that is assumed, and
4. The system of coding from language to bit strings.
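As a minimal sketch of step 4, here is one way a verbal description could be coded into a bit string; the description text is invented for illustration, and UTF-8 with eight bits per byte is just one conventional choice of coding:

```python
# Minimal sketch of step 4: coding a description into a bit string.
# The description is a hypothetical example; UTF-8 plus
# eight-bits-per-byte is one conventional coding system.
description = "a red sphere, five centimeters in radius"
bits = "".join(f"{byte:08b}" for byte in description.encode("utf-8"))
print(len(bits), "bits:", bits[:24], "...")
```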
A.I.C. (Algorithmic Information Content) of a bit string (or an entity described by it): Length of the shortest program that will cause a given universal computer, U, to print out the bit string and then halt
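In the standard notation of algorithmic information theory this reads

$$K_U(s) \;=\; \min\{\, |p| : U(p) = s \,\},$$

where $p$ ranges over programs (finite bit strings), $|p|$ is the length of $p$ in bits, and $U(p) = s$ means that $U$, given program $p$, prints the string $s$ and then halts.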
A.I.C. is a kind of minimum description length. The concept is most useful for long strings: as the A.I.C. grows without bound, it no longer depends on U (different universal computers give values that differ by at most a fixed additive constant). For an entity e represented by a bit string s_e we write K_U(s_e), or K_U(e), or simply K for the A.I.C. of the string or the entity.
A bit string that is perfectly regular, say 111111….1111, has very little A.I.C. and its regularities also have very little A.I.C., so its effective complexity is very low. A bit string with no regularities is a “random,” incompressible string. It has maximal A.I.C. for its length, but the A.I.C. of its regularities is again very low. It too has low effective complexity.
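A.I.C. itself is uncomputable, but an ordinary compressor gives a crude illustration of the contrast between these two extremes. The Python sketch below uses zlib's compressed length as a rough stand-in for description length; zlib is far from an optimal universal code, so only the trend matters:

```python
import os
import zlib

# Compressed length as a crude stand-in for A.I.C.
regular = b"1" * 10_000        # perfectly regular string
random_ = os.urandom(10_000)   # incompressible, "random" bytes

for name, s in [("regular", regular), ("random", random_)]:
    print(name, len(s), "->", len(zlib.compress(s, 9)))
# The regular string compresses to a few dozen bytes (very low A.I.C.);
# the random one barely compresses at all (A.I.C. near its own length).
```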
Effective complexity can be high only in the region intermediate between order and disorder.
Split K, the A.I.C. of the entity (or of the bit string that describes it), into two parts: • the A.I.C. of the regularities, and • the A.I.C. of the features treated as random or incidental. The first part is the effective complexity.
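Schematically, writing $E(e)$ for the effective complexity (a notation used here only for illustration), the split reads

$$K(e) \;\approx\; \underbrace{E(e)}_{\text{A.I.C. of the regularities}} \;+\; \underbrace{K_{\mathrm{inc}}(e)}_{\text{A.I.C. of the incidental features}}.$$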
The second part can be related to information/ignorance, or entropy, which is proportional to negative information.
Entropy can be regarded as a measure of disorder. If initially there is a great deal of order of a certain kind in a closed system, then for a period of time the system tends to possess less and less order of the same kind. By order of the same kind we mean order with respect to a given coarse graining.
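One standard way to make this quantitative is Boltzmann's formula for the entropy of a coarse-grained state,

$$S \;=\; k_B \ln W,$$

where $W$ counts the microscopic configurations compatible with the coarse-grained description and $k_B$ is Boltzmann's constant: the more configurations look the same at the chosen coarse graining, the higher the entropy.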
Fine-grained entropy is exactly conserved, but coarse-grained entropy of a particular kind can be initially low and then increase. If a flask of bromine is uncorked, it starts out confined to the flask, but then spreads out in a complicated pattern, with complicated correlations among the positions and velocities of the molecules. The initial fine-grained information is not lost. It is transformed into information that is ignored by our coarse graining.
In a household with children, we may start the week with peanut butter and jelly in their respective jars, but after a few days we may see a good deal of peanut butter in the jelly jar and vice versa.
With reference to any particular kind of order there are typically very many more disordered configurations than ordered ones, and that explains why initial order in a closed system tends to give way to more and more disorder for a long period of time. If no particular kind of order were defined by a coarse graining, then any configuration would be as good as any other.
If we arrange a lot of pennies on a table according to date and mint and then the dog comes along and upsets the table, the coins are extremely unlikely to end up arranged by date and mint. To get back the order that prevailed, we would have to do a lot of sorting.
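The arithmetic behind this example is simple: a random arrangement of $N$ distinct coins lands in one particular order (say, by date and mint) with probability $1/N!$. For $N = 50$ coins that is about $1/(3 \times 10^{64})$, which is why the upset table essentially never falls back into order by itself.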
In physics, what we are discussing appears as the Second Law of Thermodynamics.
The fundamental laws of physics, which govern the behavior of all matter in the universe and of the universe itself, seem to be simple.
The Fundamental Laws of Nature: 1) The unified quantum theory of all the elementary particles (the basic building blocks of all matter) and their interactions. 2) The initial condition of the universe near the beginning of its expansion (around 13 billion years ago). Both of these are thought to be relatively simple.
The simple initial condition of the universe is ultimately responsible for the "Arrow of Time" (in the Second Law of Thermodynamics) that distinguishes macroscopic events from their time-reversed versions (a movie run backwards), which are easily recognized as so improbable as to be effectively impossible (an egg breaking vs. an egg reassembling, etc.).
If we know the exact fundamental laws of physics, the theory of the elementary particles and the initial condition of the universe, can we then predict in principle the behavior of everything in the universe? Absolutely not, because the theory is quantum-mechanical and gives only a set of probabilities. Much is still up to chance. The theory gives the probabilities of infinitely many different possible histories of the universe. Even in the classical approximation, the phenomenon of chaos in many nonlinear systems gives rise to indeterminacy in the presence of even the slightest coarse graining.
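A tiny Python sketch (a toy model, not anything from the lecture itself) shows how classical chaos amplifies even the slightest coarse graining: two trajectories of the logistic map that start within 10^-12 of each other, far below any realistic resolution, diverge to order one within a few dozen steps:

```python
# Toy illustration of classical chaos: the logistic map at r = 4.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-12   # indistinguishable under any coarse graining
for step in range(1, 61):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
# The separation roughly doubles each step, so after ~40 steps it is
# of order 1: long-term prediction fails even though the rule is
# simple and exactly deterministic.
```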
The fundamental laws are probabilistic, not fully deterministic. The history of the universe is co-determined by those laws and an unimaginably long sequence of chance events (or "accidents") governed by probabilities. Example: in the emission of an "alpha particle" by a radioactive nucleus, all directions are equally probable. Only after the event is it possible to specify the direction of emission.
For any observer (not necessarily human, not necessarily on Earth) most of the accidents that have already happened have unknown results. That results in much greater indeterminacy.
For each such chance event, there are various possible outcomes unknown in advance except for their probabilities. The various possible alternative histories of the universe thus form a branching tree with probabilities at the branchings.
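As a toy model (the event count and probabilities are invented for illustration), such a branching tree can be sampled in a few lines of Python; each run picks one history out of 2^20 possibilities:

```python
import random

# Toy branching tree of "histories": 20 chance events, each with two
# possible outcomes; probabilities here are invented for illustration.
def sample_history(n_events=20, p=0.5, seed=None):
    rng = random.Random(seed)
    return [int(rng.random() < p) for _ in range(n_events)]

print(sample_history(seed=1))   # one particular sequence of accidents
print(sample_history(seed=2))   # an alternative history, same laws
```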
Jorge Luis Borges: The Garden of Forking Paths
The unified theory of the elementary particles and their interactions (even if supplemented by the initial condition of the universe) should not be called "The Theory of Everything." So much in the world around us depends on the outcomes of the chance events, the "accidents," the branchings of the tree.
We use some very coarse graining of these histories. Mainly we are concerned with macroscopic topics such as people, diamonds, microbes, stars, paintings or with more abstract entities like money, monarchy, labor unions, contracts.
Sometimes we use “forms,” patterns with continuity. You can’t step into the same river twice.
The alternative histories of the universe form a branching tree with probabilities at all the branchings. Some of the chance events (accidents, branchings) produce much more future regularity than others. Those are the “frozen accidents.” They are the main source of effective complexity, since the fundamental laws are thought to be simple.
We have heard of many such frozen accidents in connection with classical chaos. Quantum indeterminacy contributes as well.
Nowadays many distinguished historians are tolerant of "contingent" or "counterfactual" human history: what if something had gone differently?
In life on Earth, right-handed sugars and left-handed amino acids play important roles, while their opposite-handed counterparts do not. The correlation between the two handednesses is understandable, but why are left and right not interchanged? Attempts to explain that choice on the basis of physics seem to have failed, and scientists generally regard it as a frozen accident attributable to the common ancestor of all life on Earth.
It is not only effective complexity that contributes to apparent complexity. Other properties of an entity contribute too.