Chapter 13 Artificial Life: Learning through Emergent Behavior
Chapter 13 Contents (1) • What is life? • Emergent Behavior • Finite State Automata • Conway’s Life • One-Dimensional Cellular Automata • Self-Reproducing Systems
Chapter 13 Contents (2) • Evolution • Evolution Strategies • Genetic Programming • Evolutionary Programming • Classifier Systems • Artificial Immune Systems
What is life? • What are the defining features of life? • self-reproduction • ability to evolve by Darwinian natural selection • response to stimuli • ability to die • growth or expansion • Not all living things obey these rules, and some things that are not alive do. • Defining life is very difficult!
Emergent Behavior • The idea that complex behavior emerges from simple rules. • It is seen in systems such as CYC, but is particularly prevalent in systems based on evolutionary methods, such as genetic algorithms. • Example: Boids – simulations of birds given very simple rules about how to fly. • The boids automatically flew in such a way as to avoid large obstacles, without being explicitly taught how to do so.
Finite State Automata • FSA: a machine with a finite number of states. • The FSA is given inputs, which result in transitions between states. • Some states are accepting, meaning the FSA is saying “Yes”. • Other states are rejecting. • In this example there are two possible input characters, a and b, and two states, 1 and 2. • It will finish in state 1 (the accepting state) if the input has an even number of a’s.
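As an illustration (the slide’s state diagram is not reproduced here), a minimal Python sketch of this two-state FSA might look as follows; the transition table follows directly from the description above.

```python
# Two-state FSA over {a, b}: accepts a string exactly when it contains
# an even number of a's. State 1 is the start and accepting state.
TRANSITIONS = {
    (1, "a"): 2, (1, "b"): 1,
    (2, "a"): 1, (2, "b"): 2,
}
ACCEPTING = {1}

def accepts(string):
    state = 1                                # start in state 1
    for symbol in string:
        state = TRANSITIONS[(state, symbol)] # follow one transition per input
    return state in ACCEPTING                # "Yes" only in an accepting state

print(accepts("abba"))  # True: two a's
print(accepts("ab"))    # False: one a
```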
Conway’s Life (1) • A two-dimensional cellular automaton. • A two-dimensional grid of cells, each of which can be alive or dead. • A set of rules determines how the cells will change from one generation to the next: 1. A dead cell will come to life if it has exactly three living neighbors. 2. A living cell with two or three living neighbors will stay alive. 3. A living cell with fewer than two living neighbors will die of loneliness. 4. A living cell with more than three living neighbors will die of overcrowding.
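A compact sketch of one generation of these rules follows. Representing the (unbounded) grid as a set of live-cell coordinates is a choice made here for brevity, not part of the slide.

```python
from itertools import product

def neighbours(cell):
    # The eight cells surrounding a given (x, y) coordinate.
    x, y = cell
    return {(x + dx, y + dy)
            for dx, dy in product((-1, 0, 1), repeat=2)
            if (dx, dy) != (0, 0)}

def step(alive):
    # The four rules reduce to: a cell is alive next generation iff it has
    # exactly 3 living neighbours, or it is alive now and has exactly 2.
    candidates = alive | {n for c in alive for n in neighbours(c)}
    return {c for c in candidates
            if len(neighbours(c) & alive) == 3
            or (c in alive and len(neighbours(c) & alive) == 2)}

# The glider pattern described on the next slide:
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
pattern = glider
for _ in range(4):
    pattern = step(pattern)
print(pattern)   # the same glider shape, shifted one cell diagonally
```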
Conway’s Life (2) • Surprisingly complex behavior can sometimes emerge from these simple rules. • This diagram shows a successive sequence of generations of Conway’s Life. • This pattern is known as a glider. • There is also a pattern known as a glider gun which constantly fires out gliders.
One-Dimensional Cellular Automata (1) • A single line of cells. Each cell thus has two immediate neighbors. • It is usual to have rules that take into account the cells on either side of the immediate neighbors as well. • Usually, the cell itself is also taken into account, meaning that each cell’s future is determined by 5 cells. • 1-D Cellular Automata often use totalistic rules, meaning that the number of living cells out of the 5 is all that determines the cell’s state in the next generation.
One-Dimensional Cellular Automata (2) • Hence, there are 32 possible rule sets. One such set of rules might be:

Living cells (out of 5):  1  2  3  4  5
Next state:               0  0  1  1  1

• This says a cell can only survive if it has two, three or four neighbors. • This rule can be seen applied in five generations in the following diagram:
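Since the diagram is not reproduced here, the sketch below applies the rule table above for five generations. The wrap-around boundary, the random starting row, and the convention that a neighborhood containing no living cells stays dead are assumptions made for illustration.

```python
import random

# Totalistic rule from the table above; a count of 0 living cells is
# assumed to give a dead cell.
RULE = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}

def step(cells):
    # Each cell's next state depends only on how many of the five cells
    # (itself plus two neighbours on each side) are currently alive.
    n = len(cells)
    return [RULE[sum(cells[(i + d) % n] for d in range(-2, 3))]
            for i in range(n)]               # wrap-around boundary assumed

row = [random.randint(0, 1) for _ in range(40)]  # arbitrary starting row
for _ in range(5):                               # five generations
    print("".join("#" if c else "." for c in row))
    row = step(row)
```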
Self-Reproducing Systems • Von Neumann proposed a self-reproducing system based on cellular automata. • Langton invented loops: • Each loop consists of 94 cells. • Contains all the information that is needed to reproduce itself. • Ultimately we may have robots that can obtain the raw materials necessary to produce new versions of themselves. • This would be useful for exploring other planets.
Evolution • The changes in cellular automata involve single-step selection, while evolutionary systems involve cumulative selection (Dawkins, 1991). • Survival of the fittest means that creatures that are fit are more likely to reproduce than those that are less fit. • This idea is modeled exactly in systems such as genetic algorithms.
Evolution Strategies • Similar to hill-climbing. • A set of numeric parameters is varied from generation to generation by making normally distributed changes to the values. • If the offspring is a better solution than the parent, then the process repeats from the offspring. • Otherwise, the offspring is rejected, and a new attempt is made. • This is asexual reproduction – a single parent produces a single offspring.
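A minimal sketch of this (1+1)-style strategy follows, assuming an illustrative fitness function (minimizing a sum of squares) that is not part of the slide.

```python
import random

def fitness(params):
    # Illustrative fitness only: higher is better, optimum at all zeros.
    return -sum(p * p for p in params)

def evolve(parent, generations=1000, sigma=0.1):
    for _ in range(generations):
        # Asexual reproduction: one parent produces one offspring by adding
        # normally distributed noise to each parameter.
        offspring = [p + random.gauss(0, sigma) for p in parent]
        if fitness(offspring) > fitness(parent):
            parent = offspring          # keep the offspring only if it is better
    return parent

print(evolve([5.0, -3.0, 2.0]))         # values drift towards zero
```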
Genetic Programming • A method used to evolve LISP S-expressions. • The S-expressions are represented as trees. • A random set of expressions is generated, and the “fittest” individuals reproduce to produce the next generation. • Mutation and crossover are used (see chapter 14). • The diagram shows an example of a tree representation of an S-expression.
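The slide’s diagram is not reproduced here; as an illustrative stand-in, an S-expression such as (+ x (* 2 y)) can be held as a nested tuple and evaluated recursively. This is the same tree structure that mutation and crossover operate on.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(tree, env):
    if isinstance(tree, tuple):            # internal node: (operator, left, right)
        op, left, right = tree
        return OPS[op](evaluate(left, env), evaluate(right, env))
    if isinstance(tree, str):              # leaf: a named variable
        return env[tree]
    return tree                            # leaf: a numeric constant

expr = ("+", "x", ("*", 2, "y"))           # tree for the S-expression (+ x (* 2 y))
print(evaluate(expr, {"x": 3, "y": 4}))    # prints 11
```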
Evolutionary Programming • Evolves finite state automata to solve the problem of identifying the next item in a sequence: a1, a2, a3, a4, a5, …, an. • A new generation of FSAs is made by applying a set of mutation operators: • 1) Changing an output symbol • 2) Changing a state transition • 3) Adding a state • 4) Deleting a state • 5) Changing the initial state • The success of an FSA is determined by seeing how well it predicts the existing sequence.
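A rough sketch of how the first two mutation operators might act on an FSA. The transition-table representation and the two-symbol alphabet are assumptions made for illustration, not the exact encoding used in the book.

```python
import random

# An FSA as a dict: (state, input) -> (next_state, output_symbol).
fsa = {
    "states": [0, 1],
    "initial": 0,
    "table": {(0, "a"): (1, "x"), (0, "b"): (0, "y"),
              (1, "a"): (0, "y"), (1, "b"): (1, "x")},
}

def mutate_output(fsa):
    # Operator 1: change the output symbol of one randomly chosen transition.
    key = random.choice(list(fsa["table"]))
    next_state, _ = fsa["table"][key]
    fsa["table"][key] = (next_state, random.choice(["x", "y"]))

def mutate_transition(fsa):
    # Operator 2: redirect one randomly chosen transition to a random state.
    key = random.choice(list(fsa["table"]))
    _, output = fsa["table"][key]
    fsa["table"][key] = (random.choice(fsa["states"]), output)

mutate_output(fsa)
mutate_transition(fsa)
print(fsa["table"])
```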
Classifier Systems (1) • An evolutionary expert system. • Has the following components: • Detectors which receive inputs from the environment. • Effectors which send outputs and carry out actions. • A rule system, which consists of a population of classifiers. Each rule has a measure of fitness. • Detectors to determine how well the system is performing. • A bucket brigade algorithm for assigning credit and blame to classifiers. • A procedure for reproducing classifiers by application of a set of genetic operators. • A set of message lists – for input, output and internal messages.
Classifier Systems (2) • The system uses classifiers to determine what outputs to produce, or what actions to carry out, depending on the inputs from the environment. • A classifier has the following form: (c1, c2, c3, c4, c5) -> M, f • c1 … c5 are the input variables; M is the output or action, and f is the fitness of the classifier. • An example classifier might be: (4, 2, *, 1, *) -> A2, 9.1 • * represents any input. • This rule states that an input that has 4, 2 and 1 in the first, second and fourth positions is classified as A2, and that the classifier has a fitness value of 9.1.
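A minimal sketch of the wildcard matching described above; holding the classifier in a dictionary is an assumption made here for illustration.

```python
def matches(condition, inputs):
    # A '*' entry matches any input value; other entries must match exactly.
    return all(c == "*" or c == i for c, i in zip(condition, inputs))

classifier = {"condition": (4, 2, "*", 1, "*"),
              "action": "A2",
              "fitness": 9.1}

print(matches(classifier["condition"], (4, 2, 7, 1, 0)))   # True:  A2 fires
print(matches(classifier["condition"], (4, 3, 7, 1, 0)))   # False: second position differs
```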
Classifier Systems (3) • When a system has classified an input, a new generation of classifiers is produced by allowing the classifiers that provided the best classifications to reproduce. • A bucket-brigade algorithm is used to assign credit (or blame) to the classifiers. • Classifier systems can be used to solve a number of problems, including playing games and enabling a virtual robot to explore a virtual terrain.
Artificial Immune Systems • Systems modeled on the immune systems in humans and other biological creatures. • Used in anti-virus systems, for example. • Also applied in computer security, for solving combinatorial problems, and for machine learning problems.