Optimization of thermal processes 2007/2008
Lecture 12
Maciej Marek
Czestochowa University of Technology, Institute of Thermal Machinery
Overview of the lecture
Overview of some modern methods of optimization:
• Genetic algorithms (GA)
• Simulated annealing
• Neural-network-based optimization
Genetic algorithms (introduction)
If a design problem is characterised by:
• mixed continuous and discrete variables
• discontinuous or nonconvex design spaces (feasible regions)
then standard techniques may be inefficient. It is also possible that only the relative (local) optimum closest to the starting point will be found.
[Figure: a design space split into two disjoint feasible regions]
Genetic algorithms (GA) can in many cases find the global optimum with high probability.
Genetic algorithms (introduction)
Genetic algorithms are based on Darwin's theory of survival of the fittest (natural selection). Charles Darwin (1809-1882)
[Diagram: a population of binary-string solutions (101001000101, 101000100111, 001000100101, ...) evolves toward a better solution through three operators: reproduction (only "good" solutions may reproduce, and "good" solutions are reproduced in next generations), cross-over (two parent strings exchange substrings to produce offspring) and mutation (a random change in the solution).]
Genetic algorithms (introduction)
Characteristics of GA:
• A population of trial design vectors is used for the starting procedure (so the method is less likely to get trapped in a local optimum)
• GA use only the value of the objective function (direct method)
• Design variables are represented as strings of binary digits – naturally applicable to integer programming. Continuous variables have to be approximated by discrete ones
• The objective function value plays the role of fitness
• In every new generation (iteration):
• Parents are selected at random (from sufficiently good solutions)
• Crossover occurs and new solutions are obtained
• GA is not just a random search technique – solutions with a better value of the objective function (better fitness) are favoured
A minimal sketch of one such generational loop is given below.
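As an illustration, a minimal GA loop in Python. All names, the objective function and the parameter values are hypothetical, chosen only for demonstration; a tournament pick is used for the selection step for brevity (the proportional selection from the following slides works equally well):

```python
# A minimal genetic-algorithm sketch (illustrative only; names and
# parameter values are hypothetical, not from the lecture).
import random

STRING_LEN = 20          # bits per design variable
POP_SIZE = 30
GENERATIONS = 50

def decode(bits, lo=0.0, hi=10.0):
    """Map a binary string to a continuous variable in [lo, hi]."""
    value = int("".join(map(str, bits)), 2)
    return lo + (hi - lo) * value / (2**STRING_LEN - 1)

def objective(x):
    """Hypothetical objective to minimize."""
    return (x - 3.7)**2

def fitness(bits):
    return -objective(decode(bits))   # maximize fitness = minimize f

def select(pop):
    """Pick one parent, favouring fitter strings (tournament of 2)."""
    a, b = random.sample(pop, 2)
    return a if fitness(a) > fitness(b) else b

def crossover(p1, p2):
    site = random.randrange(1, STRING_LEN)     # random crossover site
    return p1[:site] + p2[site:], p2[:site] + p1[site:]

def mutate(bits, rate=0.01):
    return [b ^ 1 if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(STRING_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    nxt = []
    while len(nxt) < POP_SIZE:
        c1, c2 = crossover(select(pop), select(pop))
        nxt += [mutate(c1), mutate(c2)]
    pop = nxt
print("best x ~", decode(max(pop, key=fitness)))
```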
Genetic algorithms – representation of design variables
In GA the design variables are represented as strings of binary numbers, 0 and 1, e.g. a string of length 20:
1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0
A continuous variable is represented with q binary digits (the resolution depends on q): the binary number formed by the string is converted to a decimal number m, and the variable x in the range [x_min, x_max] is recovered as
x = x_min + (x_max - x_min) * m / (2^q - 1)
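A short Python sketch of this decoding (the range [x_min, x_max] and the example string are hypothetical):

```python
# Decode a binary string into a continuous variable (illustrative sketch).
def decode(bits, x_min, x_max):
    q = len(bits)                          # resolution: q binary digits
    m = int("".join(map(str, bits)), 2)    # binary -> decimal
    return x_min + (x_max - x_min) * m / (2**q - 1)

bits = [1,0,0,1,0,0,0,0,1,1,0,0,0,0,1,0,0,1,0,0]   # string of length 20
print(decode(bits, 0.0, 1.0))              # a value in [0, 1]
```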
Genetic algorithms – representation of objective function and constraints
GA finds the solution of an unconstrained problem. To solve a constrained minimization problem
Minimize f(X) subject to g_j(X) <= 0, j = 1, ..., m
two transformations have to be made:
• transformation into an unconstrained problem, e.g. with the use of the penalty function method:
phi(X) = f(X) + sum_j r_j * max(0, g_j(X))^2
where r_j is the penalty parameter
• transformation into maximization of the fitness function:
F(X) = phi_max - phi(X)
where phi_max is the largest value of phi in the population.
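A sketch of these two transformations in Python (the objective, the constraint and the penalty parameter value are hypothetical):

```python
# Penalty-function and fitness transformations (illustrative sketch).
def objective(x):
    return (x - 2.0)**2          # hypothetical f(X) to minimize

def constraint(x):
    return x - 1.5               # hypothetical g(X) <= 0

def phi(x, r=100.0):
    # Unconstrained substitute: f plus a quadratic penalty for violation
    return objective(x) + r * max(0.0, constraint(x))**2

def fitness_values(population):
    # Convert minimization of phi into maximization of fitness F
    phis = [phi(x) for x in population]
    phi_max = max(phis)          # largest phi in the current population
    return [phi_max - p for p in phis]

print(fitness_values([0.5, 1.0, 1.4, 3.0]))
```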
Genetic algorithms – genetic operators
Reproduction
Every solution in the population of K solutions (e.g. 101001000101, 101000100111, ...) has a value of the fitness function: F_1, F_2, ..., F_K. Strings are selected for reproduction with the probability
p_i = F_i / (F_1 + F_2 + ... + F_K)
The larger the fitness function, the larger the probability of selection for reproduction.
Note:
• Highly fit individuals live and reproduce
• Less fit individuals "die"
Selected parents:
1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0
1 1 0 1 0 1 0 1 0 1 0 1 1 0 0 1 0 1 1 0
Now, crossover occurs (see the sketch below).
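A sketch of this proportional ("roulette-wheel") selection in Python (the fitness values are hypothetical):

```python
# Roulette-wheel selection: probability proportional to fitness (sketch).
import random

def select_index(fitness):
    total = sum(fitness)
    r = random.uniform(0.0, total)     # spin the wheel
    acc = 0.0
    for i, f in enumerate(fitness):
        acc += f
        if r <= acc:
            return i
    return len(fitness) - 1            # guard against round-off

fitness = [4.0, 1.0, 3.0, 2.0]         # F_1 ... F_K (hypothetical)
parents = [select_index(fitness) for _ in range(2)]
print("selected parents:", parents)
```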
Genetic algorithms – genetic operators
Crossover
The crossover site is selected at random and the parents exchange substrings:
Parent 1:    1 0 0 1 0 0 0 | 0 1 1 0 0 0 0 1 0 0 1 0 0
Parent 2:    1 1 0 1 0 1 0 | 1 0 1 0 1 1 0 0 1 0 1 1 0
Offspring 1: 1 0 0 1 0 0 0 | 1 0 1 0 1 1 0 0 1 0 1 1 0
Offspring 2: 1 1 0 1 0 1 0 | 0 1 1 0 0 0 0 1 0 0 1 0 0
The new strings are placed in the new population. The process is continued.
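In Python this single-point crossover might look as follows (a sketch; the parent strings are taken from the example above):

```python
# Single-point crossover: exchange substrings after a random site (sketch).
import random

def crossover(p1, p2):
    site = random.randrange(1, len(p1))   # crossover site, selected at random
    return p1[:site] + p2[site:], p2[:site] + p1[site:]

p1 = [1,0,0,1,0,0,0,0,1,1,0,0,0,0,1,0,0,1,0,0]
p2 = [1,1,0,1,0,1,0,1,0,1,0,1,1,0,0,1,0,1,1,0]
o1, o2 = crossover(p1, p2)
print(o1)
print(o2)
```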
Genetic algorithms – genetic operators
Mutation
Mutation is an occasional random alteration of a binary digit: at a location chosen at random, 0 is changed to 1 or 1 to 0, e.g.
Some design vector: 1 0 0 1 0 0 0 1 0 1 0 1 1 0 0 1 0 1 1 0
New design vector:  1 0 0 1 0 0 1 1 0 1 0 1 1 0 0 1 0 1 1 0 (the 7th bit was flipped)
Mutation introduces a random change in the genetic material. It helps to find the global optimum.
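A corresponding Python sketch:

```python
# Mutation: flip one binary digit at a randomly chosen location (sketch).
import random

def mutate(bits):
    new = bits[:]                      # copy the design vector
    loc = random.randrange(len(new))   # random location in the string
    new[loc] ^= 1                      # 0 -> 1 or 1 -> 0
    return new

vec = [1,0,0,1,0,0,0,1,0,1,0,1,1,0,0,1,0,1,1,0]
print(mutate(vec))
```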
Simulated annealing (introduction)
Simulated annealing belongs to random search methods. However, it is designed to move toward the global minimum of the objective function. To see the drawbacks of a "naive" random search method, let's consider the following algorithm:
1. Choose (at random) an initial starting point X1
2. Make random moves along each coordinate direction – go to the point X*
3. If f(X*) > f(X1), reject the point X* and find a new one. Otherwise, accept the point X* as the new starting point X1 and go to step 2.
4. Repeat until the objective function can't be reduced further.
The problem with such an algorithm is that it may get stuck in a local optimum.
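A sketch of this naive algorithm in Python (the objective function, the step size and the stopping rule are hypothetical):

```python
# Naive random search: accept a move only if it reduces f (sketch).
import random

def objective(x):
    return x**4 - 3*x**2 + x        # hypothetical multimodal function

x = random.uniform(-2.0, 2.0)       # step 1: random starting point
for _ in range(1000):               # step 4: repeat
    x_star = x + random.uniform(-0.1, 0.1)   # step 2: random move
    if objective(x_star) < objective(x):     # step 3: accept only descent
        x = x_star
print("final point:", x)            # may be only a local optimum
```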
Simulated annealing – naive random search method
[Figure: an objective function with a local and a global optimum. First step: increase of the objective function – point rejected. Second step: point accepted. Third step: point accepted, but it is the local optimum – we can't leave this point.]
Thus, in this version of the random search method, once we find a local optimum there is no way to leave this point.
Simulated annealing – the main concept
With the use of the simulated annealing technique, transitions out of a local minimum are possible. A move that reduces the objective function is accepted unconditionally. A move that increases it is accepted with a probability given by the Metropolis criterion:
P = exp(-df / (k*T))
where df > 0 is the increase of the objective function, T is the temperature and k is a scaling constant (the analogue of Boltzmann's constant). As T grows, P approaches 1. So, the larger the temperature, the less constrained are the movements.
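A sketch of the acceptance test in Python (parameter values hypothetical; the constant k is absorbed into T):

```python
# Metropolis acceptance criterion (illustrative sketch).
import math, random

def accept(df, T):
    if df <= 0.0:
        return True                  # reduction of f: always accept
    return random.random() < math.exp(-df / T)   # uphill move: maybe

print(accept(-0.5, 1.0))   # improvement -> True
print(accept(0.5, 10.0))   # high T -> accepted with high probability
print(accept(0.5, 0.01))   # low T -> almost always rejected
```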
Simulated annealing – the main concept
The algorithm starts with a high temperature (large value of T) and in the subsequent steps the temperature is reduced slowly. The global optimum is found with a high probability even for objective functions with many local minima. The change of T is defined by the so-called cooling schedule.
The name of the method is derived from the simulation of thermal annealing of solids (metals). Slow cooling of a heated solid ensures proper solidification with a highly ordered crystalline structure; rapid cooling causes defects inside the material. The analogy:
• naive random search – rapid cooling; high internal energy – local optimum
• simulated annealing – slow cooling; lowest internal energy – global minimum
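Putting the pieces together, a minimal simulated annealing loop with a geometric cooling schedule (the objective and all parameter values are hypothetical):

```python
# Minimal simulated annealing with a geometric cooling schedule (sketch).
import math, random

def objective(x):
    return x**4 - 3*x**2 + x            # hypothetical multimodal function

x = random.uniform(-2.0, 2.0)           # random starting point
T = 10.0                                # initial (high) temperature
while T > 1e-3:
    x_star = x + random.uniform(-0.5, 0.5)       # trial move
    df = objective(x_star) - objective(x)
    # Metropolis criterion: always accept descent, sometimes accept ascent
    if df <= 0.0 or random.random() < math.exp(-df / T):
        x = x_star
    T *= 0.99                           # cooling schedule: reduce T slowly
print("final point:", x)
```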
Simulated annealing – some of the features
• The quality of the final solution is not affected by the initial guess (however, the computational time may increase with a worse starting point).
• The objective function doesn't have to be regular (continuous, differentiable).
• The feasible region doesn't have to be convex (the convergence is not influenced by the convexity).
• The method can be used to solve mixed-integer, discrete or continuous problems.
• For problems with constraints a modified objective function may be formulated, just as in the case of genetic algorithms (i.e. the penalty function approach).
Neural-network-based optimization
A neural network is a parallel network of interconnected simple processors (neurons). A single neuron accepts a set of inputs x_i from other neurons and computes an output
a = f(w_1*x_1 + w_2*x_2 + ... + w_n*x_n)
where f is an activation function. The weights w_i are not specified in advance; they are determined in the learning process.
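A single neuron in Python (the weights, inputs and the choice of sigmoid activation are illustrative):

```python
# A single neuron: weighted sum of inputs passed through an activation (sketch).
import math

def neuron(inputs, weights):
    s = sum(w * x for w, x in zip(weights, inputs))   # weighted sum
    return 1.0 / (1.0 + math.exp(-s))                 # sigmoid activation

print(neuron([0.5, -1.0, 2.0], [0.1, 0.4, 0.3]))
```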
Neural-network-based optimization
Neurons may be connected to form multilayer networks.
[Figure: a multilayer network with an input layer, a hidden layer and an output layer.]
Such a network may be trained to "solve" specific problems.
Neural-network-based optimization
• The strengths of the various interconnections (weights) may be considered as the representation of the knowledge contained in the network
• The network is trained to minimize the error between the actual output of the output layer and the target output for all the input patterns
• The training is just selecting the weights w_i
• The learning schemes govern how the weights are to be varied to minimize the error
Possible usage:
• Train the network on a specific set of input patterns (supply input parameters and solutions to the given problems)
• Supply input parameters different from those of the training set
• The network should return the solution of the problem (at least approximate)
A small training sketch is given below.
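As an illustration, a minimal training loop for a single linear neuron: the weights are varied by gradient descent to reduce the squared error over the training patterns (the data, the learning rate and the linear model are all hypothetical):

```python
# Training as weight selection: gradient descent on the squared error (sketch).
# A single linear neuron y = w1*x1 + w2*x2 is fitted to target outputs.
patterns = [([1.0, 2.0], 5.0),     # (inputs, target) pairs, hypothetical
            ([2.0, 1.0], 4.0),
            ([0.0, 3.0], 6.0)]
w = [0.0, 0.0]                     # initial weights
lr = 0.02                          # learning rate

for _ in range(2000):
    for x, target in patterns:
        y = w[0]*x[0] + w[1]*x[1]           # actual output
        err = y - target                    # error vs. target output
        w[0] -= lr * err * x[0]             # vary weights to reduce error
        w[1] -= lr * err * x[1]
print("learned weights:", w)       # should approach w1 = 1, w2 = 2
```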
Thank you for your attention