Constrained optimization algorithms
• Gradient Search (DFP, SQP)
• Genetic Algorithms
• Simulated Annealing
• Simplex (Nelder-Mead)
• Differential Evolution Algorithm
• Self-adaptive Response Surface (IOSO) & NNA
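As a minimal sketch of two of the methods listed above (an illustrative example, not taken from the slides), SciPy offers an SQP-type gradient search (SLSQP) and a differential evolution solver that can be run on the same small constrained problem:

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution, NonlinearConstraint

def objective(x):
    # Simple quadratic objective chosen purely for illustration
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

bounds = [(0.0, 4.0), (0.0, 4.0)]

# Inequality constraint x0 + x1 <= 3, written in the forms each solver expects
sqp_con = {"type": "ineq", "fun": lambda x: 3.0 - (x[0] + x[1])}
de_con = NonlinearConstraint(lambda x: x[0] + x[1], -np.inf, 3.0)

# Gradient search via SciPy's SQP implementation (SLSQP)
sqp = minimize(objective, x0=[2.0, 0.0], method="SLSQP",
               bounds=bounds, constraints=[sqp_con])

# Population-based differential evolution on the same problem
de = differential_evolution(objective, bounds, constraints=(de_con,), seed=0)

print("SQP solution:", sqp.x)  # both should land near (0.75, 2.25)
print("DE  solution:", de.x)
```

On this smooth, single-minimum problem both solvers agree; the distinction between the two families only matters once the objective has many local minima, as discussed next.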
Why semi-stochastic optimization? Because gradient-based optimization, which converges only to the nearest local minimum, cannot reliably solve such multi-extremal, multi-objective constrained problems.
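A quick sketch of that failure mode (an assumed example, not from the slides): on a classic multi-extremal test function, a gradient-based solver stops at whichever local minimum is closest to its starting point, while a simulated-annealing-style global search does not get trapped there.

```python
import numpy as np
from scipy.optimize import minimize, dual_annealing

def rastrigin(x):
    # Classic multi-extremal test function; the global minimum is 0 at x = 0
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 2

local = minimize(rastrigin, x0=[3.0, 3.0], method="BFGS")   # pure gradient search
globl = dual_annealing(rastrigin, bounds, seed=0)           # semi-stochastic search

print("BFGS from (3, 3):  f =", round(local.fun, 3))  # stuck near a local minimum
print("dual_annealing:    f =", round(globl.fun, 3))  # close to the global minimum 0
```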
The self-adapting response-surface formulation used in this optimizer accommodates the realistic, non-smooth variations present in experimentally obtained data and interpolates such data accurately.
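The exact IOSO response-surface formulation is proprietary, so the sketch below is only a hedged stand-in: a radial basis function surrogate (SciPy's RBFInterpolator) fitted to scattered synthetic "experimental" samples, illustrating the same idea of interpolating non-smooth data at points where no experiment was run.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(40, 2))   # synthetic stand-ins for alloy-composition samples
# Non-smooth synthetic response standing in for measured property data
values = np.abs(samples[:, 0]) + np.sin(3 * samples[:, 1])

# Fit a radial basis function surrogate through the scattered samples
surrogate = RBFInterpolator(samples, values, kernel="thin_plate_spline")

# Evaluate the surrogate at compositions where no experiment was performed
query = np.array([[0.2, -0.4], [0.75, 0.1]])
print(surrogate(query))
```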
The main benefits of this algorithm are its outstanding reliability in avoiding local minima, its computational speed, and the significantly smaller number of experimentally evaluated alloy samples it requires compared with more traditional optimizers such as genetic algorithms.
Parallel Computer of a “Beowulf” type
• Based on commodity hardware and public-domain software
• 16 dual Pentium II 400 MHz and 11 dual Pentium 500 MHz PCs
• Total of 54 processors and 10.75 GB of main memory
• 100 Megabit/second switched Ethernet using MPI and Linux
• Compressible Navier-Stokes equations solved at 1.55 Gflop/s with an LU-SSOR solver on a 100x100x100 structured grid on 32 processors (performance comparable to a Cray C-90)
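The original LU-SSOR solver would have been compiled Fortran or C running over MPI; purely as an illustration of the kind of domain decomposition such a code uses, here is a minimal mpi4py sketch that splits a 100x100x100 structured grid into slabs across the available ranks. The file name, the slab-along-i layout, and the use of Python are assumptions for illustration only (launch with, e.g., mpiexec -n 32 python decompose.py).

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

NX = NY = NZ = 100                                   # global structured grid
# Distribute the NX i-planes as evenly as possible across the ranks
counts = [NX // size + (1 if r < NX % size else 0) for r in range(size)]
start = sum(counts[:rank])                           # first i-plane owned by this rank
local = np.zeros((counts[rank], NY, NZ))             # local slab of the solution field

print(f"rank {rank}/{size}: i-planes {start}..{start + counts[rank] - 1}, "
      f"{local.nbytes / 1e6:.1f} MB")

# A real solver would now exchange ghost i-planes with neighbouring ranks
# on every iteration and update its slab of the flow field.
```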