
GA/ICA Workshop

Presentation Transcript


  1. GA/ICA Workshop Carla Benatti 3/15/2012

  2. Proposed Thesis Project
     • Tuning a Beam Line
       • Model/design of the system provides nominal values for the tune
       • Operators adjust each element individually to optimize the tune
       • Slow process, room for improvement
     • Tuning Algorithm and Optimizer
       • Develop a new, fast tuning algorithm (see the sketch below)
       • Possibly using neural networks and genetic algorithms
       • Model Independent Analysis
       • Benchmark code at ReA3
       • Design an experiment to test the optimizer
       • Compare results with tuning "by hand"
       • User-friendly application, possibly a GUI
     [Figure: LB source and L-line at ReA3; COSY envelope tracking calculation; elements LB004, LB006, L051, L054, L057, L061]
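The basic idea is to replace element-by-element hand tuning with an automated optimization loop. As a purely illustrative MATLAB/Octave sketch (not the proposed neural-network/genetic-algorithm optimizer), a simple accept-if-better random search against a toy "beam quality" function shows the shape of such a loop; the function, settings, and step size are all assumptions invented for illustration.

```matlab
% Hypothetical sketch of an automated tuning loop. The beam-quality function
% and element settings are toy stand-ins, not the ReA3 beam line or the
% thesis project's actual algorithm.
ideal    = [0.9 1.1 0.95 1.05];                 % unknown best settings (toy)
measure  = @(s) exp(-sum((s - ideal).^2));      % toy beam-quality measurement
settings = [1.0 1.0 1.0 1.0];                   % nominal values from the model/design
best     = measure(settings);
for iter = 1:500
    trial = settings + 0.05*randn(size(settings));  % small random adjustment of all elements
    score = measure(trial);
    if score > best                                 % keep only changes that improve the tune
        settings = trial;
        best     = score;
    end
end
disp(settings)                                  % tuned settings approach the toy optimum
```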

  3. Artificial Neural Network (ANN)
     • Neural Network Summary
       • Attempts to simulate the functionality of the brain in a mathematical model
       • Ideal for modeling complex relationships between inputs and outputs as a "black box" solver
       • Ability to learn, discern patterns, and model nonlinear data; reliability of prediction
       • Many models already developed for finding local and global minima in optimization
     • Neural Network Programming
       • A neuron receives weighted input; if the sum is above a threshold, it generates output through a nonlinear function
       • Connecting single neurons together creates a neural network (see the sketch below)
       • Learning/training: get the ANN to give a desired output, via supervised or unsupervised learning (GA example)
     • Basic ANN example: hierarchical structure, feed-forward network
     [Figure: perceptron and multilayer perceptron diagrams with input, hidden, and output layers; y = output, w = weights, x = inputs, b = threshold, φ = nonlinear function]
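To make the neuron and feed-forward structure concrete, here is a minimal MATLAB/Octave sketch of a network with one hidden layer, assuming a logistic sigmoid for the nonlinear function φ; the layer sizes and random weights are illustrative only and are not taken from the tutorial code.

```matlab
% Minimal sketch of the feed-forward pass described above: each neuron takes
% a weighted sum of its inputs, offsets it by a threshold, and passes the
% result through a nonlinear function. Sizes and values are illustrative.
phi = @(a) 1 ./ (1 + exp(-a));        % assumed nonlinear function (logistic sigmoid)

x  = [0.5; -1.2; 0.7];                % inputs x1..xN
W1 = randn(4, 3); b1 = randn(4, 1);   % hidden layer weights w and thresholds b
W2 = randn(1, 4); b2 = randn(1, 1);   % output layer weights and threshold

h = phi(W1*x - b1);                   % hidden-layer neuron outputs
y = phi(W2*h - b2);                   % network output y
```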

  4. Genetic Algorithms
     • Machine learning technique
     • Effective tool for complex problems, evolving creative and competitive solutions
     • Genetic algorithms search for the optimal set of weights and thresholds for the neurons
     • Crossover is the most used search operator in genetic programming (see the sketch below)
     • Genetic Modification Examples
       • Parents: (0.3, -0.8, -0.2, 0.6, 0.1, -0.1, 0.4, 0.5) and (0.7, 0.4, -0.9, 0.3, -0.2, 0.5, -0.4, 0.1)
       • Crossover: (0.7, 0.4, -0.9, 0.6, 0.1, -0.1, 0.4, 0.5)
       • Mutation: (0.7, 0.4, -0.9, 0.6, 0.1, -0.3, 0.4, 0.5)
     [Figure: GA flowchart with reproduction, elitism, and iterate/terminate steps]
     http://www.ai-junkie.com/ann/evolved/nnt7.html
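The crossover and mutation examples above can be reproduced in a few lines of MATLAB/Octave. The crossover point, mutation probability, and perturbation size below are assumptions chosen to match the slide's numbers, not values from the tutorial.

```matlab
% Sketch of single-point crossover and mutation on real-valued genomes
% (network weights/thresholds), using the parent vectors from the slide.
parent1 = [0.3 -0.8 -0.2 0.6  0.1 -0.1  0.4 0.5];
parent2 = [0.7  0.4 -0.9 0.3 -0.2  0.5 -0.4 0.1];

cut   = 3;                                          % assumed crossover point
child = [parent2(1:cut), parent1(cut+1:end)];       % crossover: 0.7 0.4 -0.9 0.6 0.1 -0.1 0.4 0.5

rate = 0.1;                                         % assumed per-gene mutation probability
hit  = rand(size(child)) < rate;                    % genes selected for mutation
child(hit) = child(hit) + 0.2*randn(1, nnz(hit));   % small random perturbation of those genes
```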

  5. SmartSweepers Tutorial Code
     • NeuralNet.m
     • NeuralNet_CalculateOutput.m
     • Genetic_Algorithm.m
     [Figure: best fitness and average fitness curves (see the sketch below)]
     http://www.ai-junkie.com/ann/
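The .m files listed are the tutorial's; the following is only a rough stand-alone MATLAB/Octave sketch (not the tutorial code) of how a GA loop with elitism, crossover, and mutation produces best-fitness and average-fitness curves per generation, using a toy fitness function and assumed population/mutation parameters.

```matlab
% Rough sketch of a GA training loop that records the fitness curves.
pop_size = 30;  n_genes = 8;  n_gen = 100;
pop = randn(pop_size, n_genes);                   % random initial genomes
fitness_fn = @(g) -sum(g.^2, 2);                  % toy fitness to maximize

best_fit = zeros(n_gen, 1);  avg_fit = zeros(n_gen, 1);
for gen = 1:n_gen
    fit = fitness_fn(pop);
    [fit_sorted, order] = sort(fit, 'descend');
    best_fit(gen) = fit_sorted(1);                % best fitness this generation
    avg_fit(gen)  = mean(fit);                    % average fitness this generation

    elite = pop(order(1:2), :);                   % elitism: carry over the top 2
    children = zeros(pop_size - 2, n_genes);
    for k = 1:pop_size - 2
        p   = pop(order(randi(ceil(pop_size/2), 1, 2)), :);  % two fitter parents
        cut = randi(n_genes - 1);
        child = [p(1, 1:cut), p(2, cut+1:end)];   % single-point crossover
        mask  = rand(1, n_genes) < 0.1;           % mutation mask
        child(mask) = child(mask) + 0.2*randn(1, nnz(mask));
        children(k, :) = child;
    end
    pop = [elite; children];
end

plot(1:n_gen, best_fit, 1:n_gen, avg_fit);        % fitness curves per generation
legend('Best Fitness', 'Average Fitness');
```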

  6. http://www.ai-junkie.com/index.html
     • Good source for learning about genetic algorithms and neural networks for the first time
     • Explains concepts in "plain English"
     • Goes through some coding examples to play with crossover/mutation parameters
