Particle Swarm Optimization (03/26/2008)
Particle Swarm Optimization (PSO) • Kennedy, J., Eberhart, R. C. (1995). Particle swarm optimization. Proc. IEEE International Conference on Neural Networks (Perth, Australia), IEEE Service Center, Piscataway, NJ, vol. IV, pp. 1942–1948.
Behavior of a Flock of Birds • Self-experience • Success of others
v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
x_id = x_id + v_id
PSO Equation
v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id), where the three terms are the inertia, the self-experience, and the success of others.
x_id = x_id + v_id
For the i-th particle: velocity v_i, position x_i, previous (personal) best position p_i, global best position p_g.
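As a concrete illustration of the update rule above, here is a minimal Python/NumPy sketch; the parameter values (w, c1, c2), the search range, and the sphere cost function are illustrative assumptions, not values from the slides.

```python
import numpy as np

def pso_minimize(cost, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Illustrative sketch of the basic PSO update, not the authors' exact code.
    x = np.random.uniform(-5, 5, (n_particles, dim))   # positions x_i
    v = np.zeros((n_particles, dim))                    # velocities v_i
    p = x.copy()                                        # personal bests p_i
    p_cost = np.array([cost(xi) for xi in x])
    g = p[np.argmin(p_cost)].copy()                     # global best p_g

    for _ in range(iters):
        r1 = np.random.rand(n_particles, dim)           # rand()
        r2 = np.random.rand(n_particles, dim)           # Rand()
        # v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
        # x_id = x_id + v_id
        x = x + v

        c = np.array([cost(xi) for xi in x])
        improved = c < p_cost
        p[improved], p_cost[improved] = x[improved], c[improved]
        g = p[np.argmin(p_cost)].copy()
    return g, p_cost.min()

# Example: minimize the sphere function f(x) = sum(x^2).
best_x, best_f = pso_minimize(lambda z: float(np.sum(z**2)))
```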
Optimization Problem • Parameter adjustment of a system: each candidate parameter set defines a system (System_1, System_2, System_3, ..., System_n) whose input/output behavior is evaluated; the n candidate systems correspond to n particles.
Particle Swarm Optimization
v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
x_id = x_id + v_id
(Figure: a particle moving across the cost landscape over successive iterations; the step from x_k to x_k+1 combines the inertia term with the pulls Vp and Vg toward the personal and global bests.)
Inertia Weight (w: inertia weight)
v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
x_id = x_id + v_id
(Figure: with a large inertia weight, the previous velocity from x_k-1 to x_k dominates the step to x_k+1; with a small inertia weight, the attractions Vp and Vg toward the personal and global bests dominate.)
Inertia Weight (w: inertia weight)
v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
x_id = x_id + v_id
• Large inertia weight: global search over the cost landscape.
• Small inertia weight: local search around the current best.
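To trade global search early in the run for local search late in the run, a common approach (the linearly decreasing schedule referenced in the experimental results below) reduces w linearly with the iteration count; the sketch assumes illustrative start/end values of 0.9 and 0.4.

```python
def linear_inertia(iteration, max_iter, w_start=0.9, w_end=0.4):
    # Linearly decreasing inertia weight: large w early (global search),
    # small w late (local search). The 0.9 -> 0.4 range is a common choice,
    # assumed here for illustration rather than taken from the slides.
    return w_start - (w_start - w_end) * iteration / max_iter
```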
Fuzzy Adaptive PSO • Shi, Y., Eberhart, R. C. (2001). "Fuzzy adaptive particle swarm optimization," Proc. IEEE Congress on Evolutionary Computation, vol. 1, pp. 101–106. • The inertia weight is adapted by a fuzzy system driven by the Normalized Current Best Performance Evaluation (NCBPE), i.e. the current best cost (CBPE) normalized between CBPEmin and CBPEmax, so that the swarm shifts between global search (large inertia weight) and local search (small inertia weight).
Fuzzy Adaptive PSO • A fuzzy system for adapting the inertia weight of PSO. (Figure: triangular Low/Medium/High (L/M/H) membership functions for the inputs NCBPE and current inertia weight, and for the output W_Change; a fuzzy rule base maps the inputs to a change in the inertia weight.)
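The sketch below is a simplified stand-in for such a fuzzy adapter, not the rule base or membership functions from the paper: triangular L/M/H memberships for NCBPE and the current weight, a small hypothetical rule table, and weighted-average defuzzification of W_Change.

```python
def tri(x, a, b, c):
    # Triangular membership function with corners a <= b <= c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_w_change(ncbpe, w, w_min=0.4, w_max=0.9):
    # L/M/H membership degrees for the two inputs (supports are assumptions).
    n = {"L": tri(ncbpe, -0.5, 0.0, 0.5), "M": tri(ncbpe, 0.0, 0.5, 1.0),
         "H": tri(ncbpe, 0.5, 1.0, 1.5)}
    wn = (w - w_min) / (w_max - w_min)          # normalize current weight to [0, 1]
    m = {"L": tri(wn, -0.5, 0.0, 0.5), "M": tri(wn, 0.0, 0.5, 1.0),
         "H": tri(wn, 0.5, 1.0, 1.5)}
    # Hypothetical rule table: poor progress (high NCBPE) with a small weight
    # increases w (more global search); good progress with a large weight
    # decreases w (more local search).
    change_value = {"L": -0.05, "M": 0.0, "H": +0.05}
    rules = {("L", "L"): "M", ("L", "M"): "L", ("L", "H"): "L",
             ("M", "L"): "M", ("M", "M"): "M", ("M", "H"): "L",
             ("H", "L"): "H", ("H", "M"): "H", ("H", "H"): "M"}
    num = den = 0.0
    for (ni, wi), out in rules.items():
        strength = min(n[ni], m[wi])            # rule firing strength: AND = min
        num += strength * change_value[out]
        den += strength
    return num / den if den > 0 else 0.0        # weighted-average defuzzification

# Usage: w = max(0.4, min(0.9, w + fuzzy_w_change(ncbpe, w)))
```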
Experimental Results • Minimization benchmarks comparing the linearly decreasing inertia weight with the fuzzy adaptive inertia weight. • The performance of PSO is not sensitive to the population size, and the scalability of PSO is acceptable.
Application Example 1 • Feature Training for Face Detection (Figure: the selected features evolving over Iteration 1, Iteration 2, ..., Iteration k.)
Application Example 2 • Neural Network Training • Gudise, V. G., Venayagamoorthy, G. K. (2003). Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks. Proc. IEEE Swarm Intelligence Symposium 2003 (SIS 2003), Indianapolis, IN, pp. 110–117.
Introduction of Neural Network
a_i = Σ_j W_ij X_j for i = 1..4, j = 1, 2, where X = [x 1]^T
d_i = 1 / (1 + e^(-a_i))
y = [V1 V2 V3 V4] [d1 d2 d3 d4]^T
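A minimal NumPy sketch of this 2x4x1 forward pass; the weight shapes follow the equations above, while the specific weight values are placeholders.

```python
import numpy as np

def forward(x, W, V, bias=1.0):
    # 2x4x1 network: input vector X = [x, 1]^T (the 1 is the bias input).
    X = np.array([x, bias])
    a = W @ X                       # a_i = sum_j W_ij X_j, with W of shape 4x2
    d = 1.0 / (1.0 + np.exp(-a))    # d_i = 1 / (1 + e^{-a_i})  (sigmoid)
    return V @ d                    # y = [V1 V2 V3 V4][d1 d2 d3 d4]^T

# Placeholder weights, just to show the shapes.
W = np.random.randn(4, 2)   # hidden-layer weights
V = np.random.randn(4)      # output-layer weights
y = forward(0.5, W, V)
```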
Neural Network Training • Backpropagation • PSO
Neural Network Training • Backpropagation • PSO • Parameter set of PSO
Training Results • Training a 2x4x1 neural network to fit y = 2x^2 + 1. (Figures: mean square error curves of the neural networks during training with BP and PSO for bias 1; test curves for the trained networks with fixed weights obtained from the BP and PSO training algorithms with bias 1.)
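As an end-to-end illustration, here is a hedged sketch of training the 2x4x1 network with PSO on y = 2x^2 + 1; the swarm size, w, c1, c2, iteration count, and training range are illustrative assumptions rather than the paper's settings, and it reuses the forward pass and update rule sketched earlier.

```python
import numpy as np

def forward(x, theta, bias=1.0):
    # Unpack the 12 parameters of the 2x4x1 net: 4x2 hidden weights + 4 output weights.
    W, V = theta[:8].reshape(4, 2), theta[8:12]
    d = 1.0 / (1.0 + np.exp(-(W @ np.array([x, bias]))))
    return V @ d

def mse(theta, xs, ys):
    # Mean square error of the network over the training samples.
    return float(np.mean([(forward(x, theta) - y) ** 2 for x, y in zip(xs, ys)]))

# Target function and training samples (range assumed for illustration).
xs = np.linspace(-1, 1, 21)
ys = 2 * xs ** 2 + 1

dim, n_particles, iters = 12, 30, 300
w, c1, c2 = 0.7, 1.5, 1.5                          # illustrative PSO parameters
pos = np.random.uniform(-1, 1, (n_particles, dim)) # each particle is a weight vector
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_cost = np.array([mse(p, xs, ys) for p in pos])
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(iters):
    r1, r2 = np.random.rand(n_particles, dim), np.random.rand(n_particles, dim)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    cost = np.array([mse(p, xs, ys) for p in pos])
    better = cost < pbest_cost
    pbest[better], pbest_cost[better] = pos[better], cost[better]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("final training MSE:", pbest_cost.min())
```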
Conclusions • The concept of PSO is introduced. • PSO is an extremely simple algorithm for global optimization problems. • Low memory cost. • Low computational cost. • A fuzzy system is implemented to dynamically adjust the inertia weight and improve the performance of PSO.