Adaptive Hopfield Network Dr. Gürsel Serpen Associate Professor Electrical Engineering and Computer Science Department University of Toledo Toledo, Ohio, USA
Presentation Topics • Motivation for research • Classical Hopfield network (HN) • Adaptation – Gradient Descent • Adaptive Hopfield Network (AHN) • Static Optimization with AHN • Results and Conclusions FOR MORE INFO... Serpen et al., upcoming journal article (Inshallah!) http://www.eecs.utoledo.edu/~serpen
Motivation • The classical Hopfield neural network (HN) has been shown to have the potential to address a very large spectrum of static optimization problems. • The classical HN is NOT trainable: this implies that it canNOT learn from prior search attempts. • A hardware realization of the Hopfield network is very attractive for real-time, embedded computing environments. • Is there a way (e.g., training or adaptation) to incorporate the experience gained from prior search attempts into the network dynamics (weights), to help the network focus on promising regions of the overall search space?
Research Goals • Propose gradient-descent based procedures to “adapt” the weights and constraint weighting coefficients of the HN. • Develop an indirect procedure to define “pseudo” values for desired neuron outputs (much like the way desired output values are defined for hidden-layer neurons in an MLP). • Develop space-efficient schemes to store the symmetric weight matrix (upper/lower triangular) for large-scale problem instances. • Apply (through simulation) the adaptive HN algorithm to (large-scale) static optimization problems.
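The triangular-storage goal above admits a compact sketch: a symmetric n×n weight matrix can be held in a flat array of n(n+1)/2 entries with a simple index map (the helper name `packed_index` is ours, not from the slides):

```python
def packed_index(i, j, n):
    """Map (i, j) into a 1-D array holding only the upper triangle
    (including the diagonal) of a symmetric n x n matrix, row-major.
    Storage drops from n*n to n*(n+1)//2 entries."""
    if i > j:
        i, j = j, i  # exploit symmetry: w[i][j] == w[j][i]
    # rows 0..i-1 contribute (n - r) entries each; then offset j - i within row i
    return i * n - i * (i - 1) // 2 + (j - i)
```

For a problem with tens of thousands of neurons this roughly halves the weight-matrix storage, which is the kind of saving the slide is after.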
Classical Hopfield Net Dynamics • Number of neurons • Neuron dynamics • Sigmoid function
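The equations on this slide were rendered as images in the source; as a hedged reconstruction, the standard continuous Hopfield dynamics for N neurons (with u_i the net input, v_i the output, w_ij the weights, b_i the bias, and λ the sigmoid gain — symbol names assumed) read:

```latex
% Neuron dynamics (continuous Hopfield model, i = 1, ..., N):
\frac{du_i}{dt} = -u_i + \sum_{j=1}^{N} w_{ij}\, v_j + b_i
% Sigmoid output function:
v_i = \sigma(u_i) = \frac{1}{1 + e^{-u_i/\lambda}}
```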
Weights (Interconnection) – Redefined • Liapunov function (generic form) • Decomposed weights defined
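Again as a hedged reconstruction of the image-rendered equations: the generic Liapunov (energy) function of the classical Hopfield network, and a decomposition of the weights over the problem's constraint terms with weighting coefficients g_s (notation assumed), are typically written:

```latex
% Generic Liapunov (energy) function:
E = -\frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} w_{ij}\, v_i v_j - \sum_{i=1}^{N} b_i v_i
% Weights decomposed over constraint terms, g_s the weighting coefficients:
w_{ij} = \sum_{s} g_s\, w_{ij}^{(s)}, \qquad b_i = \sum_{s} g_s\, b_i^{(s)}
```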
Adaptive Hopfield Net Pseudocode • Initialization • Initialize network constraint weighting coefficients. • Initialize weights. • Initialize Hopfield net neuron outputs (randomly). • Adaptive Search – Relaxation • Relax Hopfield dynamics until convergence to a fixed point. • Adaptive Search – Adaptation • Relax adjoint network until convergence to a fixed point. • Update weights. • Update constraint weighting coefficients. • Termination Criteria • If not satisfied, continue with Adaptive Search.
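The pseudocode can be sketched in code as follows; this is a minimal reading of ours, with the adaptation steps left as placeholders (the helper names and the discretization step `dt` are assumptions, not from the slides):

```python
import numpy as np

def sigmoid(u, gain=1.0):
    """Sigmoid neuron output function."""
    return 1.0 / (1.0 + np.exp(-u / gain))

def relax_hopfield(W, b, v, dt=0.01, tol=1e-6, max_steps=20000):
    """Relax discretized Hopfield dynamics du/dt = -u + W v + b to a fixed point."""
    u = np.zeros_like(v)
    for _ in range(max_steps):
        u = u + dt * (-u + W @ v + b)
        v_new = sigmoid(u)
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

def adaptive_hopfield_search(W, b, n_epochs=3, seed=0):
    """Skeleton of the slide's pseudocode: init, then repeated
    relax/adapt cycles; the adaptation rules are placeholders."""
    rng = np.random.default_rng(seed)
    v = rng.random(b.shape)              # random initial neuron outputs
    for _ in range(n_epochs):
        v = relax_hopfield(W, b, v)      # Adaptive Search: relax to fixed point
        # Adaptation would go here:
        #   z = relax_adjoint(...)       # relax adjoint net to its fixed point
        #   W = W + delta_W(z, v)        # update weights
        #   g = g + delta_g(z, v)        # update constraint coefficients
    return v
```

The termination check of the pseudocode is modeled here simply as a fixed epoch count.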
Adaptation of Weights – Adjoint Hopfield Network • Adjoint network defined
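The adjoint-network equation was an image in the source. One standard Pineda-style adjoint for Hopfield dynamics of the form du_i/dt = -u_i + Σ_j w_ij v_j + b_i is sketched below, with z_i the adjoint activations and e_i the error signal at neuron i for a problem-specific error function ε (notation assumed):

```latex
% Adjoint (error-propagation) network, relaxed to its own fixed point:
\frac{dz_i}{dt} = -z_i + \sigma'(u_i) \left( \sum_{j=1}^{N} w_{ji}\, z_j + e_i \right),
\qquad e_i = \frac{\partial \varepsilon}{\partial v_i}
```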
Adaptation of Weights – Recurrent BackProp • Weight update – recurrent backprop
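The weight-update equation was likewise an image; in the standard recurrent-backprop form (a hedged reconstruction, with z_i the fixed-point adjoint activations, v_j the Hopfield fixed-point outputs, ε the problem-specific error function, and η a learning rate), gradient descent gives:

```latex
% Recurrent-backprop weight update, evaluated at the fixed points:
\Delta w_{ij} = -\eta\, \frac{\partial \varepsilon}{\partial w_{ij}} = -\eta\, z_i\, v_j
```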
Adaptation of Constraint Weighting Coefficients • Gradient descent adaptation rule • Error function – problem-specific and redefined
Adaptation of Constraint Weighting Coefficients (cont.) • Partial derivative – readily computable • Final form of the coefficient update rule
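The coefficient-update equations were images in the source. Given a weight decomposition w_ij = Σ_s g_s w_ij^(s) over the constraint terms, the chain rule makes the partial readily computable from the same adjoint quantities, and the update rule follows by gradient descent (hedged reconstruction; z_i the adjoint fixed point, v_j the Hopfield fixed point, η_g a separate learning rate, all notation assumed):

```latex
% Partial derivative via the weight decomposition (chain rule):
\frac{\partial \varepsilon}{\partial g_s}
  = \sum_{i,j} \frac{\partial \varepsilon}{\partial w_{ij}}\, w_{ij}^{(s)}
  = \sum_{i,j} z_i\, v_j\, w_{ij}^{(s)}
% Final form of the coefficient update rule:
g_s \leftarrow g_s - \eta_g \sum_{i,j} z_i\, v_j\, w_{ij}^{(s)}
```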
Mapping a Static Optimization Problem • Generic partial derivative • Problem-specific partial derivative
Simulation Study • Traveling Salesman Problem • Preliminary work at this time • Simulations performed with up to 100 cities • Computing resources – Ohio Supercomputing Center • Preliminary findings suggest that the theoretical framework is sound and the projections are valid • Computational cost (weight matrix size) poses a significant challenge for simulation purposes – ongoing research effort • Work currently in progress
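To make the weight-matrix-size point concrete, here is a hedged sketch of the classic Hopfield–Tank mapping of an n-city TSP onto n² neurons (the penalty coefficients A–D and the function name are our assumptions; the slides' exact mapping may differ):

```python
import numpy as np

def tsp_hopfield_weights(d, A=500.0, B=500.0, C=200.0, D=500.0):
    """Hopfield-Tank style weight matrix for an n-city TSP.
    Neuron (x, i) means: city x occupies tour position i.
    d is the symmetric n x n inter-city distance matrix."""
    n = d.shape[0]
    idx = lambda x, i: x * n + i          # flatten (city, position) to neuron id
    W = np.zeros((n * n, n * n))
    for x in range(n):
        for i in range(n):
            for y in range(n):
                for j in range(n):
                    w = 0.0
                    if x == y and i != j:
                        w -= A            # one tour position per city
                    if i == j and x != y:
                        w -= B            # one city per tour position
                    w -= C                # global: exactly n active neurons
                    if x != y and (j == (i + 1) % n or j == (i - 1) % n):
                        w -= D * d[x, y]  # tour-length term, adjacent positions
                    W[idx(x, i), idx(y, j)] = w
    return W
```

For n = 100 this matrix has 10⁸ entries (roughly 800 MB in float64), which illustrates why the slides pursue triangular storage and call the weight-matrix size a significant simulation challenge.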
Conclusions • An adaptation mechanism that modifies the constraint weighting coefficients and the weights of the classical Hopfield network was proposed. • A mathematical characterization of the adaptive Hopfield network was presented. • Preliminary simulation results suggest that the proposed adaptation mechanism is effective in guiding the Hopfield network towards high-quality feasible solutions of large-scale static optimization problems. • We are also exploring the incorporation of a computationally viable stochastic search mechanism to further improve the quality of solutions computed by the adaptive Hopfield network while preserving its parallel computation capability.
Thank You! • Questions? We gratefully acknowledge the computing-resources grant provided by the State of Ohio Supercomputing Center (USA), which facilitated the simulation study. We also appreciate the support provided by the Kohler Internationalization Awards Program at the University of Toledo, which facilitated this conference presentation.