
Presentation Transcript


  1. A Hierarchical Self-organizing Associative Memory for Machine Learning. Janusz A. Starzyk, Ohio University; Haibo He, Stevens Institute of Technology; Yue Li, O2 Micro Inc.

  2. Outline • Introduction; • Associative learning algorithm; • Memory network architecture and operation; • Simulation analysis; • Conclusion and future research;

  3. Introduction: A biological point of view. Memory is a critical component for understanding and developing naturally intelligent machines and systems. The question is: how? (Source: "The Computational Brain" by P. S. Churchland and T. J. Sejnowski.)

  4. Introduction: self-organizing learning array (SOLAR). Characteristics: * Self-organization * Sparse and local interconnections * Dynamically reconfigurable * Online data-driven learning. [Figure: neuron interconnections showing the nearest neighbour neuron, other neurons, remote neurons, and the system clock; II: information index, ID: information deficiency.]

  5. Introduction: from SOLAR to AM. [Figure: SOLAR uses feed-forward connections only; the AM adds feed-backward (feedback) connections.] • Characteristics: • Self-organization; • Sparse and local interconnections; • Feedback propagation; • Information inference; • Hierarchical organization; • Robust and self-adaptive; • Capable of both hetero-associative (HA) and auto-associative (AA) recall

  6. Outline • Introduction; • Associative learning algorithm; • Memory network architecture and operation; • Simulation analysis; • Conclusion and future research;

  7. Basic learning element: self-determination of the function value, with an example.

  8. Signal strength (SS) • Provides a coherent way to determine when to trigger an association; • Helps to resolve multiple feedback signals; Signal strength (SS) = |signal value - logic threshold| (SS range: [0, 1])
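
As a minimal sketch of this definition (assuming signal values normalized to [0, 1] and an illustrative logic threshold of 0.5; the function names are not from the paper), SS can be computed and used to keep only the strongest of several conflicting feedback signals:

    # Sketch only: signal strength (SS) as distance from a logic threshold,
    # used to pick the strongest of several conflicting feedback signals.
    def signal_strength(signal_value, logic_threshold=0.5):
        # SS = |signal value - logic threshold|
        return abs(signal_value - logic_threshold)

    def resolve_feedback(feedback_values, logic_threshold=0.5):
        # The feedback signal farthest from the threshold wins.
        return max(feedback_values, key=lambda v: signal_strength(v, logic_threshold))

    # Three conflicting feedback signals; 0.9 has the largest SS,
    # so it is the one that triggers the association.
    print(resolve_feedback([0.55, 0.2, 0.9]))   # -> 0.9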

  9. Three types of associations • IOA: Input only association; • OOA: Output only association; • INOUA: Input-output association;

  10. Probability-based associative learning algorithm • Case 1: Given the values of both inputs, decide the output value;

  11. Probability-based associative learning algorithm • Case 2: Given the value of one input and an undefined output, decide the value of the other input;

  12. Probability-based associative learning algorithm • Case 3: Given the value of the output, decide the values of both inputs;

  13. Probability-based associative learning algorithm • Case 4: Given the value of one input and the output, decide the other input value;
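
The per-case formulas appear on the slides only as graphics, so the sketch below is a hedged illustration of the idea rather than the authors' equations: a two-input, one-output binary element counts co-occurrences during training and answers each of the four cases by choosing the most frequently observed completion of the known values.

    # Illustration only: a two-input, one-output binary associative element
    # that learns joint counts and resolves Cases 1-4 by picking the most
    # probable completion of the known values.
    from collections import Counter
    from itertools import product

    class AssociativeElement:
        def __init__(self):
            self.counts = Counter()               # (in1, in2, out) -> occurrences

        def train(self, in1, in2, out):
            self.counts[(in1, in2, out)] += 1

        def _most_probable(self, known):
            # 'known' has None for undefined fields; return the pattern
            # consistent with the defined fields that occurred most often.
            candidates = [p for p in product((0, 1), repeat=3)
                          if all(k is None or k == v for k, v in zip(known, p))]
            return max(candidates, key=lambda p: self.counts[p])

        def case1(self, in1, in2):                # both inputs -> output
            return self._most_probable((in1, in2, None))[2]

        def case2(self, in1):                     # one input, undefined output -> other input
            return self._most_probable((in1, None, None))[1]

        def case3(self, out):                     # output -> both inputs
            return self._most_probable((None, None, out))[:2]

        def case4(self, in1, out):                # one input and the output -> other input
            return self._most_probable((in1, None, out))[1]

    # Example: learn an XOR-like association, then query Case 1 and Case 4.
    element = AssociativeElement()
    for a, b in product((0, 1), repeat=2):
        element.train(a, b, a ^ b)
    print(element.case1(1, 0))    # -> 1
    print(element.case4(1, 1))    # -> 0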

  14. Outline • Introduction; • Associative learning algorithm; • Memory network architecture and operation; • Simulation analysis; • Conclusion and future research;

  15. Network operations. [Figure: feed-forward and feedback operation of the network, shown over the input data and the network depth.]

  16. Memory operation. [Figure: memory operation on input data; legend: defined signal, undefined signal, recovered signal, signal resolved based on SS.]
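
As a hedged sketch of this recovery step (not the network code itself): defined signals pass through unchanged, and each undefined signal is filled with the feedback candidate of largest signal strength; feedback_for below is a hypothetical stand-in for whatever the feedback operation returns.

    # Sketch only: defined signals pass through unchanged; undefined signals
    # are recovered from feedback, with conflicts resolved by signal strength.
    def signal_strength(value, threshold=0.5):
        return abs(value - threshold)

    def recover(inputs, feedback_for, threshold=0.5):
        # inputs: floats, with None marking undefined signals.
        # feedback_for(i): candidate feedback values for position i.
        recovered = []
        for i, value in enumerate(inputs):
            if value is not None:
                recovered.append(value)                       # defined signal
            else:
                candidates = feedback_for(i)                  # undefined signal
                recovered.append(max(candidates,
                                     key=lambda c: signal_strength(c, threshold)))
        return recovered

    # Toy feedback source that always proposes two conflicting candidates.
    print(recover([0.9, None, 0.1], lambda i: [0.6, 0.05]))
    # -> [0.9, 0.05, 0.1]  (SS of 0.05 is 0.45, larger than SS of 0.6)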

  17. Outline • Introduction; • Associative learning algorithm; • Memory network architecture and operation; • Simulation analysis; • Conclusion and future research;

  18. Hetero-associative memory: Iris database classification. The Iris data set has 3 classes, 4 numeric attributes, and 150 instances. Both the features and the class identity labels are binarized with an N-bit sliding-bar coding mechanism. In our simulation: N = 80, L = 20, M = 30.
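
The meaning of N, L, and M is shown only in the slide graphics; under a common reading of sliding-bar coding, a scalar feature maps to an N-bit vector in which a bar of L consecutive ones is positioned in proportion to the normalized value. The sketch below follows that assumption with the slide's N = 80 and L = 20; the example value range is merely illustrative.

    # Hedged sketch of an N-bit sliding-bar code: a bar of L consecutive ones
    # whose position encodes the normalized feature value. One plausible
    # reading of the slide, not the authors' implementation.
    def sliding_bar(value, vmin, vmax, n_bits=80, bar_len=20):
        x = (value - vmin) / (vmax - vmin)          # normalize to [0, 1]
        start = round(x * (n_bits - bar_len))       # leftmost position of the bar
        return [1 if start <= i < start + bar_len else 0 for i in range(n_bits)]

    # Example: a 5.8 cm sepal length on the Iris range of 4.3-7.9 cm.
    code = sliding_bar(5.8, 4.3, 7.9)
    print(sum(code), code.index(1))                 # -> 20 ones, bar starting at bit 25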

  19. Neuron association pathway. Classification accuracy: 96%.

  20. Auto-associative memory: panda image recovery. A 64 x 64 binary panda image (one code value for a black pixel, another for a white pixel) is tested in two cases: 30% of the pixels missing (error: 0.4394%) and half of the image blocked (error: 2.42%). [Figure panels: original image (64x64 binary), 30% missing pixels, blocked half.]
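
To make the error figures concrete, the sketch below (illustrative only, not the experiment's code) masks 30% of the pixels of a 64 x 64 binary image, lets a placeholder recovery rule fill them in, and reports the fraction of pixels that differ from the original; a real associative memory's recall would replace the placeholder.

    # Illustration only: mask 30% of a 64x64 binary image and measure the
    # recovery error as the fraction of pixels that differ from the original.
    import random

    def mask_pixels(image, fraction=0.30, seed=0):
        # Return a copy with `fraction` of the pixels set to None (undefined).
        rng = random.Random(seed)
        flat = [p for row in image for p in row]
        for i in rng.sample(range(len(flat)), int(fraction * len(flat))):
            flat[i] = None
        n = len(image[0])
        return [flat[r * n:(r + 1) * n] for r in range(len(image))]

    def error_rate(original, recovered):
        a = [p for row in original for p in row]
        b = [p for row in recovered for p in row]
        return sum(x != y for x, y in zip(a, b)) / len(a)

    # Toy 64x64 checkerboard and a trivial "recovery" that sets every
    # undefined pixel to 1; a real associative memory would do far better.
    original = [[(r + c) % 2 for c in range(64)] for r in range(64)]
    masked = mask_pixels(original)
    recovered = [[1 if p is None else p for p in row] for row in masked]
    print(f"{100 * error_rate(original, recovered):.2f}% error")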

  21. Outline • Introduction; • Associative learning algorithm; • Memory network architecture and operation; • Simulation analysis; • Conclusion and future research;

  22. Conclusion and future research • Hierarchical associative memory architecture; • Probabilistic information processing, transmission, association and prediction; • Self-organization; • Self-adaptation; • Robustness;

  23. Future research. It is all about designing naturally intelligent machines! • Multiple-input (>2) association mechanisms; • Dynamic self-reconfiguration; • Hardware implementation; • Facilitating goal-driven learning; • Spatio-temporal memory organization. How far are we? 3DANN: a "brain on silicon" will not just be a dream or science fiction in the future! Picture sources: http://www.cs.utexas.edu/users/ai-lab/fai/ and Irvine Sensors Corporation (Costa Mesa, CA).
