Adaptive Resonance Theory: Unsupervised Learning
Learning Objectives
• Introduction
• ART architecture
• ART implementation
• ART2
• Conclusion
Introduction
• Developed by Carpenter and Grossberg
• ART1: accepts binary data
• ART2: accepts continuous-valued data
• ART3: an improved ART
• Fuzzy ART: accepts fuzzy data
ART Architecture (1/20)
• 1. Brief description:
• a. Accepts an input vector and classifies it into one of a number of categories.
• b. The category it is assigned to depends on which stored pattern it most resembles.
• c. If the input vector does not match any of the stored patterns, a new category is created.
ART Architecture (2/20)
• d. If the input vector matches (within the vigilance level) one of the stored patterns, that pattern is adjusted (trained) to make it more like the input vector.
• 2. Simplified ART architecture:
• a. two layers: comparison (F1) and recognition (F2)
• b. three control functions: Gain 1, Gain 2, and Reset.
ART Architecture (4/20)
• 3. Function of each module:
• a. Comparison layer:
  - accepts the binary input vector X
  - initially passes X through to C, so C = X
  - a binary vector R produced by the recognition layer is fed back to modify C
ART Architecture (6/20)
• Each neuron in the comparison layer has 3 inputs:
  - X: the corresponding component of the input vector
  - P_j: the weighted sum of the recognition layer outputs
  - Gain 1: the same signal to all neurons
• The layer uses the "two-thirds" rule:
  => at least two of a neuron's three inputs must be one; otherwise the output is zero (a sketch follows).
• Initially, Gain 1 is set to one and all components of R are set to 0.
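A minimal sketch of the two-thirds rule over binary numpy vectors (function and variable names are hypothetical, not from the slides):

```python
import numpy as np

def comparison_output(x, p, gain1):
    # Two-thirds rule: a comparison neuron fires only if at least two
    # of its three inputs (x_i, p_i, Gain 1) are one.
    votes = x.astype(int) + p.astype(int) + int(gain1)
    return (votes >= 2).astype(int)
```

With Gain 1 = 1 and P = 0 this reduces to C = X; with Gain 1 = 0 it reduces to C = X AND P, matching the comparison-phase behavior described later.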
ART Architecture (7/20)
• b. Recognition layer:
  - computes the dot product of B_j and C
  - the neuron with the largest output wins
  - the winning neuron's output is set to one; all others are set to zero.
• Gain 2:
  - the OR of the components of X
ART Architecture (9/20)
• Gain 1 truth table:

  OR of X components | G2 | OR of R | G1
  ---------------------------------------
          0          |  0 |    0    |  0
          1          |  1 |    0    |  1
          1          |  1 |    1    |  0
          0          |  0 |    1    |  0
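The same table expressed as a small helper (a sketch; names are hypothetical):

```python
import numpy as np

def control_gains(x, r):
    # G2 is the OR of the input components; G1 is one only while an
    # input is present and the recognition layer is still silent,
    # reproducing the truth table above.
    g2 = int(x.any())
    g1 = int(x.any() and not r.any())
    return g1, g2
```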
ART Architecture (10/20)
• Reset
  - measures the similarity between X and C; if the match falls below the vigilance level, a Reset signal is generated (a sketch follows).
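A sketch of the Reset test, assuming the similarity measure is the match ratio ||C||/||X|| that the implementation slides use later:

```python
import numpy as np

def reset_fires(x, c, rho):
    # Reset fires when the fraction of X's ones that survive into C
    # falls below the vigilance level rho.
    return c.sum() / max(x.sum(), 1) < rho
```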
ART Architecture (11/20)
• 4. Operation
• a. Recognition phase
  - i) when no input vector X is present, G2 = 0
  - ii) G2 = 0 disables all neurons in the recognition layer => R = 0
  - iii) this makes sure all neurons in this layer start out in the same state.
ART Architecture (13/20)
• iv) Then vector X is applied; X must have at least one component equal to "1". (Because the OR of X is "1", G2 = 1; because the OR of R is "0", G1 = 1.)
• v) So G2 = G1 = 1, and C = X.
• vi) Compute the dot products B_j · C; the neuron j with the largest result fires "one", all others fire "zero".
• vii) This jth neuron has the output r_j of R equal to one, and all others equal to zero (a winner-take-all sketch follows).
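A sketch of the recognition step, assuming B is a (categories x m) matrix whose rows are the bottom-up weight vectors B_j:

```python
import numpy as np

def recognize(B, c):
    # net_j = B_j . C; winner-take-all: only the neuron with the
    # largest net input responds.
    net = B @ c
    return int(np.argmax(net))
```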
ART Architecture (15/20)
• b. Comparison phase
• i) r_j is fed back through the binary weights t_ji to each neuron in the comparison layer, providing an input signal P_i.
• ii) Now that R is no longer zero, G1 = 0, and by the 2/3 rule a component of C will be one only where both X and P are one.
ART Architecture (16/20)
• iii) If there is a substantial mismatch between X and P, C contains many zeros where X contains ones. This triggers the Reset function, which inhibits the output of the firing neuron in the recognition layer and disables it for the duration of the current classification (the gating step is sketched below).
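A sketch of the feedback gating in the comparison phase, assuming T holds the top-down binary weight rows (names hypothetical):

```python
import numpy as np

def gated_comparison(x, T, j):
    # With R nonzero, G1 = 0, so by the 2/3 rule c_i = 1 only where
    # both x_i and the fed-back expectation p_i = t_ji are one.
    p = T[j]
    return x * p
```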
ART Architecture (18/20)
• c. Search phase
• i) If no Reset signal is generated, the match is within the tolerance level and the classification is finished. Otherwise, other nodes must be searched.
• ii) To search other nodes, let r_j = 0 => R = 0 => G1 = 1 => C = X again.
• iii) A different neuron wins => a different pattern P is fed back to the comparison layer.
ART Architecture (19/20)
• If a stored pattern is found, the network enters a training cycle.
• If all stored patterns have been tried and none matches, a new neuron is assigned and its B_j and T_j are set (the full loop is sketched below).
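A compact sketch of the whole search phase under the assumptions above. One simplification: since C resets to X before each new trial, the net inputs B_j · X do not change between trials, so candidates can be ranked once instead of re-running the network after every Reset:

```python
import numpy as np

def search(x, B, T, rho):
    # Try recognition-layer winners in order of net input B_j . X,
    # resetting (skipping) any whose stored pattern fails vigilance.
    order = np.argsort(-(B @ x))
    for j in order:
        c = x * T[j]                       # comparison-phase gating
        if c.sum() / max(x.sum(), 1) >= rho:
            return j, c                    # resonance: train category j
    return None, None                      # no stored pattern matches
```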
ART Architecture (20/20) • d. Performance issues: • i) sequential search • ii) stabilization
ART Implementation (1/3)
• 1. Initialization of T_j, B_j, and the vigilance level ρ
• B_j:
  - 0 < b_ij < L/(L - 1 + m) for all i, j
  - m: the number of components in the input vector
  - L: a constant > 1 (typically L = 2)
  - all b_ij take the same value
• T_j: t_ij = 1 for all i, j
• ρ: 0 < ρ < 1; training can begin with coarse distinctions (low ρ) and move to fine distinctions (high ρ). A sketch follows.
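A sketch of the initialization following the bounds on this slide; the particular b_ij value below is an arbitrary choice inside the allowed interval:

```python
import numpy as np

def init_art1(m, n_categories, L=2.0):
    # All b_ij equal and strictly below L / (L - 1 + m); all t_ij = 1.
    b0 = 0.5 * L / (L - 1 + m)
    B = np.full((n_categories, m), b0)
    T = np.ones((n_categories, m), dtype=int)
    return B, T
```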
ART Implementation (2/3)
• 2. Recognition
  - NET_j = B_j · C
  - OUT_j = 1 if NET_j > T; 0 otherwise (T a threshold)
• 3. Comparison
  - C is recomputed by the 2/3 rule: c_i = x_i · t_ji for the winning neuron j
• 4. Search
  - Searching continues until either a pattern is matched or no stored pattern matches.
ART Implementation (3/3)
• 5. Training
• If the input X is matched by neuron j, its weights are updated:
  - b_ij = L · c_i / (L - 1 + Σ_k c_k)
  - for the newly stored T_j: t_ij = c_i
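The same update as a sketch (fast learning, with c the gated comparison vector of the resonating neuron j):

```python
import numpy as np

def train(B, T, j, c, L=2.0):
    # Store the matched pattern top-down, renormalize bottom-up.
    T[j] = c
    B[j] = L * c / (L - 1 + c.sum())
    return B, T
```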
ART Algorithm (1/4)
• 1. Initialization
  - L > 1, 0 < ρ < 1
  - 0 < b_ij < L/(L - 1 + m) for all i, j
  - t_ij = 1 for all i, j
• 2. While the stopping condition is false, do steps 3-12:
• 3. For each training pattern X, do steps 4-11:
• 4. Set R = 0, C = X
• 5. For each node j in the recognition layer that is not inhibited (r_j ≠ -1):
ART Algorithm (2/4)
• 6. If r_j ≠ -1, set r_j = B_j · C
• 7. While Reset is true, do steps 8-10:
• 8. Find the winning neuron J; if r_J = -1, this pattern cannot be clustered.
• 9. Let P = T_J and compute C: c_i = x_i · t_Ji
ART Algorithm (3/4)
• 10. Vigilance test: if ||C|| / ||X|| < ρ (where ||C|| = Σ_i c_i), set r_J = -1 (inhibit J); Reset stays true. Otherwise Reset is false.
• 11. Update B_J: b_iJ = L · c_i / (L - 1 + ||C||); set T_J = C
• 12. Test for the stopping conditions (below).
ART Algorithm (4/4)
** Stopping conditions:
• a. no weight change
• b. maximum number of epochs reached
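Putting the sketches above (init_art1, search, train) together on a toy binary data set; the data and rho value are made up for illustration:

```python
import numpy as np

X = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 0, 1, 1]])

B, T = init_art1(m=4, n_categories=3)
for x in X:
    j, c = search(x, B, T, rho=0.7)
    if j is None:
        print("pattern cannot be clustered:", x)
    else:
        B, T = train(B, T, j, c)
        print("pattern", x, "-> category", j)
```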
ART2 (1/9)
• Several sublayers replace the F1 layer
• Update functions: given with the F1 update steps in the algorithm below
• Parameters:
  - m: dimension of the input vector
  - n: number of cluster units
ART2 (3/9)
• Parameters (continued):
  - a, b: fixed weights in F1; cannot be zero.
  - c: fixed weight used in testing for Reset, 0 < c < 1.
  - d: output activation of the winning F2 unit
  - e: a small parameter to prevent division by zero when the norm of a vector is zero.
  - θ: noise suppression parameter (sketched below)
  - α: learning rate
  - ρ: vigilance level
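A sketch of the noise-suppression activation these parameters imply, assuming the usual piecewise-linear form f(x) = x for x ≥ θ and 0 otherwise:

```python
import numpy as np

def f(x, theta):
    # Components below the noise threshold theta are suppressed to zero.
    return np.where(x >= theta, x, 0.0)
```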
ART2 (4/9)
• Algorithm:
• a. Initialization: set a, b, c, d, e, θ, α, ρ
• b. Perform the desired number of epochs of training
• c. For each input pattern S, do steps d. to m.
• d. Update F1 activations:
  - u_i = 0, w_i = s_i, p_i = 0, q_i = 0
  - x_i = s_i / (e + ||S||), v_i = f(x_i)
ART2 (5/9)
• d. Update F1 again:
  - u_i = v_i / (e + ||V||), w_i = s_i + a·u_i, p_i = u_i
  - x_i = w_i / (e + ||W||), q_i = p_i / (e + ||P||), v_i = f(x_i) + b·f(q_i)
• e. Compute the signal to F2:
  - y_j = Σ_i b_ij · p_i
ART2 (6/9)
• f. While Reset is true, do steps g. to h.
• g. Find the winning neuron J in F2 (the largest y_j)
• h. Check for Reset:
  - u_i = v_i / (e + ||V||), p_i = u_i + d·t_Ji
  - r_i = (u_i + c·p_i) / (e + ||U|| + c·||P||)
ART2 (7/9)
• i) If ||R|| < ρ - e, then y_J = -1 (inhibit J); Reset is true; repeat f.
• ii) If ||R|| >= ρ - e, then
  - w_i = s_i + a·u_i, x_i = w_i / (e + ||W||)
  - q_i = p_i / (e + ||P||), v_i = f(x_i) + b·f(q_i)
  - Reset is false.
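A sketch of this Reset check, with u and p the F1 sublayer vectors and c, e the fixed parameters (variable names follow the slides):

```python
import numpy as np

def art2_reset(u, p, c, e, rho):
    # r_i = (u_i + c p_i) / (e + ||U|| + c ||P||); Reset fires when
    # ||R|| drops below rho - e.
    r = (u + c * p) / (e + np.linalg.norm(u) + c * np.linalg.norm(p))
    return np.linalg.norm(r) < rho - e
```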
ART2 (8/9)
• i. Do steps j. to l. for the specified number of learning iterations.
• j. Update the weights for neuron J:
  - t_Ji = α·d·u_i + (1 + α·d·(d - 1))·t_Ji
  - b_iJ = α·d·u_i + (1 + α·d·(d - 1))·b_iJ
• k. Update F1 (repeat the update equations of step d.)
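A sketch of the step-j update for the winning neuron J, assuming the standard ART2 learning rule with learning rate alpha and F2 activation d:

```python
import numpy as np

def art2_update(b_J, t_J, u, alpha, d):
    # Both weight vectors for the winning unit J move toward d * u
    # (standard ART2 form, stated here as an assumption).
    t_J = alpha * d * u + (1 + alpha * d * (d - 1)) * t_J
    b_J = alpha * d * u + (1 + alpha * d * (d - 1)) * b_J
    return b_J, t_J
```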
ART2 (9/9)
• l. Test the stopping condition for weight updates.
• m. Test the stopping condition for the number of epochs.