Back to Basics: Classification and Inference Based on Input Feedback Structure
Tsvi Achler, Eyal Amir
Department of Computer Science, University of Illinois at Urbana-Champaign
AI -> AGI
• Ability to generalize, even if only the basics were learned
• Training distribution ≠ test distribution
• Avoid combinatorial explosion
• Allows complex networks
New Basic Computational Structure
• Based on massive feedback to inputs
• No emphasis on weight parameters
• Input feedback during testing
Avoids Combinatorial Explosion via Simple Connectivity
[Figure: output nodes Y1–Y4 connected to input nodes I1–I4, which receive raw signals x1–x4; positive and negative connections are indicated.]
Forward Connections: Iterative
The forward connection weights are uniform: each output node takes the normalized sum of its inputs,
$$Y \;=\; \frac{1}{N}\sum_{x} w_{xy}\,x, \qquad w_{xy} = 1 .$$
[Figure: two-output example (Y1, Y2) over inputs I1, I2 with raw signals x1, x2; the slides step through alternating forward (input → output) and feedback (output → input) passes.]
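To make the forward/feedback iteration concrete, here is a minimal Python sketch of the two-node example. It is an illustration based on the combined equation from the final slide (uniform, unlearned connections), not the authors' code.

```python
# Minimal sketch of the iteration for the two-node example above:
# cell Y1 uses input x1 only; cell Y2 uses both x1 and x2.
x = {"x1": 1.0, "x2": 0.5}                    # raw input activities
Y = {"Y1": 0.2, "Y2": 0.8}                    # output activities, start > 0

forward = {"Y1": ["x1"], "Y2": ["x1", "x2"]}  # inputs used by each cell
feedback = {"x1": ["Y1", "Y2"], "x2": ["Y2"]} # cells feeding back on each input

for t in range(50):
    # Feedback step: shunt each input by the summed activity of the cells using it.
    I = {b: x[b] / sum(Y[j] for j in feedback[b]) for b in x}
    # Forward step: each cell averages its shunted inputs, gated by its own activity.
    Y = {a: Y[a] / len(forward[a]) * sum(I[i] for i in forward[a]) for a in Y}

# Settles near (Y1, Y2) = (0.5, 0.5), the "A = 1, B = ½" steady state reported
# on the pattern-interaction slide below.
print(Y)
```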
[Animation frames: nodes on the small example network switch between Active (1) and Inactive (0) as the forward/feedback iteration proceeds.]
Steady State
[Figure: graph of dynamics; activity (0 to 1) of Y1, Y2, I1, I2 plotted against simulation time T (0–5), settling to a steady state.]
Resolving Pattern Interactions

Network configuration: node 1 receives input A; node 2 receives inputs A and B.

Inputs active    Node → Value
A                1 → 1
A, B             2 → 1

Steady state for inputs (PA, PB):
• PA ≥ PB: (C1, C2) = (PA − PB, PB)
• PA ≤ PB: (C1, C2) = (0, (PA + PB)/2)

Examples:
• A = 1, B = ½ (half activation → half response): (C1, C2) = (½, ½)
• B = 1 alone (PA = 0): (C1, C2) = (0, ½)
• A = 0.12 alone: C1 = 0.12
• A = 2.5, B = 1: C1 = 1.5, C2 = 1
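As a worked check, the A = 2.5, B = 1 example is a fixed point of the combined update from the equations slide (node 1 receives A; node 2 receives A and B; C denotes the output activities written Y there):
$$C_1 \leftarrow C_1\cdot\frac{P_A}{C_1+C_2} = 1.5\cdot\frac{2.5}{1.5+1} = 1.5,\qquad
C_2 \leftarrow \frac{C_2}{2}\left(\frac{P_A}{C_1+C_2}+\frac{P_B}{C_2}\right) = \frac{1}{2}\left(\frac{2.5}{2.5}+\frac{1}{1}\right) = 1 .$$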
Resolving Pattern Interactions Based on Available Representations

Network configuration: cell 2 receives inputs A and B; cell 3 receives inputs B and C.

Inputs active    Node → Value
A                2 → ½
A, B             2 → 1
A, B, C          2, 3 → ¾
B, C             3 → 1
B                2, 3 → ¼
'Binding': Resolving Pattern Interactions

Network configuration: cell 1 receives input A; cell 2 receives A and B; cell 3 receives B and C.

Inputs active    Cell → Value
A                1 → 1
A, B             2 → 1
A, B, C          1, 3 → 1
B, C             3 → 1

Most efficient configuration; not possible with one-vs-all (OvA) or all-vs-all (AvA) classifiers.
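The binding rows can be reproduced with a short simulation, again a sketch assuming the update rule from the equations slide rather than the authors' code; with A, B and C all present, cells 1 and 3 settle near 1 while cell 2 decays toward 0.

```python
# 'Binding' configuration: cell 1 <- A;  cell 2 <- A, B;  cell 3 <- B, C.
forward = {1: ["A"], 2: ["A", "B"], 3: ["B", "C"]}
feedback = {"A": [1, 2], "B": [2, 3], "C": [3]}

def steady_state(x, steps=2000):
    """Iterate the feedback/forward update and return the output activities."""
    Y = {a: 0.5 for a in forward}                  # start every cell mildly active
    for _ in range(steps):
        I = {b: x[b] / sum(Y[j] for j in feedback[b]) for b in x}
        Y = {a: max(Y[a] / len(forward[a]) * sum(I[i] for i in forward[a]), 1e-12)
             for a in Y}                           # tiny floor avoids divide-by-zero
    return Y

# Inputs A, B, C together: cells 1 and 3 approach 1, cell 2 decays (slowly) toward 0,
# matching the 'A, B, C -> 1, 3 = 1' row of the table above.
print(steady_state({"A": 1.0, "B": 1.0, "C": 1.0}))
# Input A alone: only cell 1 responds, matching the 'A -> 1 = 1' row.
print(steady_state({"A": 1.0, "B": 0.0, "C": 0.0}))
```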
Can be Chained Ad Infinitum
[Figure: the same overlapping configuration extended to nodes 1, 2, 3, …, N over inputs A, B, C, …, N, O, each node sharing an input with its neighbor.]
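As a small illustration (an assumed generalization of the 1 ← A, 2 ← A+B, 3 ← B+C layout above, not code from the authors), the chained configuration for any length can be generated programmatically and passed to a simulation such as the steady_state sketch above:

```python
# Assumed generator for the chained configuration: node 1 uses input 0,
# and every later node k overlaps with its predecessor by using inputs k-2 and k-1.
def chain_network(n_nodes):
    forward = {1: [0]}
    forward.update({k: [k - 2, k - 1] for k in range(2, n_nodes + 1)})
    feedback = {}
    for node, inputs in forward.items():
        for i in inputs:
            feedback.setdefault(i, []).append(node)
    return forward, feedback

forward, feedback = chain_network(5)
print(forward)   # {1: [0], 2: [0, 1], 3: [1, 2], 4: [2, 3], 5: [3, 4]}
print(feedback)  # {0: [1, 2], 1: [2, 3], 2: [3, 4], 3: [4, 5], 4: [5]}
```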
New Data: Recognize a Scene When Trained on Individuals
• Teach single letters
• Test with multiple simultaneous letters
• A scene is beyond the training distribution
Feature Extraction
• Bag-of-features, 512 features
• Features of this kind are found in visual cortex
• Pixels are separated into features; the features (x1 … xn) are the inputs (I1 … In) to the model
[Fig 4: Simple feature extractor presenting non-spatial information from the visual field: collective pixel patterns presented to the network.]
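The 512-feature extractor itself is not specified here, so the following Python sketch is only one plausible reading (an assumption): counting the 2^9 = 512 possible 3x3 binary pixel patterns in an image gives a non-spatial "bag of pixel patterns", and a scene containing several letters simply accumulates the features of all of them.

```python
import numpy as np

def bag_of_patches(img):
    """Count 3x3 binary pixel patterns (2**9 = 512 possible patterns) in a binary
    image, discarding their positions (a non-spatial 'bag of features').
    Illustrative stand-in for the slide's 512-feature extractor, not the original."""
    img = (np.asarray(img) > 0).astype(int)
    counts = np.zeros(512)
    h, w = img.shape
    for r in range(h - 2):
        for c in range(w - 2):
            patch = img[r:r + 3, c:c + 3].ravel()
            index = int(patch @ (2 ** np.arange(9)))   # encode patch as a 9-bit code
            counts[index] += 1
    return counts

# A scene with several letters contributes the features of all of them, so a
# network trained on single letters can be tested on multi-letter scenes.
letter_a = np.random.rand(16, 16) > 0.5   # placeholder images; real letters
letter_b = np.random.rand(16, 16) > 0.5   # would come from a font or dataset
scene = np.concatenate([letter_a, letter_b], axis=1)  # letters side by side
x = bag_of_patches(scene)                 # input vector presented to the model
print(x.shape, x.sum())
```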
Two Stimuli Simultaneously (A B)
[Figure 5: Two-letter classification (NN with two-letter retraining, SVM, KNN, IFN); % of combinations vs. letters correctly classified, 0/2, 1/2, 2/2.]
[Figure 7: Five-letter classification (NN, RFNN); % of combinations vs. letters correctly classified, 0/5 … 5/5.]
Four Stimuli Simultaneously (i.e., A B C D)
[Figure 6: Four-letter classification (NN, SVM, KNN, IFN); % of combinations vs. letters correctly classified, 0/4 … 4/4.]
Difficulty
• Nonlinear equations
• Cannot mathematically prove general properties
Steps Towards AGI
• Generalize outside the training distribution
• Structure avoids combinatorial explosion
Acknowledgements
Cyrus Omar
National Geospatial-Intelligence Agency HM1582-06--BAA-0001
Equations

Activation:
$$Y_a(t+\Delta t) \;=\; \frac{Y_a(t)}{n_a}\sum_{i \in N_a} I_i$$

Feedback inhibition:
$$Q_b \;=\; \sum_{j \in M_b} Y_j(t), \qquad I_b \;=\; \frac{X_b}{Q_b}$$

Combined:
$$Y_a(t+\Delta t) \;=\; \frac{Y_a(t)}{n_a}\sum_{i \in N_a} \frac{X_i}{\sum_{j \in M_i} Y_j(t)}$$

Notation:
• C: collection of all output cells; C_a: cell "a" (activity written Y_a)
• N_a: the set of input connections to cell C_a; n_a: the number of processes in set N_a of cell C_a
• P: primary inputs (not affected by shunting inhibition)
• I: collection of all inputs; I_b: input cell "b"
• M_b: the set of recurrent feedback connections to input I_b; m_b: the number of connections in set M_b
• Q: shunting inhibition; Q_b: shunting inhibition at input b
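For reference, a compact matrix form of the combined update, sketched under my reading of these equations (not the authors' released code); the only structural "parameter" is the binary connection matrix, in line with the "no emphasis on weight parameters" slide.

```python
import numpy as np

def input_feedback_update(Y, X, C, steps=1):
    """Apply the combined update
        Y_a <- (Y_a / n_a) * sum_{i in N_a} X_i / (sum_{j in M_i} Y_j)
    where C is a binary connection matrix with C[a, i] = 1 iff output a uses
    input i; the transposed matrix defines the feedback sets M_i."""
    C = np.asarray(C, dtype=float)
    n = C.sum(axis=1)                      # n_a: number of inputs to each cell
    for _ in range(steps):
        Q = C.T @ Y                        # Q_b: summed activity of cells using input b
        I = X / np.maximum(Q, 1e-12)       # shunted inputs I_b = X_b / Q_b
        Y = (Y / n) * (C @ I)              # activation update
    return Y

# Binding example: cell 1 <- A; cell 2 <- A, B; cell 3 <- B, C.
C = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1]])
Y = np.full(3, 0.5)
X = np.array([1.0, 1.0, 1.0])              # inputs A, B, C all present
print(input_feedback_update(Y, X, C, steps=2000))   # approaches [1, 0, 1]
```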