
TEAM HOMEWORK #9 Evolving an XOR Network


Presentation Transcript


  1. TEAM HOMEWORK #9: Evolving an XOR Network. Dr. Roger S. Gaborski

  2. Modified Teams
  • Team 1: Carpenter (Taylor) and Eyler
  • Team 2: Mesh and Sen
  • Team 3: Carpenter (Michael) and Cooper
  • Team 4: Dean and Koon
  • Team 5: Bravo and Patel
  • Team 6: Hu and Louzolo-Kimbembe
  • Team 7: Goetz and Schulze
  • Team 8: Kamat and Rajkumar
  • Team 9: Jain and Paul
  • Team 10: Sivov and Smith
  • Team 11: Stokes and Miller
  • Team 12: Powar and Murphy
  • Team 13: Sasarak and Sun

  3. 2-Neuron XOR Network • Two inputs applied to both neurons • Both neurons have bias terms

  4. 2-Neuron XOR Network with Bias
  [Network diagram: inputs Iin1 and Iin2 feed both neurons N1 and N2 through input weights w1in1, w1in2, w2in1, w2in2; bias weights w10, w20; connection weights w11, w12, w21, w22; neuron outputs x1, x2.]
  Notation: w(to, from), e.g. w12 is the weight to neuron 1 from neuron 2.

  5. Weight Matrix
  [ w1in1  w1in2  w10  w11  w12 ]       [ Iin1 ]  input 1
  [ w2in1  w2in2  w20  w21  w22 ]   *   [ Iin2 ]  input 2
                                        [  1   ]  bias
                                        [  x1  ]  output N1
                                        [  x2  ]  output N2

  6. Weight Matrix: no self connections (w11, w22), no recurrent feedback connection (w12)
  [ w1in1  w1in2  w10   0    0 ]       [ Iin1 ]  input 1
  [ w2in1  w2in2  w20  w21   0 ]   *   [ Iin2 ]  input 2
                                       [  1   ]  bias
                                       [  x1  ]  output N1
                                       [  x2  ]  output N2
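  For reference, a minimal MATLAB sketch of how these zeroed entries can be enforced with a connectivity mask; the column order follows the slide above, and the variable names are illustrative rather than part of the assignment.

```matlab
% Connectivity mask for the 2x5 weight matrix (rows = neurons N1, N2).
% Column order: [Iin1, Iin2, bias, x1, x2].
% Self connections w11, w22 and the feedback connection w12 are held at zero.
mask = [1 1 1 0 0;       % N1: w1in1, w1in2, w10 free; w11 = w12 = 0
        1 1 1 1 0];      % N2: w2in1, w2in2, w20, w21 free; w22 = 0
W = rand(2,5) .* mask;   % any candidate W is masked before evaluation
```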

  7. Weight Matrix Values: no self connections (w11, w22), no recurrent feedback connection (w12)
  [ -2.19  -2.20  0.139    0     0 ]       [ Iin1 ]
  [ -2.81  -2.70   3.90  -31.8   0 ]   *   [ Iin2 ]
                                           [  1   ]
                                           [  x1  ]
                                           [  x2  ]

  8. Non-linear Function, State Update
  [ x1 ]          [ -2.19  -2.20  0.139    0     0 ]   [ Iin1 ]
  [ x2 ]  = tanh( [ -2.81  -2.70   3.90  -31.8   0 ] * [ Iin2 ] )
                                                       [  1   ]
                                                       [  x1  ]
                                                       [  x2  ]
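  A short MATLAB sketch of this state update using the weight values from slide 7; the input pair and starting state are illustrative.

```matlab
% One state update of the 2-neuron network: x = tanh(W * [Iin1; Iin2; 1; x1; x2])
W = [-2.19  -2.20   0.139    0     0;    % row 1: neuron N1
     -2.81  -2.70   3.90   -31.8   0];   % row 2: neuron N2
Iin1 = 1;  Iin2 = 0;                     % one of the four XOR input pairs
x = [0; 0];                              % state starts at zero
x = tanh(W * [Iin1; Iin2; 1; x(1); x(2)])   % update both neurons at once
```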

  9. Two Time Cycles Required
  • 1. Apply the inputs and calculate the updated state vector x. At this point x1 is correct, but x2 was calculated with x1 = 0.
  • 2. Update x1 and x2, then apply the same inputs a second time; this time x2 is correct.
  • Output x2.
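  A sketch of this two-cycle evaluation for a single input pair, written with an anonymous update function; the names `step` and `x2out` are illustrative.

```matlab
% Two-cycle evaluation of one input pair: after the first update x1 is
% correct but x2 was computed with x1 = 0, so the same inputs are applied
% a second time and x2 is then read out as the network output.
step = @(W, in, x) tanh(W * [in(1); in(2); 1; x(1); x(2)]);
W  = [-2.19 -2.20 0.139 0 0; -2.81 -2.70 3.90 -31.8 0];
in = [1 0];                 % XOR input pair
x  = [0; 0];                % initial state
x  = step(W, in, x);        % cycle 1: x1 now correct
x  = step(W, in, x);        % cycle 2: x2 now correct
x2out = x(2)                % network output for this input pair
```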

  10. Initial Weight Matrix with random numbers in [-β, +β]
  [ x1 ]             [ r1  r2  r3   0   0 ]   [ Iin1 ]
  [ x2 ]  = sigmoid( [ r4  r5  r6  r7   0 ] * [ Iin2 ] )
                                              [  1   ]
                                              [  x1  ]
                                              [  x2  ]
  • After two passes of the input data, calculate the error between the updated state vector and the correct values. Calculate the total error based on all four input pairs.
  • Use one of the evolutionary methods discussed in class to find the correct weights.
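  A minimal sketch of the random initialization in [-β, +β] and the total error over all four input pairs, assuming the masked weight layout from slide 6 and the tanh update from slide 8 (this slide shows a sigmoid; either nonlinearity can be substituted); `beta` and the target values are illustrative.

```matlab
% Random initialization in [-beta, +beta] and total error over the four
% XOR input pairs, each evaluated with two update cycles.
beta    = 5;                                    % illustrative weight range
mask    = [1 1 1 0 0; 1 1 1 1 0];               % no self / feedback weights
W       = (2*rand(2,5) - 1) * beta .* mask;     % one candidate weight matrix
inputs  = [1 1; 1 0; 0 1; 0 0];                 % the four XOR input pairs
targets = [0; 1; 1; 0];                         % desired x2 outputs
step = @(W, in, x) tanh(W * [in(1); in(2); 1; x(1); x(2)]);
err = 0;
for k = 1:4
    x = [0; 0];
    x = step(W, inputs(k,:), x);                % pass 1
    x = step(W, inputs(k,:), x);                % pass 2
    err = err + (x(2) - targets(k))^2;          % squared error on x2
end
err                                             % total error for this W
```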

  11. Results for XOR
  err1 = 1.7605e-011   Iin1 = 1   Iin2 = 1   x2out = 1.92954e-021
  err1 = 1.7605e-011   Iin1 = 1   Iin2 = 0   x2out = 1
  err1 = 1.7605e-011   Iin1 = 0   Iin2 = 1   x2out = 1
  err1 = 1.7605e-011   Iin1 = 0   Iin2 = 0   x2out = 3.92286e-020

  12. [Plot: error versus generation] bestError = 2.5679e-015

  13. BIIS Assignment #9 TEAM ASSIGNMENT
  • PART ONE: HOMEWORK REVISION (re-write the algorithm with your new team members)
  • Correct/update GA solutions
  • Minimum performance results:
  • 30-long vector: score 30
  • 1000-long vector: score 1000
  • 10000-long vector: minimum score 8500
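  For Part One, a rough sketch of the kind of bit-string GA these score targets suggest; it assumes the fitness is simply the number of 1s in the vector (so a 30-long vector scores at most 30), and the population size, mutation rate, and generation count are illustrative, not prescribed.

```matlab
% Illustrative GA for an N-long binary vector; fitness is assumed to be
% the number of 1s (max score = N, matching "30-long vector: score 30").
N = 30;  popSize = 40;  pMut = 1/N;  nGen = 200;
pop = randi([0 1], popSize, N);                 % random initial population
for g = 1:nGen
    fit = sum(pop, 2);                          % fitness of each individual
    [~, order] = sort(fit, 'descend');
    parents = pop(order(1:popSize/2), :);       % truncation selection
    kids = parents;
    for i = 1:2:size(parents,1)-1               % one-point crossover
        c = randi(N-1);
        kids(i,  :) = [parents(i,  1:c) parents(i+1, c+1:end)];
        kids(i+1,:) = [parents(i+1,1:c) parents(i,   c+1:end)];
    end
    kids = double(xor(kids, rand(size(kids)) < pMut));  % bit-flip mutation
    pop  = [parents; kids];                     % next generation
end
bestScore = max(sum(pop, 2))                    % should approach N
```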

  14. BIIS Assignment #9 TEAM ASSIGNMENT
  • PART TWO
  • Implement an algorithm that will evolve the weight matrix W for the XOR problem.
  • Follow the guidelines given (see the previous lectures as well).
  • Use the Genetic Algorithm you wrote and the following evolutionary algorithms discussed in class to solve the problem:
  • Simulated Annealing (TEAMS 1, 2, 3 and 4)
  • Tabu Search (TEAMS 5, 6 and 7)
  • Evolution Strategy ES(µ,λ) (TEAMS 8, 9 and 10)
  • Evolution Strategy ES(µ+λ) (TEAMS 11, 12 and 13)
  • You must write your own programs.
  • Program names: EvolveXOR_GAyourNames.m, EvolveXOR_TABUyournames.m, etc.
  • Use the same naming convention for other functions/scripts you may need.
  • Compare the two methods in your write-up.
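  As one example of the methods listed above, a rough simulated-annealing sketch for evolving W, written as a MATLAB script with a local helper function; the cooling schedule, perturbation size, and error function are illustrative assumptions, and the evaluation follows the two-cycle scheme from slide 9.

```matlab
% Illustrative simulated annealing over the masked 2x5 weight matrix.
mask    = [1 1 1 0 0; 1 1 1 1 0];               % no self / feedback weights
inputs  = [1 1; 1 0; 0 1; 0 0];                 % four XOR input pairs
targets = [0; 1; 1; 0];                         % desired x2 outputs
W   = (2*rand(2,5) - 1) * 5 .* mask;            % random start in [-5, +5]
cur = xorError(W, inputs, targets);
T   = 1.0;                                      % initial temperature
for iter = 1:20000
    Wnew = W + 0.2*randn(2,5) .* mask;          % perturb allowed weights only
    eNew = xorError(Wnew, inputs, targets);
    if eNew < cur || rand < exp((cur - eNew)/T)
        W = Wnew;  cur = eNew;                  % accept the move
    end
    T = 0.999 * T;                              % geometric cooling
end
bestError = cur                                 % final total error

function e = xorError(W, inputs, targets)
% Total squared error on x2 over all four pairs, two update cycles each.
e = 0;
for k = 1:4
    x = [0; 0];
    for pass = 1:2
        x = tanh(W * [inputs(k,1); inputs(k,2); 1; x(1); x(2)]);
    end
    e = e + (x(2) - targets(k))^2;
end
end
```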

  15. Output of program
  • Print to screen:
  • Error results for the XOR inputs as shown on slide 10
  • A plot of the error versus generation (slide 11)
  • The final weight matrix
  • Submit your programs and a detailed analysis of your program and your results. Address questions such as: did it always converge to the correct answer, how many generations were needed, what population size was used, how were the matrices modified, etc.
  • Email to the course account.

  16. Additional Information: 4-Neuron Network, input to N1 and N2 only
  err1 = 1.71684e-008   Iin1 = 1   Iin2 = 1   x2out = 2.21501e-019
  err1 = 1.71684e-008   Iin1 = 1   Iin2 = 0   x2out = 1
  err1 = 1.71684e-008   Iin1 = 0   Iin2 = 1   x2out = 1
  err1 = 1.71684e-008   Iin1 = 0   Iin2 = 0   x2out = 1.03917e-035
  W =
    15.3607   13.5668   -3.2710    0         0         0         0
    -9.4263    8.1640   -6.8868   -0.1954    0         0         0
     0         0        13.5103    7.9710    1.6977    0         0
     0         0         0.7986    7.2134   16.4839  -16.9090    0
  bestError = 2.2150e-019

  17. Two Classes

  18. 4-Neuron Network – 2-Class Problem
  [Network diagram: inputs Iin1 and Iin2 feed N1 and N2 through w1in1, w1in2, w2in1, w2in2; bias weights w10, w20, w30, w40; connection weights w21, w31, w32, w41, w42, w43; neuron outputs x1, x2, x3, x4.]
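  A rough sketch of the corresponding 4-neuron update; the 4x7 mask is inferred from the zero pattern of the W matrices on the following slides, while the number of update cycles and the x4 readout threshold are assumptions.

```matlab
% Sketch of the 4-neuron update for the 2-class problem.  Assumed column
% order: [Iin1, Iin2, bias, x1, x2, x3, x4]; inputs reach only N1 and N2,
% and the class decision is read from x4.
mask = [1 1 1 0 0 0 0;      % N1: inputs + bias
        1 1 1 1 0 0 0;      % N2: inputs + bias + x1
        0 0 1 1 1 0 0;      % N3: bias + x1 + x2
        0 0 1 1 1 1 0];     % N4: bias + x1 + x2 + x3
W  = (2*rand(4,7) - 1) * 5 .* mask;   % illustrative candidate weights
in = [-24.0107, -62.5499];            % one sample from the result slides
x  = zeros(4,1);
for pass = 1:4                        % enough cycles for the signal to reach N4
    x = tanh(W * [in(1); in(2); 1; x]);
end
out = x(4) > 0.5                      % assumed class readout threshold
```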

  19. [Figure only; no recoverable text]

  20.
  err1 = 0   Iin1 = -24.0107   Iin2 = -62.5499   out = 0
  err1 = 0   Iin1 = -48.3297   Iin2 = -55.5969   out = 0
  err1 = 0   Iin1 = -70.2611   Iin2 = -38.9464   out = 0
  err1 = 0   Iin1 = -85.9289   Iin2 = -13.6098   out = 0
  err1 = 0   Iin1 = -91.9457   Iin2 = 17.8724    out = 0
  err1 = 0   Iin1 = -86.0025   Iin2 = 51.6755    out = 0
  err1 = 0   Iin1 = -67.3373   Iin2 = 83.1546    out = 0
  err1 = 0   Iin1 = -37.0062   Iin2 = 107.474    out = 0
  err1 = 0   Iin1 = 2.10011    Iin2 = 120.315    out = 0
  err1 = 0   Iin1 = 45.5127    Iin2 = 118.565    out = 0
  err1 = 0   Iin1 = 24.0107    Iin2 = 62.5499    out = 1
  err1 = 0   Iin1 = 48.3297    Iin2 = 55.5969    out = 1
  err1 = 0   Iin1 = 70.2611    Iin2 = 38.9464    out = 1
  err1 = 0   Iin1 = 85.9289    Iin2 = 13.6098    out = 1
  err1 = 0   Iin1 = 91.9457    Iin2 = -17.8724   out = 1
  err1 = 0   Iin1 = 86.0025    Iin2 = -51.6755   out = 1
  err1 = 0   Iin1 = 67.3373    Iin2 = -83.1546   out = 1
  err1 = 0   Iin1 = 37.0062    Iin2 = -107.474   out = 1
  err1 = 0   Iin1 = -2.10011   Iin2 = -120.315   out = 1
  err1 = 0   Iin1 = -45.5127   Iin2 = -118.565   out = 1

  21. W =
    26.4920    -2.1275     4.5634      0          0          0         0
    -0.2518     0.4017    18.9732   -38.0695      0          0         0
     0          0          9.8227    22.7548   -47.4532      0         0
     0          0        -49.8367   102.8554  -256.5946    79.9826     0
  bestError = 0

  22.
  err1 = 0   Iin1 = -24.0107   Iin2 = -62.5499   out = 0
  err1 = 0   Iin1 = -48.3297   Iin2 = -55.5969   out = 0
  err1 = 0   Iin1 = -70.2611   Iin2 = -38.9464   out = 0
  err1 = 0   Iin1 = -85.9289   Iin2 = -13.6098   out = 0
  err1 = 0   Iin1 = -91.9457   Iin2 = 17.8724    out = 0
  err1 = 0   Iin1 = -86.0025   Iin2 = 51.6755    out = 0
  err1 = 0   Iin1 = -67.3373   Iin2 = 83.1546    out = 0
  err1 = 0   Iin1 = -37.0062   Iin2 = 107.474    out = 0
  err1 = 0   Iin1 = 2.10011    Iin2 = 120.315    out = 0
  err1 = 0   Iin1 = 45.5127    Iin2 = 118.565    out = 0
  err1 = 0   Iin1 = 24.0107    Iin2 = 62.5499    out = 1
  err1 = 0   Iin1 = 48.3297    Iin2 = 55.5969    out = 1
  err1 = 0   Iin1 = 70.2611    Iin2 = 38.9464    out = 1
  err1 = 0   Iin1 = 85.9289    Iin2 = 13.6098    out = 1
  err1 = 0   Iin1 = 91.9457    Iin2 = -17.8724   out = 1
  err1 = 0   Iin1 = 86.0025    Iin2 = -51.6755   out = 1
  err1 = 0   Iin1 = 67.3373    Iin2 = -83.1546   out = 1
  err1 = 0   Iin1 = 37.0062    Iin2 = -107.474   out = 1
  err1 = 0   Iin1 = -2.10011   Iin2 = -120.315   out = 1
  err1 = 0   Iin1 = -45.5127   Iin2 = -118.565   out = 1

  23. W =
    32.4673    -3.1186   -28.6143      0          0          0         0
     0.1022     0.7105    54.0334  -105.7453      0          0         0
     0          0         24.4383   -25.4437      7.7179     0         0
     0          0         15.4590    -1.5397   -150.7273    -7.5315    0
  bestError = 0
