Artificial Speciation and Automatic Modularisation
Neural Network Ensemble • A group of neural networks is used to solve a problem, each perhaps concentrating on part of the dataset • The final output of the ensemble is determined by combining the outputs of the individual NNs using: • Majority voting • Simple averaging • Weighted averaging • Evolutionary NNE: the ensemble is either the whole population or a sub-population formed by clustering
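The three combination rules above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the three example networks and their outputs are made up.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class labels from several networks by majority voting."""
    return Counter(predictions).most_common(1)[0][0]

def simple_average(outputs):
    """Combine continuous outputs by an unweighted mean."""
    return sum(outputs) / len(outputs)

def weighted_average(outputs, weights):
    """Combine continuous outputs with per-network weights (summing to 1)."""
    return sum(w * o for w, o in zip(weights, outputs))

# Three hypothetical networks respond to the same input:
labels = ["cat", "dog", "cat"]
scores = [0.9, 0.2, 0.8]
print(majority_vote(labels))                        # → cat
print(simple_average(scores))
print(weighted_average(scores, [0.5, 0.25, 0.25]))
```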
Distance Measure • The distance between two NNs, D(p,q), is measured by the cross-entropy: • A low D(p,q) means p and q behave very similarly
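The exact cross-entropy expression is on the original slide; one plausible form, sketched here as an assumption, is a symmetrised cross-entropy between the two networks' outputs on the same inputs, so that near-identical output vectors give a small D(p,q).

```python
import math

def cross_entropy_distance(p_out, q_out, eps=1e-12):
    """Symmetrised cross-entropy between two networks' output vectors on
    the same inputs (a sketch; the paper's exact form may differ).
    Small values mean the networks behave alike."""
    d = 0.0
    for p, q in zip(p_out, q_out):
        p = min(max(p, eps), 1 - eps)   # clip to avoid log(0)
        q = min(max(q, eps), 1 - eps)
        d += -(q * math.log(p) + (1 - q) * math.log(1 - p))
        d += -(p * math.log(q) + (1 - p) * math.log(1 - q))
    return d / (2 * len(p_out))

identical = cross_entropy_distance([0.9, 0.1], [0.9, 0.1])
different = cross_entropy_distance([0.9, 0.1], [0.1, 0.9])
print(identical < different)  # → True: similar networks get a lower D
```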
Fitness Sharing • Speciation at the phenotypic (behavioural) level • Raw fitness: • Shared fitness: • The sharing function, S(D(p,q)), can be linear • A Gaussian sharing function in this case?
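The slide's equations are omitted, but standard (Goldberg-style) fitness sharing divides each individual's raw fitness by its niche count, the sum of sharing values against every population member. A sketch with the linear sharing function mentioned above; the distances and fitness values are made up:

```python
def sharing(d, sigma):
    """Linear sharing function: 1 at d = 0, falling to 0 at niche radius sigma."""
    return max(0.0, 1.0 - d / sigma)

def shared_fitness(i, raw, dist, sigma):
    """Standard fitness sharing: raw fitness divided by the niche count
    (summed sharing values of individual i against the whole population)."""
    niche_count = sum(sharing(dist[i][j], sigma) for j in range(len(raw)))
    return raw[i] / niche_count

# Two near-identical networks (distance 0.1) and one distinct network:
raw  = [1.0, 1.0, 1.0]
dist = [[0.0, 0.1, 2.0],
        [0.1, 0.0, 2.0],
        [2.0, 2.0, 0.0]]
print([round(shared_fitness(i, raw, dist, sigma=1.0), 3) for i in range(3)])
# → [0.526, 0.526, 1.0]: the crowded pair is penalised, the loner is not
```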
Negative Correlation Learning • The error of NN i on the nth training pattern: • The penalty term:
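The slide's formulas are omitted; in the negative correlation learning of Liu and Yao (see the references), network i's error on a pattern is its squared error plus λ times a penalty p_i that correlates its deviation from the ensemble mean with the other networks' deviations. A sketch of that error, with made-up outputs:

```python
def ncl_error(outputs, i, target, lam):
    """Negative-correlation-learning error of network i on one pattern:
    0.5*(F_i - d)^2 + lam * p_i, where
    p_i = (F_i - mean) * sum_{j != i} (F_j - mean)."""
    mean = sum(outputs) / len(outputs)
    penalty = (outputs[i] - mean) * sum(
        outputs[j] - mean for j in range(len(outputs)) if j != i)
    return 0.5 * (outputs[i] - target) ** 2 + lam * penalty

# Three networks, target 1.0; network 0 deviates opposite to the others,
# so its penalty is negative, rewarding this diversity:
outs = [0.8, 1.0, 1.2]
print(ncl_error(outs, 0, target=1.0, lam=0.5))  # ≈ 0: penalty offsets the error
```

Note that the deviations from the mean sum to zero, so the penalty for network i equals minus its own squared deviation; this is what pushes individual networks apart while the ensemble mean tracks the target.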
The Algorithm • Elitism: keep the top portion by raw fitness (f_raw) plus the individual with the best shared fitness (f_share) • Selection based on shared fitness • Crossover: exchange subnetworks with the same inputs and outputs • Mutation: add or delete a connection between two randomly chosen nodes
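The generational loop above can be sketched generically. The operators here are stand-ins (real-valued "genomes" instead of encoded networks, averaging instead of subnet exchange, Gaussian noise instead of connection add/delete), and the fitness functions in the toy run are hypothetical; only the loop structure mirrors the slide.

```python
import random

def evolve(pop, f_raw, f_share, crossover, mutate, elite_frac=0.1, gens=50):
    """Generational loop sketch: elitism on raw fitness plus the best-shared
    individual, parent selection weighted by shared fitness, then
    crossover and mutation to refill the population."""
    for _ in range(gens):
        n_elite = max(1, int(elite_frac * len(pop)))
        elite = sorted(pop, key=f_raw, reverse=True)[:n_elite]
        elite.append(max(pop, key=f_share))          # best-shared individual
        weights = [f_share(p) for p in pop]          # selection pressure
        children = []
        while len(children) < len(pop) - len(elite):
            a, b = random.choices(pop, weights=weights, k=2)
            children.append(mutate(crossover(a, b)))
        pop = elite + children
    return pop

# Toy run: maximise a peaked fitness around x = 0.5 (no real sharing here):
random.seed(0)
fit = lambda x: 1.0 / (1.0 + (x - 0.5) ** 2)
final = evolve(pop=[random.uniform(-1, 1) for _ in range(20)],
               f_raw=fit, f_share=fit,
               crossover=lambda a, b: (a + b) / 2,
               mutate=lambda x: x + random.gauss(0, 0.05))
print(max(final, key=fit))  # best individual, close to 0.5
```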
Some Thoughts • Very high computational cost of calculating the cross-entropy distances and the fitness sharing function! • EDO • Fitness sharing among agents with the same parent? • Speciation between agents belonging to different lineages? • ERA?
References • V. Khare and X. Yao, "Artificial Speciation and Automatic Modularisation", in Proc. of SEAL'02. • Y. Liu, X. Yao and T. Higuchi, "Evolutionary Ensembles with Negative Correlation Learning", IEEE Trans. Evolutionary Computation, 4(4), pp. 380-387, November 2000.