Hybrid Systems: Two Examples of the Combination of Rule-Based Systems and Neural Nets By Pieter Buzing
Plan • Introduction: • Knowledge Based System vs Neural Net • Basic hybrid technique • Fu’s system • KBANN system • Comparison (rule improvement, semantics) • Conclusions
Introduction • Knowledge Based System • Neural Network • Characteristics of KBS & NN • Basic hybrid technique
Knowledge Based System • Rule base and fact base • Inference: from facts to conclusions • Certainty Factors (CFs) in [-1, 1] • Example rule: IF smart AND ambitious THEN rich (CF = 0.7) • Given: CF(smart) = 0.8, CF(ambitious) = 0.5 • Conclude: CF(rich) = 0.7 * min(0.8, 0.5) = 0.35
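A minimal Python sketch of this MYCIN-style certainty-factor propagation for the slide's example rule; the function and variable names are illustrative, not from the slides.

```python
# IF smart AND ambitious THEN rich (CF = 0.7)

def fire_rule(rule_cf, *premise_cfs):
    """The conclusion's CF is the rule's CF scaled by the weakest
    (minimum) premise CF."""
    return rule_cf * min(premise_cfs)

cf_rich = fire_rule(0.7, 0.8, 0.5)   # CF(smart)=0.8, CF(ambitious)=0.5
print(cf_rich)                       # 0.35, as on the slide
```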
Neural Network • Nodes and connections • Layers of input, hidden and output nodes • Aim: find the right weight for each connection • Trained with examples to minimize the error
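A minimal sketch of such a feed-forward pass through input, hidden and output layers; the layer sizes, names and random initialization are illustrative, and training would adjust the two weight matrices to reduce the output error.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 2))    # 2 input units -> 3 hidden units
W_output = rng.normal(size=(1, 3))    # 3 hidden units -> 1 output unit

x = np.array([0.8, 0.5])              # example input activations
hidden = sigmoid(W_hidden @ x)
output = sigmoid(W_output @ hidden)

# Training on examples means adjusting W_hidden and W_output
# (e.g. by backpropagation) to minimize the error on the outputs.
```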
Basic hybrid technique Initialize the neural network with domain knowledge, so that both the architecture and the initial weights are grounded in the knowledge base. Use a mapping from knowledge-base elements to network elements (sketched below):
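The slide's mapping table is not reproduced in the transcript; the sketch below follows the standard correspondence used by such hybrids (facts become input units, intermediate conclusions hidden units, final conclusions output units, and rules weighted connections seeded with the rules' CFs). The example rules are illustrative, not from the slides.

```python
rules = [
    # (premises, conclusion, certainty factor)
    (["smart", "ambitious"], "successful", 0.7),
    (["successful"], "rich", 0.9),
]

premises_all = {p for prems, _, _ in rules for p in prems}
conclusions = {c for _, c, _ in rules}

input_units = premises_all - conclusions    # plain facts
output_units = conclusions - premises_all   # final conclusions
hidden_units = conclusions & premises_all   # intermediate conclusions

# Each rule seeds one connection per premise, weighted by its CF.
initial_weights = {(p, c): cf for prems, c, cf in rules for p in prems}

print(input_units, hidden_units, output_units)
print(initial_weights)
```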
Fu’s system (1989) • Proposed by: Li-Min Fu, Wisconsin • Objective: let the NN deal with an incorrect KB • Construction: a conceptual network with CFs; AND-nodes maintain the meaning of conjunctive rules • Training: backpropagation plus hill-climbing, because the AND-function is not differentiable • Error handling: identifies wrong rules • Semantics: rules remain ‘visible’ in the network
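An illustrative sketch (not Fu's actual code) of why hill-climbing is needed: the conjunction unit's min(...) activation gives no useful gradient with respect to the non-minimal premises, so backpropagation alone cannot adjust all of its incoming weights. The loss and step size below are assumptions for the example.

```python
def and_node(rule_cf, premise_cfs):
    """Conjunction unit: rule CF scaled by the weakest premise CF."""
    return rule_cf * min(premise_cfs)

def hill_climb(loss, w, step=0.01):
    """One crude hill-climbing move for a single weight: keep whichever
    small perturbation of w gives the lowest loss."""
    return min((w - step, w, w + step), key=loss)

# Example: nudge a rule CF toward the value that minimizes a toy loss.
target = 0.35
loss = lambda cf: (and_node(cf, [0.8, 0.5]) - target) ** 2
print(hill_climb(loss, 0.6))   # moves from 0.6 toward 0.7
```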
KBANN system (1994) • Proposed by: Towell & Shavlik, Wisconsin • Objective: use the KB to initialize a NN • Construction: each concept becomes a node; extra nodes and connections are added • Training: backpropagation • Error handling: weight adjustment • Semantics: too many connections to make sense of the result
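A hedged sketch of a KBANN-style translation of one conjunctive rule into a sigmoid unit: each positive antecedent gets a large positive weight, each negated one a large negative weight, and the bias is set so the unit fires only when the whole conjunction holds. The weight constant is illustrative.

```python
OMEGA = 4.0  # assumed fixed "large" weight

def conjunctive_unit(pos_antecedents, neg_antecedents=()):
    """Return (weights, bias) so the unit is active only when every
    positive antecedent is true and every negated one is false."""
    weights = {a: OMEGA for a in pos_antecedents}
    weights.update({a: -OMEGA for a in neg_antecedents})
    bias = -(len(pos_antecedents) - 0.5) * OMEGA
    return weights, bias

print(conjunctive_unit(["smart", "ambitious"]))
# After translating the rules, KBANN adds extra units and low-weight
# links so training can learn knowledge that is missing from the KB.
```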
Comparison (1): Coping with erroneous rules • Fu considers a rule incorrect when its weight change exceeds a threshold (see the sketch below) • KBANN deals with it implicitly: it alters the weight of an inconsistent rule • Fu can still identify faulty rules when 12% of the rules are corrupted • KBANN outperforms a standard NN with 10% big or 30% small changes
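Illustrative only: a Fu-style verification step that flags a rule as suspect when its trained weight has drifted from its initial CF by more than a threshold. The threshold value and rule names are assumptions.

```python
def suspect_rules(initial_cfs, trained_weights, threshold=0.3):
    """Return rules whose trained weight drifted too far from the CF."""
    return [rule for rule, cf in initial_cfs.items()
            if abs(trained_weights[rule] - cf) > threshold]

print(suspect_rules({"r1": 0.7, "r2": 0.4},
                    {"r1": 0.68, "r2": -0.1}))   # -> ['r2']
```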
Comparison (2): Maintainability of semantics • Fu: every unit keeps its meaning • KBANN: (random) extra units are added • Fu: conjunction units hold their original semantic basis • KBANN: all nodes are connected, so every node becomes one big ‘conjunction’ • Fu’s weights are CFs; what do KBANN’s weights mean?
Conclusions • Coping with erroneous rules • Fu can be used to verify rules: it identifies inconsistent ones • KBANN handles them convincingly • Maintainability of semantics • Fu succeeds in its comprehensibility goal • KBANN loses its semantics: the KB is merely a starting point • Mind you: different goals, periods and domains