A Technique for Advanced Dynamic Integration of Multiple Classifiers
Alexey Tsymbal*, Seppo Puuronen**, Vagan Terziyan*
*Department of Artificial Intelligence and Information Systems, Kharkov State Technical University of Radioelectronics, UKRAINE, e-mail: vagan@kture.cit-ua.net, vagan@jytko.jyu.fi
**Department of Computer Science and Information Systems, University of Jyvaskyla, FINLAND, e-mail: sepi@jytko.jyu.fi
STeP’98, Finnish AI Conference, 7-9 September 1998
Metaintelligence Laboratory: Research Topics • Knowledge and metaknowledge engineering; • Multiple experts; • Context in Artificial Intelligence; • Data Mining and Knowledge Discovery; • Temporal Reasoning; • Metamathematics; • Semantic Balance and Medical Applications; • Distance Education and Virtual Universities.
Contents • What is Knowledge Discovery? • The Multiple Classifiers Problem • A Sample (Training) Set • A Sliding Exam of Classifiers as a Learning Technique • A Locality Principle • Nearest Neighbours and Distance Measure • Weighting Neighbours, Predicting Errors and Selecting Classifiers • Data Preprocessing • Some Examples
What is Knowledge Discovery? • Knowledge discovery in databases (KDD) is a combination of data warehousing, decision support, and data mining, and an innovative approach to information management. • KDD is an emerging area that considers the process of finding previously unknown and potentially interesting patterns and relations in large databases*. • * Fayyad, U., Piatetsky-Shapiro, G., Smyth, P., Uthurusamy, R., Advances in Knowledge Discovery and Data Mining, AAAI/MIT Press, 1996.
The Research Problem During the past several years, in a variety of application domains, researchers in machine learning, computational learning theory, pattern recognition and statistics have joined efforts to learn how to create and combine an ensemble of classifiers. The primary goal of combining several classifiers is to obtain a more accurate prediction than any single classifier alone can provide.
Approaches to Integrate Multiple Classifiers • Combination: global (voting-type) or local decontextualization (a “virtual” classifier) • Selection: global (static) or local (dynamic)
Classification Problem • J classes, n training observations, p object features • Given: n training pairs (x_i, y_i), with x_i ∈ R^p and y_i ∈ {1, …, J} denoting class membership • Goal: given a new x_0, select a classifier for x_0 and predict its class y_0
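To make the setting concrete, here is a minimal sketch of the data shapes involved. The names (X, y, x0) and the random data are illustrative, not from the slides; the dimensions echo the Heart Disease experiment later in the talk (270 cases, 13 attributes), and J = 2 is an assumption.

```python
# Problem setting: n training pairs (x_i, y_i), x_i in R^p, y_i in {1, ..., J}.
# Random placeholder data; only the shapes matter here.
import numpy as np

rng = np.random.default_rng(0)
n, p, J = 270, 13, 2
X = rng.normal(size=(n, p))      # feature vectors x_i in R^p
y = rng.integers(1, J + 1, n)    # class labels y_i in {1, ..., J}
x0 = rng.normal(size=p)          # a new case x_0 to classify
```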
Classifiers Used in Example • Classifier 1: LDA - Linear Discriminant Analysis; • Classifier 2: k-NN - Nearest Neighbour Classification; • Classifier 3: DANN - Discriminant Adaptive Nearest Neighbour Classification
A Sliding Exam of Classifiers (Jackknife Method): we apply all the classifiers to the training set points and check the correctness of the classification (see the sketch below).
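A minimal sketch of this sliding exam, assuming scikit-learn-style base estimators. The talk's base classifiers are LDA, k-NN and DANN; DANN has no stock scikit-learn implementation, so any estimators with fit/predict stand in here.

```python
# Sliding exam (jackknife): classify each training point with each base
# classifier trained on the remaining n-1 points, and record 0/1 correctness.
import numpy as np
from sklearn.base import clone
from sklearn.model_selection import LeaveOneOut

def sliding_exam(classifiers, X, y):
    """Return an (n, m) matrix over n cases and m classifiers:
    1.0 = the classifier got the held-out case right, 0.0 = it erred."""
    correct = np.zeros((len(y), len(classifiers)))
    for train_idx, test_idx in LeaveOneOut().split(X):
        for j, clf in enumerate(classifiers):
            model = clone(clf).fit(X[train_idx], y[train_idx])
            pred = model.predict(X[test_idx])
            correct[test_idx, j] = float(pred[0] == y[test_idx][0])
    return correct
```

The resulting correctness matrix is the local performance record that the dynamic selection step consults later.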
Selecting the Number of Nearest Neighbours • A suitable number l of nearest neighbours should be selected for each training set point; these neighbours will be used to classify the case related to that point. • We have used l = max(3, n div 50) for all training set points in the example, where n is the number of cases in the training set (a sketch follows). • Open question: should we select an appropriate l value locally?
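The slide's heuristic, directly in code (div is integer division):

```python
def neighbourhood_size(n: int) -> int:
    # l = max(3, n div 50), where n is the number of cases in the training set.
    return max(3, n // 50)

# For the experiments later in the talk:
# neighbourhood_size(270) == 5 (Heart Disease)
# neighbourhood_size(345) == 6 (Liver Disorders)
```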
Brief Review of Distance Functions, According to D. Wilson and T. Martinez (1997)
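As one representative of the functions reviewed by Wilson and Martinez (1997), here is a sketch of the Heterogeneous Euclidean-Overlap Metric (HEOM). Whether the talk used HEOM specifically is not stated, so treat it as an example:

```python
import numpy as np

def heom(x, z, ranges, categorical):
    """Heterogeneous Euclidean-Overlap Metric: overlap (0/1) distance on
    categorical attributes, range-normalised absolute difference on numeric
    ones, combined Euclidean-style. `ranges[a]` is max - min of attribute a
    over the training set; `categorical[a]` flags nominal attributes."""
    d = np.empty(len(x))
    for a in range(len(x)):
        if categorical[a]:
            d[a] = 0.0 if x[a] == z[a] else 1.0
        else:
            d[a] = abs(x[a] - z[a]) / ranges[a]
    return float(np.sqrt(np.sum(d ** 2)))
```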
Selection of a Classifier: in this example, DANN should be selected.
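A sketch of the selection step under one plausible weighting scheme. The inverse-distance weights and the error-averaging formula below are assumptions; the slides name the steps (weighting neighbours, predicting errors, selecting a classifier) without giving the formulas here.

```python
import numpy as np

def select_classifier(x0, X, correct, l, dist):
    """Pick the base classifier with the lowest predicted local error at x0.
    `correct` is the (n, m) sliding-exam matrix, `dist` is a two-argument
    distance function (e.g. heom with ranges/categorical bound), and `l`
    comes from neighbourhood_size."""
    dists = np.array([dist(x0, xi) for xi in X])
    nn = np.argsort(dists)[:l]                    # l nearest neighbours of x0
    w = 1.0 / (dists[nn] + 1e-12)                 # inverse-distance weights
    # Predicted error of each classifier: weighted share of its jackknife
    # mistakes among the neighbours.
    pred_err = ((1.0 - correct[nn]) * w[:, None]).sum(axis=0) / w.sum()
    return int(np.argmin(pred_err))
```

With this in place, the selected classifier (the one with the smallest predicted local error, e.g. DANN in the slide's example) is trained on the full training set and used to predict y_0.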
Features Used in Dystonia Diagnostics • AF (x1) - attack frequency; • AM0 (x2) - the mode, the index of sympathetic tone; • dX (x3) - the index of parasympathetic tone; • IVR (x4) - the index of autonomous reactance; • V (x5) - the velocity of brain blood circulation; • GPVR (x6) - the general peripheral blood-vessels’ resistance; • RP (x7) - the index of brain vessels’ resistance.
Experiments with Heart Disease Database • The database contains 270 instances. Each instance has 13 attributes, extracted from a larger set of 75 attributes. The average cross-validation errors for the three classification methods were: DANN 0.196, k-NN 0.352, LDA 0.156; Dynamic Classifier Selection 0.08.
Experiments with Liver Disorders Database • The database contains 345 instances. Each instance has 7 numerical attributes. The average cross-validation errors for the three classification methods were: DANN 0.333, k-NN 0.365, LDA 0.351; Dynamic Classifier Selection 0.134.
Experimental Comparison of Three Integration Techniques Local (Dynamic) Classifier Selection (DCS) is compared with voting and the static Cross-Validation Majority (CVM); minimal sketches of the two baselines follow.
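For contrast with DCS, sketches of the two baselines, reusing the sliding-exam matrix and scikit-learn-style estimators from above. The slides do not specify the voting scheme, so simple unweighted majority voting is assumed here.

```python
import numpy as np

def voting_predict(classifiers, x0):
    # Voting: unweighted majority over the base classifiers' predictions.
    preds = [clf.predict(x0.reshape(1, -1))[0] for clf in classifiers]
    values, counts = np.unique(preds, return_counts=True)
    return values[np.argmax(counts)]

def cvm_select(correct):
    # Static cross-validation majority (CVM): choose the single classifier
    # with the best overall jackknife accuracy, then use it for every new case.
    return int(np.argmax(correct.mean(axis=0)))
```

Unlike DCS, both baselines ignore where x_0 lies in the feature space: voting pools all classifiers everywhere, and CVM commits to one classifier globally.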
Conclusion and Future Work • Classifiers can be effectively selected or integrated thanks to the locality principle • The same principle can be used when preprocessing data • The number of nearest neighbours and the choice of distance measure are reasonably decided separately in each case • The difference between classification results obtained in different contexts can be used to improve classification by exploiting possible trends