GEOMETRY IN PERCEPTRON LEARNING
Reference: GEOMETRY IN LEARNING, tech report by KRISTIN P. BENNETT and ERIN J. BREDENSTEINER, RENSSELAER POLYTECHNIC INSTITUTE
IE 5970, SPRING 2000
PRESENTATION OUTLINE
1. INTRODUCTION
2. A SIMPLE LEARNING MODEL: CONCEPT OF A PERCEPTRON
3. GEOMETRY OF A PERCEPTRON
4. TRAINING: LINEARLY SEPARABLE CASE
  4.1. BASIC PROBLEM (PRIMAL-DUAL)
  4.2. FIRST SIMPLIFICATION: THE MULTISURFACE METHOD (MSM)
  4.3. SECOND SIMPLIFICATION: THE OPTIMAL PLANE
5. TRAINING: LINEARLY INSEPARABLE CASE
  5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP)
  5.2. COMBINATIONS OF MSM AND RLP: GENERALIZED OPTIMAL PLANE (GOP)
  5.3. COMBINATIONS OF MSM AND RLP: PERTURBED ROBUST LINEAR PROGRAMMING (RLP-P)
6. APPLICATIONS
7. COMPUTATIONAL RESULTS
8. CONCLUSION
9. REMARKS
1. INTRODUCTION
• CLASSIFICATION PROBLEM (TWO CLASSES A, B)
• IDENTIFY ELEMENTS OF EACH CLASS (EXAMPLE: CANCER DIAGNOSIS, IS A TUMOR BENIGN OR MALIGNANT?)
• EACH ELEMENT OF CLASS A OR B IS DESCRIBED BY A VECTOR OF N ATTRIBUTES (EXAMPLES: PATIENT'S AGE, BLOOD PRESSURE, SMOKING HABITS)
• TRAINING PHASE: THE CLASS OF EACH POINT IS KNOWN; A FUNCTION F(X) IS CONSTRUCTED
• TESTING PHASE: THE CLASS IS NOT KNOWN; F(X) CLASSIFIES FUTURE POINTS
2. A SIMPLE LEARNING MODEL: CONCEPT OF A PERCEPTRON
• A PERCEPTRON IS A TYPE OF CLASSIFICATION FUNCTION (BIOLOGICALLY MOTIVATED)
• A PERCEPTRON IS STIMULATED BY n-DIMENSIONAL INPUT VECTORS (THE n ATTRIBUTES CAN BE SEEN AS COORDINATES)
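The slide's formulas are not preserved in this transcript. As a minimal sketch, a perceptron with weight vector w and threshold γ outputs the class of a point x from the sign of w·x − γ. The iterative Rosenblatt training rule below is an illustrative assumption for this slide only; the report itself trains the perceptron with the linear programs discussed in later sections.

```python
# Minimal perceptron sketch: decision function sign(w.x - gamma) plus the
# classic Rosenblatt update rule. Illustrative only; the report trains the
# perceptron with linear programs rather than this iterative rule.

def classify(w, gamma, x):
    """Return +1 (class A) if w.x > gamma, else -1 (class B)."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > gamma else -1

def train(points, labels, epochs=100, eta=0.1):
    """Rosenblatt rule: nudge (w, gamma) toward each misclassified point."""
    n = len(points[0])
    w, gamma = [0.0] * n, 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(points, labels):
            if classify(w, gamma, x) != y:
                for i in range(n):
                    w[i] += eta * y * x[i]
                gamma -= eta * y   # threshold moves opposite to the weights
                errors += 1
        if errors == 0:            # stops once the data are fully separated
            break
    return w, gamma

# Tiny linearly separable example (hypothetical data)
pts = [(2.0, 2.0), (3.0, 1.5), (-1.0, -1.0), (-2.0, -0.5)]
lbl = [1, 1, -1, -1]
w, gamma = train(pts, lbl)
preds = [classify(w, gamma, p) for p in pts]
```

On linearly separable data this rule is known to terminate after finitely many updates (the perceptron convergence theorem).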
3. GEOMETRY OF A PERCEPTRON GEOMETRIC INTERPRETATION:
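The figure from this slide is not reproduced in the transcript. Geometrically, under the standard interpretation, the perceptron's decision boundary is the hyperplane of points x satisfying w·x = γ:

```latex
\{\, x : x \cdot w = \gamma \,\} \quad \text{(separating plane)} \\
x \cdot w > \gamma \;\Rightarrow\; x \in A, \qquad
x \cdot w < \gamma \;\Rightarrow\; x \in B
```

The weight vector w is normal to the plane, and γ/‖w‖ is the plane's distance from the origin.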
4. TRAINING: LINEARLY SEPARABLE CASE
4.1. BASIC PROBLEM (PRIMAL-DUAL)
PRIMAL PROBLEM:
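The primal formulation itself is missing from the transcript. A standard strict-separation primal, stated here as an assumption (A and B are the matrices whose rows are the training points of the two classes, and e is a vector of ones):

```latex
\text{find } w,\ \gamma \quad \text{such that} \quad
Aw \ge e\gamma + e, \qquad Bw \le e\gamma - e
```

Each point of A then lies strictly on one side of the plane x·w = γ and each point of B strictly on the other.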
4. TRAINING: LINEARLY SEPARABLE CASE
4.1. BASIC PROBLEM (PRIMAL-DUAL)
GRAPHICAL INTERPRETATION OF THE PRIMAL PROBLEM:
4. TRAINING: LINEARLY SEPARABLE CASE
4.1. BASIC PROBLEM (PRIMAL-DUAL)
DUAL PROBLEM:
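The dual slide's formulation is also missing. In the geometric reading emphasized by Bennett and Bredensteiner (the exact form below is an assumption), the dual problem finds the closest pair of points in the convex hulls of the two classes:

```latex
\min_{u,\,v}\ \bigl\| A^{T}u - B^{T}v \bigr\|
\quad \text{s.t.} \quad e^{T}u = 1, \quad e^{T}v = 1, \quad u \ge 0,\ v \ge 0
```

The separating plane can then be taken orthogonal to the segment joining the two nearest hull points, bisecting it.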
4. TRAINING: LINEARLY SEPARABLE CASE
4.1. BASIC PROBLEM (PRIMAL-DUAL)
GRAPHICAL INTERPRETATION OF THE DUAL PROBLEM:
4. TRAINING: LINEARLY SEPARABLE CASE
4.2. FIRST SIMPLIFICATION: THE MULTISURFACE METHOD (MSM)
4. TRAINING: LINEARLY SEPARABLE CASE
4.3. SECOND SIMPLIFICATION: THE OPTIMAL PLANE
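The formulation is again absent from the transcript. The "optimal plane" is conventionally the maximum-margin separating plane, obtained (under the usual formulation, assumed here) by minimizing the norm of w subject to the separation constraints:

```latex
\min_{w,\,\gamma}\ \tfrac{1}{2}\,\|w\|^{2}
\quad \text{s.t.} \quad Aw \ge e\gamma + e, \qquad Bw \le e\gamma - e
```

Since the distance between the bounding planes x·w = γ + 1 and x·w = γ − 1 is 2/‖w‖, minimizing ‖w‖ maximizes the margin.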
5. TRAINING: LINEARLY INSEPARABLE CASE
• General approach: minimize the misclassification error
• Start from the Multisurface Method (MSM)
5. TRAINING: LINEARLY INSEPARABLE CASE
5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP)
• Minimize the sum of the misclassification errors
5. TRAINING: LINEARLY INSEPARABLE CASE
5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP)
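The RLP slide's linear program is not in the transcript. The idea is to minimize the average violation of the separation constraints a·w ≥ γ + 1 (class A) and b·w ≤ γ − 1 (class B). The report solves this as a linear program; the plain subgradient descent below is an illustrative substitute that minimizes the same piecewise-linear objective, with hypothetical data:

```python
# RLP-style training sketch: minimize the average violation of the
# separation constraints. Subgradient descent stands in for the linear
# program used in the report.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def rlp_objective(A, B, w, gamma):
    """Average constraint violation per class (the RLP objective)."""
    errA = sum(max(0.0, gamma + 1.0 - dot(a, w)) for a in A) / len(A)
    errB = sum(max(0.0, dot(b, w) - gamma + 1.0) for b in B) / len(B)
    return errA + errB

def rlp_descent(A, B, steps=2000, eta=0.05):
    n = len(A[0])
    w, gamma = [0.0] * n, 0.0
    for _ in range(steps):
        gw, ggamma = [0.0] * n, 0.0
        for a in A:                       # violated A-constraints pull w toward a
            if gamma + 1.0 - dot(a, w) > 0.0:
                for i in range(n):
                    gw[i] -= a[i] / len(A)
                ggamma += 1.0 / len(A)
        for b in B:                       # violated B-constraints push w away from b
            if dot(b, w) - gamma + 1.0 > 0.0:
                for i in range(n):
                    gw[i] += b[i] / len(B)
                ggamma -= 1.0 / len(B)
        for i in range(n):
            w[i] -= eta * gw[i]
        gamma -= eta * ggamma
    return w, gamma

A = [(2.0, 2.0), (3.0, 1.0)]     # hypothetical class A points
B = [(-1.0, -1.5), (-2.0, 0.0)]  # hypothetical class B points
w, gamma = rlp_descent(A, B)
```

Averaging the errors per class (rather than summing over all points) is what makes the approach "robust": it prevents the degenerate solution w = 0 and keeps a large class from dominating a small one.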
5. TRAINING: LINEARLY INSEPARABLE CASE
5.2. COMBINATIONS OF MSM AND RLP: GENERALIZED OPTIMAL PLANE (GOP)
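The GOP formulation is not in the transcript. A hedged reconstruction (slack vectors y and z measure the constraint violations, m and k are the class sizes; treat the exact weighting as an assumption): GOP trades off the RLP misclassification error against the margin term of the optimal plane through a parameter λ ∈ (0, 1):

```latex
\min_{w,\gamma,y,z}\ (1-\lambda)\left(\frac{e^{T}y}{m} + \frac{e^{T}z}{k}\right)
 + \frac{\lambda}{2}\,\|w\|^{2}
\quad \text{s.t.} \quad Aw \ge e\gamma + e - y, \quad Bw \le e\gamma - e + z, \quad y,\ z \ge 0
```

Larger λ favors a wider margin; smaller λ favors fewer misclassified training points.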
5. TRAINING: LINEARLY INSEPARABLE CASE
5.3. COMBINATIONS OF MSM AND RLP: PERTURBED ROBUST LINEAR PROGRAMMING (RLP-P)
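The RLP-P formulation is likewise missing. The idea is to perturb the RLP objective by a small multiple λ of a norm of w; the 1-norm is a common choice because it keeps the problem a linear program, but the specific perturbation term shown here is an assumption:

```latex
\min_{w,\gamma,y,z}\ (1-\lambda)\left(\frac{e^{T}y}{m} + \frac{e^{T}z}{k}\right)
 + \lambda\,\|w\|_{1}
\quad \text{s.t.} \quad Aw \ge e\gamma + e - y, \quad Bw \le e\gamma - e + z, \quad y,\ z \ge 0
```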
6. APPLICATIONS
• Determination of heart disease
• Diagnosis of breast cancer
• Voting patterns of congressmen to determine party affiliation
• Using sonar signals to distinguish between mines and rocks
7. COMPUTATIONAL RESULTS (USING MINOS)
(The sonar data set is completely separable.)
8. CONCLUSION
• A perceptron classifies points from two sets
• Perfect classification is possible only in the linearly separable case
• Optimization models for training a perceptron were presented
• Experiments showed the best performance for the Generalized Optimal Plane (GOP) and Perturbed Robust Linear Programming (RLP-P)
9. REMARKS
• Interesting connection between learning processes and geometry
• Limited to two classes
• For RLP-P and GOP, the right selection of the parameter lambda is very important