MACHINE LEARNING TECHNIQUES IN IMAGE PROCESSING CSCI 8810 Course Project By Kaan Tariman M.S. in Computer Science
Outline • Introduction to Machine Learning • The example application • Machine Learning Methods • Decision Trees • Artificial Neural Networks • Instance-Based Learning
What is Machine Learning • Machine Learning (ML) is the construction of computer programs that develop solutions and improve with experience • Solves problems that cannot be solved by enumerative methods or calculus-based techniques • The intuition is to model the human way of solving problems that require experience • When the relationships among all system variables are completely understood, ML is not needed
A Generic System (diagram): input variables, hidden variables, output variables
Learning Task • Face recognition problem: whose face is in this picture? • Hard to model an explicit description of the face and its components • Humans recognize faces through experience: the more faces we see, the faster we perceive them.
The example • A vision module for Sony Aibo robots, developed for the Legged Robot League at RoboCup 2002. • The output of the module is the distance and orientation of the target objects: • the ball, • the players, • the goals, • the beacons (used for localization of the robot).
Main ML Methods • Decision Trees • Artificial Neural Networks (ANN) • Instance-Based Learning • Bayesian Methods • Reinforcement Learning • Inductive Logic Programming (ILP) • Genetic Algorithms (GA) • Support Vector Machines (SVM)
Decision Trees • Approximate a discrete-valued function with a decision tree • Internal nodes of the tree test attributes; leaves hold the values of the function • Ex: a decision tree for “play tennis”
Algorithm to derive a tree • Until each leaf node is populated by as homogeneous a sample set as possible: • Select a leaf node with an inhomogeneous sample set. • Replace that leaf node by a test node that divides the inhomogeneous sample set into minimally inhomogeneous subsets, according to an entropy calculation.
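To make the entropy-driven split selection concrete, here is a minimal Python sketch of the entropy and information-gain calculations; the labels and the candidate split are illustrative toy data, not taken from the project.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels (how inhomogeneous the set is)."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction achieved by splitting `labels` into `groups`."""
    total = len(labels)
    remainder = sum(len(g) / total * entropy(g) for g in groups)
    return entropy(labels) - remainder

# Toy example: a test that separates the labels perfectly has maximal gain.
labels = ["red", "red", "blue", "blue", "blue"]
split = [["red", "red"], ["blue", "blue", "blue"]]
print(information_gain(labels, split))   # ~0.97 bits, a pure split
```

The tree-growing loop above would evaluate this gain for each candidate test and replace the leaf with the test whose subsets are least inhomogeneous.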
Color Classification • The data set consists of pixel values manually labeled with color classes • The tree classifies a pixel into a color class according to its Y, U, V values • Adaptable to different conditions.
How do we construct the data set? 1) Open an image taken by the robot
How do we construct the data set? 2) Label the pixels with colors; a [Y, U, V, color] entry is created for each labeled pixel
How do we construct the data set? 3) Use the ML method and display results
The decision tree output • The data set is divided into training and validation sets • After training, the tree is evaluated on the validation set • Training should be done carefully, avoiding bias.
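A minimal sketch of how such a pixel classifier could be trained and validated, using scikit-learn's DecisionTreeClassifier as a stand-in for the project's tree learner; the [Y, U, V, color] data below is random placeholder data, not the robot's actual pixel samples.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Placeholder [Y, U, V] pixel values with manually assigned color labels.
rng = np.random.default_rng(0)
X = rng.integers(0, 256, size=(1000, 3))          # Y, U, V channels
y = rng.choice(["ball", "goal", "beacon", "field"], size=1000)

# Split into training and validation sets, as the slides describe.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(criterion="entropy", max_depth=8)
tree.fit(X_train, y_train)

# Evaluate on the held-out validation set to check for bias/overfitting.
print("validation accuracy:", tree.score(X_val, y_val))
```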
Artificial Neural Networks (ANN) • Made up of interconnected processing elements that respond in parallel to the set of input signals given to each
ANN Algorithm • The training algorithm adjusts the weights to reduce the error between the known output values and the actual outputs • At first, the outputs are arbitrary • As training cases are presented repeatedly, the ANN gives more correct answers • A test set is used to decide when to stop training • The ANN is validated on unseen data (the validation set)
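A minimal sketch of this train/stop/validate loop, using scikit-learn's MLPClassifier with early stopping as a stand-in for the original training algorithm; the data set here is synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for the real training cases.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Backpropagation repeatedly adjusts the weights to reduce output error;
# early_stopping holds out part of the data to decide when to stop training.
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500,
                    early_stopping=True, validation_fraction=0.2,
                    random_state=0)
net.fit(X, y)
print("training accuracy:", net.score(X, y))
```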
Face Recognition with ANN • Problem: orientation of the face • Input nodes are the pixel values of the image (32 x 32) • The output layer has 4 nodes (right, left, up, straight) • 6 hidden nodes
Face Recognition with ANN • Hidden nodes normally do not reveal anything interpretable; in this case, however, we can observe some meaningful behavior.
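A sketch of the architecture just described (32 x 32 pixel inputs, 6 hidden nodes, 4 orientation outputs), again using scikit-learn's MLPClassifier with placeholder images rather than the original face data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder 32x32 grayscale face images, flattened to 1024 inputs each.
rng = np.random.default_rng(2)
X = rng.random((200, 32 * 32))
y = rng.choice(["left", "right", "up", "straight"], size=200)

# One hidden layer with 6 units and four output classes, as on the slide.
net = MLPClassifier(hidden_layer_sizes=(6,), max_iter=300, random_state=0)
net.fit(X, y)
print(net.predict(X[:5]))
```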
Instance-Based Learning • A learn-by-memorizing method: k-Nearest Neighbor (k-NN) • Given a data set {Xi, Yi}, it estimates values of Y for X's other than those in the sample. • The process is to choose the k values of Xi nearest to X and average their Y values. • Here k is a parameter to the estimator. The average can be weighted, e.g. with the closest neighbor having the most impact on the estimate.
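A minimal NumPy sketch of the distance-weighted k-NN estimate described above; the function name and toy data are illustrative, not part of the project.

```python
import numpy as np

def knn_estimate(X, Y, x_query, k=3):
    """Estimate Y at x_query by averaging the Y values of the k nearest X's,
    weighting closer neighbours more heavily."""
    dists = np.linalg.norm(X - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-9)   # closest neighbour has most impact
    return np.average(Y[nearest], weights=weights)

# Toy 1-D example: the estimate falls between the nearest samples' Y values.
X = np.array([[1.0], [2.0], [3.0], [10.0]])
Y = np.array([1.0, 2.0, 3.0, 10.0])
print(knn_estimate(X, Y, np.array([2.5]), k=3))
```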
k-NN facts • A database of knowledge about known instances is required – memory complexity • “Lazy learning”: no explicit model of the hypothesis is built • Ex: color classification • A voting method is applied in order to output a color class for the pixel.
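A sketch of this voting variant for pixel color classification, using scikit-learn's KNeighborsClassifier on placeholder [Y, U, V] data with assumed color labels.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Placeholder [Y, U, V] pixels with manually assigned color labels.
rng = np.random.default_rng(3)
X = rng.integers(0, 256, size=(500, 3))
y = rng.choice(["orange", "green", "white", "yellow"], size=500)

# Each query pixel is assigned the color that wins the vote among its k neighbours.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X, y)
print(knn.predict([[128, 90, 200]]))
```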
Summary • Machine Learning is an interdisciplinary field involving programs that improve with experience • ML works well for image-processing problems such as pattern recognition, object extraction, and color classification. • Three methods were considered: • Decision Trees • Artificial Neural Networks • Instance-Based Learning