Object Recognition By A. Sravya
Object recognition problem • Given some knowledge of how certain objects may appear and an image of a scene possibly containing those objects, report which objects are present in the scene and where.
Applications • Image panoramas • Image watermarking • Global robot localization • Face detection • Optical character recognition • Manufacturing quality control • Content-based image indexing • Object counting and monitoring • Automated vehicle parking systems • Visual positioning and tracking • Video stabilization
Introduction • Pattern or object: an arrangement of descriptors (features) • Pattern class: a family of patterns that share some common properties • Pattern recognition: techniques for assigning patterns to their respective classes • Common pattern arrangements: 1. vectors (for quantitative descriptors) 2. strings 3. trees (strings and trees are for structural descriptors) • Approaches to pattern recognition: • Decision-theoretic – uses quantitative descriptors • Structural – uses qualitative descriptors
Patterns and pattern classes – vector example A pattern vector is represented as x = [x_1, x_2, ..., x_n]^T, where x_i represents the i-th descriptor and n is the number of descriptors associated with the pattern. Example: consider three types of iris flowers – setosa, virginica and versicolor. Each flower is described by its petal length and width, so the pattern vector is x = [x_1, x_2]^T, where x_1 = petal length and x_2 = petal width (see the sketch below).
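As a minimal illustration (not part of the original slides), here is how such pattern vectors might be formed in Python; the petal measurements are made-up values, not real iris data:

```python
import numpy as np

# Each pattern is a vector x = [x_1, x_2]^T of descriptors:
# x_1 = petal length (cm), x_2 = petal width (cm).
# The measurement values below are invented for illustration.
setosa_sample     = np.array([1.4, 0.2])   # one Iris setosa flower
versicolor_sample = np.array([4.7, 1.4])   # one Iris versicolor flower
virginica_sample  = np.array([6.0, 2.5])   # one Iris virginica flower

patterns = np.stack([setosa_sample, versicolor_sample, virginica_sample])
print(patterns.shape)   # (3, 2): three patterns, n = 2 descriptors each
```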
Patterns and pattern classes – another vector example • Here is another example of pattern vector generation. • In this case, we are interested in different types of noisy shapes.
Strings and trees • Recognition problems in which class membership is determined not only by quantitative measures of each feature but also by the spatial relationships between the features are solved by the structural approach. Example: fingerprint recognition • Strings • String descriptions generate patterns of objects whose structure is based on the relatively simple connectivity of primitives, usually associated with boundary shape
String example • String of symbols: w = ... abababab ...
Trees • Tree descriptors are more powerful than strings • Most hierarchical ordering schemes lead to tree structures Example:
Recognition based on decision-theoretic methods These methods are based on the use of decision functions d(x). For W pattern classes we find W decision functions d_1(x), d_2(x), ..., d_W(x) with the property that, if a pattern x belongs to class ω_i, then d_i(x) > d_j(x) for j = 1, 2, ..., W; j ≠ i (1) The decision boundary separating class ω_i from class ω_j is given by d_ij(x) = d_i(x) - d_j(x) = 0 The objective now is to develop various approaches for finding decision functions that satisfy Eq. (1)
Decision-theoretic methods – matching • Here we represent each class by a prototype pattern vector • An unknown pattern is assigned to the class to which it is closest in terms of a predefined measure • The two approaches are: • Minimum distance classifier – based on the Euclidean distance • Correlation
Minimum distance classifier The prototype pattern vector of class ω_j is the mean of its training patterns: m_j = (1/N_j) Σ_{x ∈ ω_j} x, j = 1, 2, ..., W Calculate the Euclidean distance between the unknown vector and each prototype vector: D_j(x) = ||x - m_j|| and assign x to the class with the smallest distance. Equivalently, this distance measure yields the decision function d_j(x) = x^T m_j - (1/2) m_j^T m_j which takes the largest numerical value for the best-matching class
Contd.. The decision boundary between classes ω_i and ω_j is d_ij(x) = d_i(x) - d_j(x) = x^T (m_i - m_j) - (1/2) (m_i - m_j)^T (m_i + m_j) = 0 which is the perpendicular bisector of the line segment joining m_i and m_j If d_ij(x) > 0, then x belongs to ω_i; if d_ij(x) < 0, then x belongs to ω_j (a sketch of this classifier follows)
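A minimal Python sketch of the minimum distance classifier described above; the training data and class labels are invented for illustration:

```python
import numpy as np

def train_prototypes(X, y):
    """Prototype m_j = mean vector of the training patterns of class j."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(x, prototypes):
    """Assign x to the class whose prototype is nearest (Euclidean).

    Equivalent to maximizing d_j(x) = x^T m_j - (1/2) m_j^T m_j."""
    return min(prototypes, key=lambda j: np.linalg.norm(x - prototypes[j]))

# Illustrative 2-D data: two classes scattered around different means.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([1, 1], 0.3, (20, 2)),
               rng.normal([4, 4], 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
protos = train_prototypes(X, y)
print(classify(np.array([3.8, 4.1]), protos))   # expected: 1
```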
Matching by correlation • Correlation is used for finding matches of a subimage w(x, y) of size J x K within an image f(x, y) of size M x N • The correlation between w(x, y) and f(x, y) is given by c(x, y) = Σ_s Σ_t w(s, t) f(x + s, y + t) for x = 0, 1, ..., M - 1 and y = 0, 1, ..., N - 1
Contd.. • The maximum value of c(x, y) indicates the position where w best matches f (see the sketch below)
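A brute-force Python sketch of matching by correlation, assuming small arrays (a practical implementation would use normalized or FFT-based correlation instead, since raw correlation is biased toward bright image regions):

```python
import numpy as np

def match_by_correlation(f, w):
    """Slide template w over image f; return the best-match location.

    c(x, y) = sum_s sum_t w(s, t) * f(x + s, y + t)  (raw correlation).
    """
    M, N = f.shape
    J, K = w.shape
    best, best_pos = -np.inf, (0, 0)
    for x in range(M - J + 1):
        for y in range(N - K + 1):
            c = np.sum(w * f[x:x + J, y:y + K])
            if c > best:
                best, best_pos = c, (x, y)
    return best_pos

# Toy example: plant the template inside a zero image and recover it.
f = np.zeros((8, 8))
w = np.arange(9.0).reshape(3, 3)
f[2:5, 3:6] = w
print(match_by_correlation(f, w))   # (2, 3)
```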
Optimum statistical classifiers • This is a probabilistic approach to pattern recognition • Average loss: the average loss incurred in assigning x to class ω_j is r_j(x) = Σ_{k=1}^{W} L_kj p(ω_k | x) where L_kj is the loss for assigning a pattern that actually belongs to class ω_k to class ω_j • The classifier that minimizes the total average loss is called the Bayes classifier
Optimum statistical classifiers The Bayes classifier assigns an unknown pattern x to class ω_i if r_i(x) < r_j(x) for j = 1, 2, ..., W; j ≠ i The loss for a correct decision is assigned '0' and for an incorrect decision '1' (the 0-1 loss function)
Optimum statistical classifiers Under the 0-1 loss, this is further simplified to: assign x to class ω_i if p(x|ω_i) P(ω_i) > p(x|ω_j) P(ω_j) for all j ≠ i Finally, the Bayes decision function (BDF) for class ω_j is d_j(x) = p(x|ω_j) P(ω_j), j = 1, 2, ..., W The BDF depends on the pdfs of the patterns in each class and on the probability of occurrence of each class. Sample patterns are assigned to each class and the necessary parameters are then estimated from them. The most commonly used form for p(x|ω_j) is the Gaussian pdf
Bayesian classifier for Gaussian pattern classes For n = 1 and W = 2, the Bayes decision function for Gaussian pattern classes is d_j(x) = p(x|ω_j) P(ω_j) = (1 / (√(2π) σ_j)) e^{-(x - m_j)^2 / (2σ_j^2)} P(ω_j), j = 1, 2 where m_j and σ_j are the mean and standard deviation of the patterns of class ω_j
Bayesian classifier for Gaussian pattern classes • In the n-dimensional case, the classes are characterized by mean vectors m_j and covariance matrices C_j. Under the 0-1 loss function, taking logarithms gives the Bayesian decision function for Gaussian pattern classes: d_j(x) = ln P(ω_j) - (1/2) ln |C_j| - (1/2) (x - m_j)^T C_j^{-1} (x - m_j)
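A minimal sketch of this n-dimensional Bayes decision function; the means, covariances, and priors below are illustrative assumptions:

```python
import numpy as np

def gaussian_bdf(x, m, C, prior):
    """Bayes decision function for one Gaussian pattern class:
    d_j(x) = ln P(w_j) - 0.5 ln|C_j| - 0.5 (x - m_j)^T C_j^{-1} (x - m_j)."""
    d = x - m
    return (np.log(prior)
            - 0.5 * np.log(np.linalg.det(C))
            - 0.5 * d @ np.linalg.inv(C) @ d)

def bayes_classify(x, params):
    """params: list of (mean, covariance, prior) per class; pick argmax d_j."""
    scores = [gaussian_bdf(x, m, C, p) for (m, C, p) in params]
    return int(np.argmax(scores))

# Two illustrative Gaussian classes with equal priors.
params = [(np.array([0.0, 0.0]), np.eye(2), 0.5),
          (np.array([3.0, 3.0]), np.eye(2), 0.5)]
print(bayes_classify(np.array([2.5, 2.8]), params))   # 1
```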
Bayesian classifier for Gaussian pattern classes • The BDF reduces to the minimum distance classifier if: 1. the pattern classes are Gaussian 2. all covariance matrices are equal to the identity matrix 3. all classes are equally likely to occur • Therefore the minimum distance classifier is optimum in the Bayes sense if the above conditions are satisfied
Neural Networks Neural network: information processing paradigm inspired by biological nervous systems, such as our brain Structure: large number of highly interconnected processing elements (neurons) working together Neurons are arranged in layers
Neural Networks Each neuron within the network is usually a simple processing unit which takes one or more inputs and produces an output. At each neuron, every input has an associated weight which modifies the strength of that input. The neuron simply adds together all the weighted inputs and computes its output (a minimal sketch follows).
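A one-neuron sketch of this weighted-sum-plus-activation behaviour; the sigmoid activation and the sample values are illustrative choices, not from the original slides:

```python
import numpy as np

def neuron(inputs, weights, bias=0.0):
    """One processing unit: each input is scaled by its associated
    weight, the results are summed, and a sigmoid maps the sum to
    the output (the sigmoid is an assumed, common choice)."""
    s = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-s))

# Three inputs, each modified by its associated weight.
print(neuron(np.array([0.5, 0.1, 0.9]), np.array([0.8, -0.4, 0.3])))
```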
NN contd.. • Neurons: elemental nonlinear computing elements • We use these networks to adaptively develop the coefficients of decision functions via successive presentations of training sets of patterns • Training patterns: sample patterns used to estimate the desired parameters • Training set: a set of such patterns from each class • Learning or training: the process by which a training set is used to obtain decision functions • Perceptron model: the basic model of a neuron • Perceptrons are learning machines
Training algorithms – linearly separable classes Let y(k) be the (augmented) pattern presented at the kth training step and c > 0 a correction increment. If y(k) belongs to ω_1 and w^T(k) y(k) ≤ 0, then w(k+1) = w(k) + c y(k) If y(k) belongs to ω_2 and w^T(k) y(k) ≥ 0, then w(k+1) = w(k) - c y(k) Otherwise, w(k+1) = w(k) This algorithm makes a change in w only if the pattern being considered at the kth step in the training sequence is misclassified (see the sketch below)
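A Python sketch of this perceptron training rule; the two small point sets are invented, linearly separable examples:

```python
import numpy as np

def perceptron_train(X1, X2, c=1.0, epochs=100):
    """Perceptron rule for two linearly separable classes.
    Patterns are augmented with a trailing 1 so the bias is part of w."""
    Y1 = np.hstack([X1, np.ones((len(X1), 1))])   # class w_1
    Y2 = np.hstack([X2, np.ones((len(X2), 1))])   # class w_2
    w = np.zeros(Y1.shape[1])
    for _ in range(epochs):
        changed = False
        for y in Y1:                      # want w^T y > 0
            if w @ y <= 0:
                w += c * y; changed = True
        for y in Y2:                      # want w^T y < 0
            if w @ y >= 0:
                w -= c * y; changed = True
        if not changed:                   # no misclassification: converged
            return w
    return w

w = perceptron_train(np.array([[2.0, 2.0], [3.0, 2.5]]),
                     np.array([[0.0, 0.0], [0.5, -0.5]]))
print(w)   # separating hyperplane w^T y = 0
```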
Training algorithms – nonseparable classes • This method minimizes the error between the actual and the desired response. The criterion function is J(w) = (1/2) (r - w^T y)^2 where r is the desired response. From the gradient descent algorithm, the weight update is w(k+1) = w(k) + α [r(k) - w^T(k) y(k)] y(k) where α > 0 is the learning increment
Training algorithms – nonseparable classes Changing the weights in this way reduces the error by a factor α ||y(k)||^2: with e(k) = r(k) - w^T(k) y(k), the change in error is Δe(k) = -α ||y(k)||^2 e(k) (see the sketch below)
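A small sketch of this delta-rule (Widrow-Hoff) update, showing the error shrinking on repeated presentations of one pattern; the pattern, desired response, and α are illustrative assumptions:

```python
import numpy as np

def delta_rule_step(w, y, r, alpha=0.1):
    """One gradient-descent step on J(w) = 0.5 * (r - w^T y)^2."""
    e = r - w @ y              # current error e(k)
    return w + alpha * e * y   # w(k+1) = w(k) + alpha * e(k) * y(k)

# Repeated presentations of one training pair drive the error down.
w = np.zeros(3)
y, r = np.array([1.0, 0.5, 1.0]), 1.0   # augmented pattern, desired response
for k in range(5):
    w = delta_rule_step(w, y, r)
    print(k, r - w @ y)   # error shrinks by (1 - alpha * ||y||^2) each step
```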
Multilayer feedforward neural networks • We focus on decision functions of multiclass pattern recognition problems, independent of whether the classes are separable or not
Multilayer feedforward neural networks • The activation element of each node is a sigmoid function • The input to the activation element of node j in layer J is I_j = Σ_k w_jk O_k, the weighted sum of the outputs O_k of the preceding layer K • The outputs of layer K are O_k = h_k(I_k) • The sigmoid activation function is h_j(I_j) = 1 / (1 + e^{-(I_j + θ_j)/θ_0}) where θ_j is an offset and θ_0 controls the shape of the sigmoid (a forward-pass sketch follows)
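A minimal forward-pass sketch under these definitions; the layer sizes and random weights are arbitrary illustrations:

```python
import numpy as np

def sigmoid(I, theta=0.0, theta0=1.0):
    """Activation h(I) = 1 / (1 + exp(-(I + theta) / theta0))."""
    return 1.0 / (1.0 + np.exp(-(I + theta) / theta0))

def forward(x, weights):
    """Propagate input x layer by layer: each node forms the weighted
    sum I_j = sum_k w_jk O_k and outputs O_j = h(I_j)."""
    O = x
    for W in weights:          # one weight matrix per layer
        O = sigmoid(W @ O)
    return O

rng = np.random.default_rng(1)
weights = [rng.normal(0, 0.5, (4, 3)),   # 3 inputs -> 4 hidden nodes
           rng.normal(0, 0.5, (2, 4))]   # 4 hidden -> 2 output nodes
print(forward(np.array([0.2, 0.7, 0.1]), weights))
```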
Training by back propagation • We begin by concentrating on the output layer • The process starts with an arbitrary set of weights throughout the network • The generalized delta rule has two basic phases: • Phase 1 • A training vector is propagated through the layers to compute the output O_j for each node • The outputs O_q of the nodes in the output layer are then compared against their desired responses r_q to generate the error terms δ_q • Phase 2 • A backward pass through the network during which the appropriate error signal is passed to each node and the corresponding weight changes are made (a sketch of both phases follows)
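A compact sketch of the two phases for a single hidden layer, using sigmoid activations; the network sizes, learning rate, and training pair are illustrative assumptions:

```python
import numpy as np

def sigmoid(I):
    return 1.0 / (1.0 + np.exp(-I))

def backprop_step(x, r, W1, W2, alpha=0.5):
    """One pass of the generalized delta rule (sigmoid activations)."""
    # Phase 1: forward pass computes the output O_j of every node.
    O1 = sigmoid(W1 @ x)                 # hidden layer outputs
    O2 = sigmoid(W2 @ O1)                # output layer outputs O_q
    # Phase 2: error terms (sigmoid derivative is O * (1 - O)),
    # propagated backward from the output layer.
    d2 = (r - O2) * O2 * (1 - O2)        # output layer: delta_q
    d1 = O1 * (1 - O1) * (W2.T @ d2)     # hidden layer: delta_j
    # Weight changes proportional to delta times the layer's input.
    W2 += alpha * np.outer(d2, O1)
    W1 += alpha * np.outer(d1, x)
    return W1, W2, O2

rng = np.random.default_rng(2)
W1, W2 = rng.normal(0, 0.5, (3, 2)), rng.normal(0, 0.5, (1, 3))
x, r = np.array([0.0, 1.0]), np.array([1.0])
for _ in range(200):
    W1, W2, out = backprop_step(x, r, W1, W2)
print(out)   # output approaches the desired response r
```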
Performance of a neural network as a function of noise level
Improvement in performance by increasing the number of training patterns