Soft Computing Methods J.A. Johnson Dept. of Math and Computer Science Seminar Series February 8, 2013
Outline • Fuzzy Sets • Neural Nets • Rough Sets • Bayesian Nets • Genetic Algorithms
Fuzzy Sets • Fuzzy set theory is a means of specifying how well an object satisfies a vague description. • A fuzzy set can be defined as a set with fuzzy boundaries. • Fuzzy sets were first introduced by Zadeh (1965).
How do we represent a fuzzy set in a computer? First, the membership function must be determined.
Example • Consider the proposition "Nate is tall." • Is the proposition true if Nate is 5' 10"? • The linguistic term "tall" does not refer to a sharp demarcation of objects into two classes—there are degrees of tallness.
Fuzzy set theory treats Tall as a fuzzy predicate and says that the truth value of Tall(Nate) is a number between 0 and 1, rather than being either true or false.
Let A denote the fuzzy set of all tall employees and x be a member of the universe X of all employees. What would the membership function μA(x) look like?
• μA(x) = 1 if x is definitely tall • μA(x) = 0 if x is definitely not tall • 0 < μA(x) < 1 for borderline cases
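The three cases above can be sketched as a concrete membership function. This is a minimal piecewise-linear example; the breakpoints (68 and 74 inches) are illustrative choices, not values from the slides.

```python
# A possible piecewise-linear membership function for "tall".
# The breakpoints 68 in (5'8") and 74 in (6'2") are illustrative
# assumptions, not taken from the slides.
def mu_tall(height_in: float) -> float:
    if height_in <= 68:
        return 0.0          # definitely not tall
    if height_in >= 74:
        return 1.0          # definitely tall
    return (height_in - 68) / (74 - 68)  # borderline: linear ramp

print(mu_tall(64))  # -> 0.0
print(mu_tall(76))  # -> 1.0
print(mu_tall(70))  # Nate at 5'10": ~0.33, a borderline case
```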
[Figure: membership functions of a classical (crisp) set and a fuzzy set]
Standard fuzzy set operations • Complement: μ¬A(x) = 1 − μA(x) • Intersection: μA∩B(x) = min[μA(x), μB(x)] • Union: μA∪B(x) = max[μA(x), μB(x)]
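The three standard operations can be sketched directly, representing a fuzzy set as a mapping from elements to membership degrees. The example sets ("tall", "heavy") and their degrees are invented for illustration.

```python
# Minimal sketch of the standard fuzzy set operations, with a fuzzy
# set represented as a dict mapping element -> membership degree.
def f_complement(a):
    return {x: 1 - m for x, m in a.items()}

def f_intersection(a, b):
    # membership-wise minimum over both universes
    return {x: min(a.get(x, 0), b.get(x, 0)) for x in set(a) | set(b)}

def f_union(a, b):
    # membership-wise maximum over both universes
    return {x: max(a.get(x, 0), b.get(x, 0)) for x in set(a) | set(b)}

tall  = {"Nate": 0.3, "Ann": 0.9}   # illustrative degrees
heavy = {"Nate": 0.7, "Ann": 0.4}
print(f_intersection(tall, heavy))  # minimum of the two memberships
print(f_union(tall, heavy))         # maximum of the two memberships
```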
Linguistic variables and hedges • The range of possible values of a linguistic variable represents the universe of discourse of that variable. • A linguistic variable carries with it the concept of fuzzy set qualifiers, called hedges. Hedges are terms that modify the shape of fuzzy sets.
For instance, the qualifier “very” performs concentration and creates a new subset (as do other intensifiers such as “extremely”). • The opposite operation, dilation, expands the set (e.g., “more or less”, “somewhat”).
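Concentration and dilation are commonly modelled by squaring and square-rooting the membership degree; a minimal sketch under that assumption:

```python
import math

# Common models for hedges (an assumption, standard in the fuzzy
# literature): concentration squares the membership degree,
# dilation takes its square root.
def very(mu):           # concentration: "very tall"
    return mu ** 2

def more_or_less(mu):   # dilation: "more or less tall"
    return math.sqrt(mu)

print(very(0.8))           # 0.64: "very tall" is a stricter subset
print(more_or_less(0.64))  # 0.8: dilation expands the set
```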
Representation of hedges [Table: common hedges with their mathematical expressions and graphical representations]
Fuzzy logic is not logic that is fuzzy, but logic that is used to describe fuzziness. • Fuzzy logic deals with degrees of truth.
Building a Fuzzy Expert System • Specify the problem and define linguistic variables. • Determine fuzzy sets. • Elicit and construct fuzzy rules. • Perform fuzzy inference. • Evaluate and tune the system.
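The five steps above can be sketched end-to-end on a toy problem. The linguistic variables ("temperature", "heater"), membership functions, rules, and the simple weighted-average defuzzification are all invented for illustration, not taken from the slides.

```python
# Toy fuzzy expert system: steps 1-5 on an invented heater example.
# Rules (illustrative): IF temperature is cold THEN heater is high
#                       IF temperature is hot  THEN heater is low

# Steps 1-2: linguistic variable "temperature" with two fuzzy sets
def cold(t):
    return max(0.0, min(1.0, (20 - t) / 20))

def hot(t):
    return max(0.0, min(1.0, (t - 10) / 20))

# Steps 3-4: fire each rule to the degree its antecedent holds,
# then defuzzify with a weighted average (a simple Sugeno-style choice).
def infer_heater(t):
    w_cold, w_hot = cold(t), hot(t)
    high, low = 100.0, 0.0        # crisp heater output levels
    return (w_cold * high + w_hot * low) / (w_cold + w_hot)

# Step 5: evaluate on a few inputs to tune the sets and rules
print(infer_heater(5))    # mostly cold -> 100.0
print(infer_heater(25))   # mostly hot  -> 0.0
print(infer_heater(15))   # in between  -> 50.0
```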
References [1] Michael Negnevitsky, Artificial Intelligence: A Guide to Intelligent Systems, 2nd ed. [2] Witold Pedrycz and Fernando Gomide, An Introduction to Fuzzy Sets. [3] George J. Klir and Bo Yuan, Fuzzy Sets and Fuzzy Logic: Theory and Applications. [4] W. B. Vasantha Kandasamy, Elementary Fuzzy Matrix Theory and Fuzzy Models for Social Scientists. [5] Wikipedia: http://en.wikipedia.org/wiki/Fuzzy_logic [6] Wikipedia: http://en.wikipedia.org/wiki/Fuzzy
References • http://www.softcomputing.net/fuzzy_chapter.pdf • http://www.cs.cmu.edu/Groups/AI/html/faqs/ai/fuzzy/part1/faq-doc-18.html • http://www.mv.helsinki.fi/home/niskanen/zimmermann_review.pdf • http://sawaal.ibibo.com/computers-and-technology/what-limits-fuzzy-logic-241157.html • http://my.safaribooksonline.com/book/software-engineering-and-development/9780763776473/fuzzy-logic/limitations_of_fuzzy_systems#X2ludGVybmFsX0ZsYXNoUmVhZGVyP3htbGlkPTk3ODA3NjM3NzY0NzMvMTUy
Thanks to • Ding Xu • Edwige Nounang Ngnadjo For help with researching content and preparation of overheads on Fuzzy Sets
Artificial Neural Networks • Neuron: the basic information-processing unit
[Figure: a single neuron, the basic information-processing unit, combining weighted inputs into an output]
Activation functions • The step and sign activation functions, also called hard-limit functions, are mostly used in decision-making neurons. • The sigmoid function transforms an input, which can have any value between plus and minus infinity, into a value in the range between 0 and 1. Neurons with this function are used in back-propagation networks. • The linear activation function provides an output equal to the neuron's weighted input. Neurons with the linear function are often used for linear approximation.
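The four activation functions described above can be sketched in a few lines:

```python
import math

# The four activation functions described above (a sketch).
def step(x):     # hard limiter: fires (1) when input is non-negative
    return 1 if x >= 0 else 0

def sign(x):     # symmetric hard limiter: +1 or -1
    return 1 if x >= 0 else -1

def sigmoid(x):  # squashes (-inf, +inf) into (0, 1)
    return 1 / (1 + math.exp(-x))

def linear(x):   # output equals the weighted input
    return x

print(step(-0.2), sign(-0.2), sigmoid(0), linear(3.5))  # 0 -1 0.5 3.5
```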
The single-neuron learning algorithm • Step 1: Initialization. Set the initial weights w1, w2, . . . , wn and the threshold to random numbers in the range [-0.5, 0.5]. • Step 2: Activation. Compute the neuron's output for the current inputs. • Step 3: Weight training. Update the weights to reduce the output error. • Step 4: Iteration. Increase iteration p by one, go back to Step 2, and repeat the process until convergence.
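Steps 1 to 4 can be sketched as a perceptron learning logical AND. The learning rate of 0.1, the delta-rule update, and the epoch cap are illustrative choices; only the [-0.5, 0.5] initialization range comes from the slides.

```python
import random

# Sketch of Steps 1-4: a single neuron learning logical AND.
def train_perceptron(samples, n_inputs, lr=0.1, max_epochs=100):
    random.seed(0)
    # Step 1: initialize weights and threshold in [-0.5, 0.5]
    w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    theta = random.uniform(-0.5, 0.5)
    for _ in range(max_epochs):              # Step 4: iterate
        errors = 0
        for x, target in samples:
            # Step 2: activation (step function on the weighted sum)
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else 0
            # Step 3: weight training (delta rule)
            e = target - y
            if e != 0:
                errors += 1
                w = [wi + lr * e * xi for wi, xi in zip(w, x)]
                theta -= lr * e
        if errors == 0:                      # converged
            break
    return w, theta

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta = train_perceptron(AND, 2)
predict = lambda x: 1 if w[0]*x[0] + w[1]*x[1] - theta >= 0 else 0
print([predict(x) for x, _ in AND])  # [0, 0, 0, 1]
```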
How the machine learns: weight training
References • http://pages.cs.wisc.edu/~bolo/shipyard/neural/local.html • Stuart J. Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, Prentice Hall, 2009. • http://www.roguewave.com/Portals/0/products/imsl-numerical-libraries/c-library/docs/6.0/stat/default.htm?turl=multilayerfeedforwardneuralnetworks.htm • Lynne E. Parker, Notes on Multilayer, Feedforward Neural Networks. • http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Why use neural networks
Thanks to • Hongming (Homer) Zuo • Danni Ren For help with researching content and preparation of overheads on Neural Nets
Rough Sets • Introduced by Zdzislaw Pawlak in the early 1980s. • A formal framework for the automated transformation of data into knowledge. • Simplifies the search for dominant attributes in an inconsistent information table, leading to the derivation of shorter if-then rules.
Certain rules for the examples are: (Temperature, normal) → (Flu, no); (Headache, yes) and (Temperature, high) → (Flu, yes); (Headache, yes) and (Temperature, very_high) → (Flu, yes). Uncertain (or possible) rules are: (Headache, no) → (Flu, no); (Temperature, high) → (Flu, yes); (Temperature, very_high) → (Flu, yes).
Strength of a Rule • Weights • Coverage = (# elements covered by rule) / (# elements in universe) • Support = (# positive elements covered by rule) / (# elements in universe) • Degree of certainty = (support / coverage) × 100
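The three measures can be sketched for one rule over a toy universe. The four rows below are invented for illustration; the rule evaluated is (Temperature, high) → (Flu, yes).

```python
# Toy universe: each row is (headache, temperature, flu).
universe = [
    ("yes", "high",      "yes"),
    ("no",  "high",      "no"),
    ("yes", "normal",    "no"),
    ("no",  "very_high", "yes"),
]

# Rule: (Temperature, high) -> (Flu, yes)
covered  = [r for r in universe if r[1] == "high"]   # rule applies
positive = [r for r in covered if r[2] == "yes"]     # and is correct

coverage  = len(covered) / len(universe)
support   = len(positive) / len(universe)
certainty = support / coverage * 100                 # percent

print(coverage, support, certainty)  # 0.5 0.25 50.0
```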
Attribute Reduction • Which are the dominant attributes? • How do we determine redundant attributes?
Indiscernibility Classes • An indiscernibility class, with respect to a set of attributes X, is defined as a set of examples all of whose values for the attributes x ∈ X agree • For example, the indiscernibility classes with respect to attributes X = {Headache, Temperature} are {e1}, {e2}, {e3}, {e4}, {e5, e7} and {e6, e8}
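Computing indiscernibility classes is a simple grouping operation. The table values below are a reconstruction, chosen so that grouping on {Headache, Temperature} reproduces the classes {e1}, {e2}, {e3}, {e4}, {e5, e7}, {e6, e8} from the slide.

```python
from collections import defaultdict

# Reconstructed toy table: attribute values are invented so the
# resulting classes match the slide's example.
table = {
    "e1": {"Headache": "yes", "Temperature": "normal"},
    "e2": {"Headache": "yes", "Temperature": "high"},
    "e3": {"Headache": "yes", "Temperature": "very_high"},
    "e4": {"Headache": "no",  "Temperature": "normal"},
    "e5": {"Headache": "no",  "Temperature": "high"},
    "e6": {"Headache": "no",  "Temperature": "very_high"},
    "e7": {"Headache": "no",  "Temperature": "high"},
    "e8": {"Headache": "no",  "Temperature": "very_high"},
}

def indiscernibility_classes(table, attrs):
    classes = defaultdict(set)
    for example, row in table.items():
        key = tuple(row[a] for a in attrs)  # same key -> same class
        classes[key].add(example)
    return sorted(classes.values(), key=min)

for c in indiscernibility_classes(table, ["Headache", "Temperature"]):
    print(sorted(c))
# prints ['e1'] ['e2'] ['e3'] ['e4'] ['e5', 'e7'] ['e6', 'e8'], one per line
```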
Defined by a lower approximation and an upper approximation • The lower approximation of X is the union of the indiscernibility classes fully contained in X: lower(X) = ∪ { [x] : [x] ⊆ X } • The upper approximation of X is the union of the indiscernibility classes that intersect X: upper(X) = ∪ { [x] : [x] ∩ X ≠ ∅ }
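Both approximations follow directly from the definitions. The indiscernibility classes below are the ones from the earlier example; the target set X is invented for illustration.

```python
# Lower and upper approximations of a target set X, given the
# indiscernibility classes (a sketch; X is an illustrative choice).
def approximations(classes, X):
    lower, upper = set(), set()
    for c in classes:
        if c <= X:    # class entirely inside X -> certainly in X
            lower |= c
        if c & X:     # class overlaps X -> possibly in X
            upper |= c
    return lower, upper

classes = [{"e1"}, {"e2"}, {"e3"}, {"e4"}, {"e5", "e7"}, {"e6", "e8"}]
X = {"e2", "e3", "e5"}  # illustrative target set
lower, upper = approximations(classes, X)
print(sorted(lower))  # ['e2', 'e3']
print(sorted(upper))  # ['e2', 'e3', 'e5', 'e7']
```

Note that e5 is in the upper but not the lower approximation: its class {e5, e7} overlaps X without being contained in it, which is exactly the boundary region the figure depicts.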
[Figure: lower and upper approximations of set X over classes e1–e8, with the lower approximation inside X and the upper approximation covering every class that overlaps X]
If the indiscernibility classes with and without attribute A are identical, then attribute A is redundant.
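This redundancy test amounts to comparing two partitions. A minimal sketch, with a three-row table invented so that one attribute is redundant and another is not:

```python
from collections import defaultdict

# Sketch: attribute A is redundant if dropping it leaves the
# indiscernibility partition unchanged (table invented here).
def partition(table, attrs):
    classes = defaultdict(set)
    for ex, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(ex)
    return {frozenset(c) for c in classes.values()}

def is_redundant(table, attrs, a):
    rest = [x for x in attrs if x != a]
    return partition(table, attrs) == partition(table, rest)

table = {
    "e1": {"Headache": "yes", "Temperature": "normal", "Nausea": "no"},
    "e2": {"Headache": "yes", "Temperature": "high",   "Nausea": "no"},
    "e3": {"Headache": "no",  "Temperature": "high",   "Nausea": "no"},
}
attrs = ["Headache", "Temperature", "Nausea"]
print(is_redundant(table, attrs, "Nausea"))    # True: constant, adds nothing
print(is_redundant(table, attrs, "Headache"))  # False: e2 and e3 would merge
```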
Mushroom Dataset • The dataset contains 8124 entries of different mushrooms • Each entry (mushroom) has 22 different attributes
The 22 different attributes: Cap-shape, Cap-surface, Cap-color, Bruises, Odor, Gill-attachment, Gill-spacing, Gill-size, Gill-color, Stalk-shape, Stalk-root, Stalk-surface-above-ring, Stalk-surface-below-ring, Stalk-color-above-ring, Stalk-color-below-ring, Veil-type, Veil-color, Ring-number, Ring-type, Spore-print-color, Population, Habitat
Values for Attributes • One of the attributes chosen is odor • All the possible values are: almond, anise, creosote, fishy, foul, musty, none, pungent, spicy