
Presented by Peter Duval


Presentation Transcript


  1. Rule Induction with Extension Matrices. Dr. Xindong Wu. Journal of the American Society for Information Science, Vol. 49, No. 5, 1998. Presented by Peter Duval. (The title slide diagrams the method's vocabulary: extension matrix (EM), extension matrix disjunction (EMD), negative example matrix (NEM), positive and negative examples (PE, NE); inevitable, redundant, and eliminable selectors; the HFL strategies S1 Fast, S2 Precedence, S3 Elimination, and S4 Least Frequency; and the HCV terms intersecting group, path, MFL, and MCV.)

  2. Context • HFL/HCV offers a rule-induction alternative to decision trees. • HCV can serve as a benchmark for rule-induction methods.

  3. Context • This paper condenses Dr. Wu’s Ph.D. dissertation on the extension matrix approach and the HFL/HCV algorithm. • See the University of Illinois work of J. Wong and R.S. Michalski for the research leading to HFL/HCV. • HFL/HCV appears to be underrepresented in literature citations.

  4. Overview • Represent the negative training data as row vectors in a matrix. • Process positive examples as they arrive, eliminating attributes of the negative examples that carry no discriminating information. • Read conjunctive rules from the resulting matrix. • Simplify and clean up the rules.

  5. Positive and Negative Examples • A positive example (PE) is an attribute-value vector belonging to the target class. • A negative example (NE) is an attribute-value vector from outside the target class.

  6. Negative example matrix (NEM) Gather the negative examples as row vectors in the NEM:

  7. Positive Example (PE) Against NEM A positive example written as a row vector:

  8. Extension Matrices • Replace every NEM element that matches the corresponding attribute of the positive example with a dead element.

  9. Extension Matrices • We construct one Extension Matrix per Positive Example
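The construction above can be sketched in a few lines of Python. The NEM values and the positive example below are illustrative stand-ins, not data from the paper:

```python
DEAD = "*"  # dead element marker

# Hypothetical negative example matrix: one negative example per row.
NEM = [
    [1, 2, 1],
    [2, 1, 1],
    [2, 2, 2],
]

def extension_matrix(nem, pe):
    """Build one extension matrix for positive example pe: any NEM
    element equal to the corresponding attribute of pe cannot tell the
    two examples apart, so it becomes a dead element."""
    return [[DEAD if v == pe[j] else v for j, v in enumerate(row)]
            for row in nem]

em1 = extension_matrix(NEM, [2, 2, 1])
# em1 == [[1, '*', '*'], ['*', 1, '*'], ['*', '*', 2]]
```

Each surviving (non-dead) element records a way the negative example differs from the positive one.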

  10. Extension Matrices Let’s make a second extension matrix:

  11. Extension Matrices The second extension matrix:

  12. Extension Matrices Finally let’s make a third extension matrix:

  13. Extension Matrices The third extension matrix:

  14. Dead Elements • Dead Elements, *, take the place of attributes that fail to distinguish the negative example from the corresponding positive example.

  15. Matrix Disjunction (EMD) • If any of the extension matrices has a dead element at a given position, the EMD has a dead element there too: “OR” the dead elements.

  16. Partitions • Whenever adding an extension matrix would create an all-dead row, start a new EMD; each EMD covers one partition of the positive examples.

  17. Matrix Disjunction (EMD) • Let’s construct the EMD using just the first two extension matrices.

  18. Matrix Disjunction (EMD) • The EMD has dramatically reduced the amount of superfluous information.
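A minimal sketch of the disjunction and the dead-row partition rule, continuing the hypothetical three-attribute example (the matrices below are illustrative, not the paper's data):

```python
DEAD = "*"

def disjunction(ems):
    """EMD of same-shaped extension matrices: a cell is dead if it is
    dead in ANY matrix ("OR" the dead elements); otherwise every matrix
    holds the same negative-example value, which is kept."""
    return [[DEAD if any(r[j] == DEAD for r in rows) else rows[0][j]
             for j in range(len(rows[0]))]
            for rows in zip(*ems)]

def has_dead_row(m):
    return any(all(v == DEAD for v in row) for row in m)

def partition_emd(ems):
    """Greedily group extension matrices, starting a new EMD whenever
    adding the next matrix would create an all-dead row."""
    groups = []
    for em in ems:
        for g in groups:
            if not has_dead_row(disjunction(g + [em])):
                g.append(em)
                break
        else:
            groups.append([em])
    return [disjunction(g) for g in groups]

# Extension matrices for three hypothetical positive examples.
em1 = [[1, "*", "*"], ["*", 1, "*"], ["*", "*", 2]]
em2 = [[1, "*", "*"], [2, 1, "*"], [2, "*", 2]]
em3 = [["*", 2, 1], [2, "*", 1], [2, 2, "*"]]

# em1 and em2 combine safely; em3 would kill row 0, so it starts a
# second partition.
emds = partition_emd([em1, em2, em3])
# emds == [em1, em3]  (disjunction of em1 and em2 happens to equal em1)
```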

  19. Paths • Choose one non-dead element from each row. This is called a path. • We can create paths in EMs and EMDs.

  20. Path Cover ≡ Conjunctive Formula • The path corresponds to a conjunctive formula expressed in variable-valued logic.

  21. Path = Cover ≡ Conjunctive Formula
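In code, reading a rule off a path might look like the sketch below. The EMD is the hypothetical running example; each row happens to have a single non-dead element, so the path is forced:

```python
DEAD = "*"

# Hypothetical EMD: one non-dead element survives in each row.
emd = [[1, DEAD, DEAD], [DEAD, 1, DEAD], [DEAD, DEAD, 2]]

def path_to_rule(emd, path):
    """path[i] names the column picked for row i (its element must be
    non-dead).  Each pick (column j, value v) contributes the selector
    [Xj+1 != v]; the conjunction of the selectors excludes every
    negative example."""
    selectors = sorted({(j, emd[i][j]) for i, j in enumerate(path)})
    return " ^ ".join(f"[X{j + 1} != {v}]" for j, v in selectors)

print(path_to_rule(emd, [0, 1, 2]))
# [X1 != 1] ^ [X2 != 1] ^ [X3 != 2]
```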

  22. HFL Wu developed HFL to find good rules. The algorithm applies four strategies to find a compact disjunction of conjunctions: • Fast • Precedence • Elimination • Least Frequency

  23. HFL Strategies: Fast X3≠1 covers all negative examples. X3≠1 => positive class. We can stop processing.

  24. HFL Strategies: Precedence • [X1≠1] and [X3≠1] are inevitable selectors. • Record conjunction and label the rows as covered. • Below, a path is formed. All rows are covered. We are done.

  25. HFL Strategies: Elimination • Redundant selectors in attribute X2 can be eliminated because non-dead X3 values cover all of the rows covered by X2. • All elements in column X2 become dead elements.

  26. HFL Strategies: Least Frequency • Attribute X1 selectors are least frequent and can be eliminated. • The other strategies must be applied before Least Frequency is applied again.
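The four strategies can be combined into a greedy loop. The sketch below is much simplified, omits the paper's tie-breaking and bookkeeping details, and runs on hypothetical data; selectors come back as (column, value) pairs, read as the conjunction of [Xcolumn+1 != value]:

```python
DEAD = "*"

def hfl(emd):
    """Much-simplified sketch of HFL on a single EMD."""
    rows = set(range(len(emd)))      # rows still to cover
    cols = set(range(len(emd[0])))   # attributes still in play
    chosen = []
    while rows:
        # Coverage of each selector (col, value) over the remaining rows.
        cov = {}
        for i in rows:
            for j in cols:
                if emd[i][j] != DEAD:
                    cov.setdefault((j, emd[i][j]), set()).add(i)
        # S1 Fast: one selector covers every remaining row -- stop.
        fast = [s for s, covered in cov.items() if covered == rows]
        if fast:
            return chosen + [fast[0]]
        # S2 Precedence: a row with a single non-dead element makes that
        # selector inevitable.
        forced = None
        for i in sorted(rows):
            live = [j for j in cols if emd[i][j] != DEAD]
            if len(live) == 1:
                forced = (live[0], emd[i][live[0]])
                break
        if forced:
            chosen.append(forced)
            rows -= cov[forced]
            continue
        # S3 Elimination: a column whose non-dead rows are a subset of
        # another column's carries only redundant selectors.
        colcov = {j: {i for i in rows if emd[i][j] != DEAD} for j in cols}
        redundant = next((a for a in cols for b in cols
                          if a != b and colcov[a] <= colcov[b]), None)
        if redundant is not None:
            cols.remove(redundant)
            continue
        # S4 Least Frequency: drop the attribute covering the fewest
        # remaining rows, then retry the other strategies.
        cols.remove(min(cols, key=lambda j: len(colcov[j])))
    return chosen

# Fast fires immediately: [X3 != 2] covers every row.
print(hfl([[DEAD, DEAD, 2], [1, DEAD, 2], [DEAD, 3, 2]]))
# [(2, 2)]

# Precedence picks the single live element of each row in turn.
print(hfl([[1, DEAD, DEAD], [DEAD, 1, DEAD], [DEAD, DEAD, 2]]))
# [(0, 1), (1, 1), (2, 2)]
```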

  27. HCV Algorithm • HCV improves on HFL: • Partition the positive examples into intersecting groups. • Apply HFL to each partition. • OR the conjunctive formulae from the partitions. Well described in: http://www.cs.uvm.edu/~xwu/Publication/JASIS.ps See Wu’s 1993 Ph.D. dissertation for more background: http://www.era.lib.ed.ac.uk/bitstream/1842/581/3/1993-xindongw.pdf
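Putting the pieces together, an end-to-end sketch of the HCV outline above. All data are illustrative, and the trivial first-non-dead-element rule finder is a stand-in for the real HFL step:

```python
DEAD = "*"

def extension_matrix(nem, pe):
    """One extension matrix per positive example: matching elements die."""
    return [[DEAD if v == pe[j] else v for j, v in enumerate(row)]
            for row in nem]

def disjunction(ems):
    """"OR" the dead elements of same-shaped extension matrices."""
    return [[DEAD if any(r[j] == DEAD for r in rows) else rows[0][j]
             for j in range(len(rows[0]))]
            for rows in zip(*ems)]

def has_dead_row(m):
    return any(all(v == DEAD for v in row) for row in m)

def hcv(nem, positives, find_rule):
    """Sketch of HCV: build one EM per positive example, greedily group
    them so no group's EMD has an all-dead row, turn each EMD into a
    conjunction with find_rule, and OR the results."""
    groups = []
    for pe in positives:
        em = extension_matrix(nem, pe)
        for g in groups:
            if not has_dead_row(disjunction(g + [em])):
                g.append(em)
                break
        else:
            groups.append([em])
    return [find_rule(disjunction(g)) for g in groups]

def first_path(emd):
    """Toy rule finder: pick the first non-dead element of each row
    (a stand-in for HFL)."""
    picks = set()
    for row in emd:
        j = next(k for k, v in enumerate(row) if v != DEAD)
        picks.add((j, row[j]))
    return sorted(picks)

NEM = [[1, 2, 1], [2, 1, 1], [2, 2, 2]]   # hypothetical negatives
POS = [[2, 2, 1], [3, 2, 1], [1, 1, 2]]   # hypothetical positives
rules = hcv(NEM, POS, first_path)
# rules == [[(0, 1), (1, 1), (2, 2)], [(0, 2), (1, 2)]]
```

The third positive example would create an all-dead row, so it lands in its own partition; the final concept description is the OR of the two conjunctions.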

  28. HCV Software • Features many refinements and switches. • Works with C4.5 data. • Can be run through a web interface: HCV Online Interface. • Is described in Appendix A of Wu’s textbook, and online: HCV Manual.

  29. Golf
Rules for the 'Play' class (Covering 3 examples):
The 1st conjunctive rule:
[ temperature != { cool } ] ^ [ outlook != { sunny } ] --> the 'Play' class
(Positive examples covered: 3)
Rules for the 'Don't_Play' class (Covering 4 examples):
The 2nd conjunctive rule:
[ outlook != { overcast } ] ^ [ wind = { windy } ] --> the 'Don't_Play' class
(Positive examples covered: 4)
The total number of conjunctive rules is: 2
The default class is: 'Don't_Play' (Examples in class: 4)
Time taken for induction (seconds): 0.0 (real), 0.0 (user), 0.0 (system)
Rule file or preprocessed test file not found. Skipping deduction

  30. HCV • HCV is competitive with other decision-tree and rule-producing algorithms. • HCV generally produces more compact rules. • HCV outputs variable-valued logic. • HCV handles noise and discretization. • HCV guarantees a “conjunctive rule for a concept”.

  31. Ideas • Can HFL/HCV be applied to chess? Bratko did this with ID3. [Crevier 1993, 177] • How can HCV be parallelized? • How does the extension matrix approach work in closed-world situations? • Is HCV 2.0 a good candidate for automated parameter tuning by genetic algorithm or other evolutionary technique?

  32. The End. • Presentation based on slides by Leslie Damon. • Questions?

  33. Exam Questions • Definitions: Extension Matrix: a matrix of negative examples as row vectors in which, for a given positive example, elements that match the positive example are replaced with dead elements, denoted ‘*’. Dead Element: an element of a negative example that cannot be used to distinguish a given positive example from the negative example. Path: a set of non-dead elements, one from each row of an extension matrix.

  34. Exam Questions • Four stages of HFL: • Fast: A single attribute value that covers all rows. • Precedence: Favor selectors that are the only non-dead element of a row. • Elimination: Get rid of redundant selectors. • Least Frequency: Get rid of the column whose non-dead values cover the fewest rows. See slides labeled “HFL Strategies”.

  35. Exam Questions • The Pneumonia/Tuberculosis problem is worked through in the paper and Leslie Damon’s slides. Here is the EMD:
