A survey of graph features for effective classification: fingerprints, MACCS keys, tree and cyclic patterns, frequent subgraphs, and graph fragments. Covers boosting with decision stumps and optimal subgraph mining for improved accuracy, and the Leap Search algorithm, which efficiently extracts the optimal discriminative subgraph under user-specified objective functions.
Graph Classification SEG 5010 Week 3
A Summary of Graph Features
• Fingerprint
• MACCS keys
• Tree and cyclic patterns
• Frequent subgraphs
• Graph fragments
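All of the feature families above reduce to the same idea: represent a graph as a binary vector indicating which substructures from a fixed dictionary it contains. A minimal sketch, with an illustrative (made-up) dictionary of molecular fragments and a set-membership containment test standing in for real subgraph isomorphism:

```python
# Sketch only: map a graph's substructures to a 0/1 feature vector.
# The dictionary entries and the containment test are illustrative
# assumptions, not a specific fingerprint standard.

def fingerprint(graph_patterns, dictionary):
    """Return a binary vector: 1 if the dictionary pattern occurs in the graph."""
    return [1 if p in graph_patterns else 0 for p in dictionary]

# Hypothetical dictionary of substructures (fragments, frequent subgraphs, ...)
dictionary = ["C-C", "C=O", "benzene_ring", "N-H"]

# A molecule containing two of the dictionary patterns
mol = {"C-C", "C=O"}
vec = fingerprint(mol, dictionary)  # -> [1, 1, 0, 0]
```

The feature families differ mainly in how the dictionary is chosen: fixed expert-defined keys (MACCS), hashed paths (fingerprints), or patterns mined from the data (frequent subgraphs, fragments).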
A Boosting Approach to Graph Classification (NIPS04)
• Apply boosting to graph classification
• Weak learner: decision stump
• Definition of the gain function
• Learning the best weak learner by mining the optimal subgraph
• An upper bound of the gain function enables branch-and-bound search
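The weak learner and its gain can be sketched as follows. This is a simplified illustration, not the paper's implementation: each graph is summarized as the set of candidate subgraph patterns it contains, a stump predicts h(x) = ω·(2·[t ⊆ x] − 1) for a pattern t and sign ω ∈ {−1, +1}, and its gain is Σᵢ yᵢ·dᵢ·h(xᵢ) over weighted training examples. The exhaustive loop below is what the paper replaces with optimal subgraph mining:

```python
# Sketch of a boosting-style decision stump over subgraph-presence features.
# Graphs are modeled as sets of the patterns they contain (an assumption
# made for brevity; the real algorithm mines subgraphs directly).

def stump_gain(pattern, omega, graphs, labels, weights):
    """Gain of the stump h(x) = omega * (2*[pattern in x] - 1)."""
    return sum(
        w * y * omega * (2 * (pattern in g) - 1)
        for g, y, w in zip(graphs, labels, weights)
    )

def best_stump(candidates, graphs, labels, weights):
    """Exhaustive search for the highest-gain (pattern, omega) pair;
    the NIPS'04 approach replaces this loop with branch-and-bound mining."""
    return max(
        ((t, o) for t in candidates for o in (+1, -1)),
        key=lambda to: stump_gain(to[0], to[1], graphs, labels, weights),
    )

graphs = [{"A", "B"}, {"A"}, {"B"}, {"C"}]
labels = [+1, +1, -1, -1]
weights = [0.25] * 4
t, omega = best_stump(["A", "B", "C"], graphs, labels, weights)
# pattern "A" with omega = +1 separates the data perfectly: gain = 1.0
```

At each boosting round the example weights dᵢ are updated, so a different pattern can become the best stump in the next round.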
Leap Search (SIGMOD08)
• The first study to mine the optimal subgraph under "general" user-specified objective functions
• Vertical pruning: branch-and-bound
  • An objective function may not be anti-monotone, but its upper bound can be
  • If a pattern's upper bound cannot beat the best score found so far, its entire subtree is pruned
• Horizontal pruning: structural proximity
  • If two sibling branches are similar in structure, they tend to be similar in objective score
  • There is a lot of redundancy in the graph pattern search tree
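The vertical (branch-and-bound) pruning idea can be sketched generically. This is a toy illustration under stated assumptions: patterns are enumerated in a tree where `children` extends a pattern, and `upper_bound` is anti-monotone, i.e. no descendant of a pattern can score above the pattern's bound. The toy objective (summing signed item values) is a stand-in for a real discriminative score:

```python
# Sketch of vertical pruning: branch-and-bound over a pattern tree using
# an anti-monotone upper bound. Patterns, scores, and bounds are toy
# stand-ins, not the SIGMOD'08 objective functions.

def branch_and_bound(root, score, upper_bound, children):
    best_pattern, best_score = None, float("-inf")
    stack = [root]
    while stack:
        p = stack.pop()
        if upper_bound(p) <= best_score:
            continue  # prune the entire subtree under p
        s = score(p)
        if s > best_score:
            best_pattern, best_score = p, s
        stack.extend(children(p))
    return best_pattern, best_score

# Toy pattern space: increasing tuples over items with signed values.
vals = {1: 3, 2: -1, 3: 2}
items = sorted(vals)

def children(p):
    last = p[-1] if p else 0
    return [p + (i,) for i in items if i > last]

def score(p):
    return sum(vals[i] for i in p)

def upper_bound(p):
    # score plus the best-case contribution of all items still addable:
    # anti-monotone, since extending p can only shrink the remaining set.
    last = p[-1] if p else 0
    return score(p) + sum(v for i, v in vals.items() if i > last and v > 0)

best, s = branch_and_bound((), score, upper_bound, children)
# best pattern is (1, 3) with score 5; branch (2, ...) is pruned by its bound
```

Horizontal pruning (structural proximity) goes further: it "leaps" over sibling branches whose structure, and hence likely score, is close to one already explored.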
COM (CIKM09)
• Pattern co-occurrences: for effectiveness
  • Joint discriminative power of multiple graph patterns
  • Individual subgraphs may not be discriminative, but their co-occurrences can be
• A different pattern exploration order: for efficiency
  • Complementary discriminative patterns are examined first
  • Patterns with higher scores are generated before those with lower scores
• Rule-based classifiers: a greedy generation process
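The greedy generation of a rule-based classifier can be sketched as sequential covering. This is an illustrative simplification, not COM's algorithm: a rule is a co-occurrence (set) of subgraph patterns, it fires on a graph only if all of its patterns occur, and rules are greedily chosen to cover the most still-uncovered positive examples:

```python
# Sketch of greedy rule generation over pattern co-occurrences.
# Rule semantics, pattern names, and the covering heuristic are
# illustrative assumptions.

def covers(rule, graph):
    return rule <= graph  # the rule fires iff all its patterns co-occur

def greedy_rules(candidate_rules, positives):
    """Sequential covering: repeatedly pick the rule covering the most
    uncovered positive graphs until all positives are covered."""
    uncovered = list(range(len(positives)))
    chosen = []
    while uncovered:
        best = max(candidate_rules,
                   key=lambda r: sum(covers(r, positives[i]) for i in uncovered))
        covered = [i for i in uncovered if covers(best, positives[i])]
        if not covered:
            break  # no remaining rule covers anything
        chosen.append(best)
        uncovered = [i for i in uncovered if i not in covered]
    return chosen

# "p1" and "p2" are weak alone but discriminative together, so their
# co-occurrence forms a single rule covering two positives at once.
positives = [{"p1", "p2"}, {"p1", "p2", "p3"}, {"p4"}]
rules = [frozenset({"p1", "p2"}), frozenset({"p4"}), frozenset({"p3"})]
chosen = greedy_rules(rules, positives)
```

Exploring complementary patterns first means each newly mined rule is scored against only the examples the current rule set fails to cover, which is why higher-scoring rules should be generated early.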