Using Pattern Recognition Techniques to Derive a Formal Analysis of Why Heuristic Functions Work
B. John Oommen, a joint work with Luis G. Rueda
School of Computer Science, Carleton University
Optimization Problems
• Any arbitrary optimization problem:
• Instances, drawn from a finite set X,
• An objective function,
• Some feasibility functions.
• The aim:
• Find an instance of X (hopefully the unique one) which maximizes (or minimizes) the objective function, subject to the feasibility constraints.
An Example
• The Traveling Salesman Problem (TSP):
• Consider the cities numbered from 1 to n,
• The salesman starts from city 1,
• visits every city exactly once, and
• returns to city 1.
• An instance of X is a permutation of the cities:
• For example, 1 4 3 2 5, if five cities are considered.
• The objective function:
• The sum of the inter-city distances: 1→4, 4→3, 3→2, 2→5, 5→1.
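To make the objective function concrete, here is a minimal Python sketch (my own illustration, not from the slides) that evaluates the tour length of a permutation such as 1 4 3 2 5; the distance matrix values are invented purely for the example.

```python
# Minimal sketch: evaluating the TSP objective for one instance (a permutation).
# The distance matrix below is made up for illustration only.

def tour_length(perm, dist):
    """Sum of inter-city distances along the tour, returning to the start."""
    total = 0.0
    for i in range(len(perm)):
        a = perm[i]
        b = perm[(i + 1) % len(perm)]   # wrap around: last city back to the first
        total += dist[a][b]
    return total

# 5 cities, 0-indexed here (the slide's tour "1 4 3 2 5" becomes [0, 3, 2, 1, 4]).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
print(tour_length([0, 3, 2, 1, 4], dist))   # cost of the tour 1-4-3-2-5-1
```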
Heuristic Functions
• A heuristic algorithm is an algorithm which attempts to find an instance of X that maximizes the objective function.
• It iteratively invokes a heuristic function.
• The heuristic function estimates (or measures) the cost of the solution.
• The heuristic itself is a method that performs one or more changes to the current instance.
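As an illustration of this iterative structure (a sketch, not the authors' algorithm), a simple hill-climbing heuristic for the TSP repeatedly applies one local change, swapping two cities, and keeps the change whenever the heuristic function improves. It reuses tour_length and dist from the previous sketch; here the "heuristic function" is simply the true tour length.

```python
import random

def hill_climb(dist, n_iters=1000, seed=0):
    """Toy heuristic algorithm: repeatedly swap two cities and keep improvements.

    The heuristic function here is just tour_length(); in general it could be
    any estimate of the solution cost.
    """
    rng = random.Random(seed)
    n = len(dist)
    current = list(range(n))
    rng.shuffle(current)
    best_cost = tour_length(current, dist)
    for _ in range(n_iters):
        i, j = rng.sample(range(n), 2)           # one local change: swap two cities
        current[i], current[j] = current[j], current[i]
        cost = tour_length(current, dist)
        if cost < best_cost:                     # keep the change if it improves
            best_cost = cost
        else:                                    # otherwise undo it
            current[i], current[j] = current[j], current[i]
    return current, best_cost

print(hill_climb(dist))
```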
An Open Problem
• Consider a heuristic algorithm that invokes either of two heuristic functions, H1 and H2,
• used in estimating the solution to an optimization problem.
• If the estimation accuracy of H1 > the estimation accuracy of H2,
• does it imply that H1 has a higher probability of leading to the optimal solution (e.g., the optimal query evaluation plan, QEP)?
Pattern Recognition Modeling
• Two heuristic functions: H1 and H2.
• The probability of choosing a cost value of a solution: two independent random variables, X1 and X2.
• Distribution: doubly exponential, f_{X_i}(x) = (λ_i/2) e^{−λ_i |x − c|}, where λ_i > 0, and c is the mean cost.
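As an illustration of this model (a sketch with made-up parameter values, not the authors' code), the doubly exponential density can be evaluated and sampled directly; λ1 > λ2 below encodes the assumption that X1 is the sharper, lower-variance estimate.

```python
import numpy as np

def laplace_pdf(x, lam, c=0.0):
    """Doubly exponential (Laplace) density: f(x) = (lam/2) * exp(-lam * |x - c|)."""
    return 0.5 * lam * np.exp(-lam * np.abs(x - c))

# Illustrative parameters only: lam1 > lam2 means X1 is the more accurate estimate;
# the variance of a doubly exponential variable with rate lam is 2 / lam**2.
lam1, lam2 = 2.0, 0.5
rng = np.random.default_rng(0)
x1 = rng.laplace(loc=0.0, scale=1.0 / lam1, size=100_000)   # numpy's scale = 1/lam
x2 = rng.laplace(loc=0.0, scale=1.0 / lam2, size=100_000)
print(laplace_pdf(0.0, lam1), laplace_pdf(0.0, lam2))        # peak heights lam/2
print(x1.var(), 2 / lam1**2)                                 # empirical vs. theoretical variance
print(x2.var(), 2 / lam2**2)
```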
Pattern Recognition Modeling
• Our model: the error function is doubly exponential.
• Typical in reliability analysis and failure models.
• How reliable is a solution when only an estimate is known?
• Assumptions:
• Mean cost of the optimal solution: shift the origin so that E[X] = 0.
• Variances: the estimate X1 is better (lower variance) than the estimate X2.
Main Result (Exponential)
• H1 and H2: two heuristic functions.
• X1 and X2: two r.v.'s for the cost of the optimal solution as obtained by H1 and H2.
• X1' and X2': two further r.v.'s for a sub-optimal solution.
• Let p1 and p2 be the probabilities that H1 and H2, respectively, make the wrong decision.
• It is shown that if H1 is the more accurate estimator (λ1 ≥ λ2), then p1 ≤ p2.
Proof (Graphical Sketch)
• For a particular x, the probability that x leads to a wrong decision by H1 is given by:
[Figure: densities of X1 (opt), X1 (subopt), X2 (opt), X2 (subopt)]
Proof (Cont'd)
• ... or, if x < c:
[Figure: the same four densities, X1 (opt), X1 (subopt), X2 (opt), X2 (subopt)]
Proof (Cont'd)
• The total probability that H1 makes the wrong decision, over all values of x, is obtained by integrating the above over x.
• Similarly for the probability that H2 makes the wrong decision over all values of x.
Proof (Cont'd)
• Solving the integrals and comparing p1 with p2, we obtain an inequality which, using ln x ≤ x − 1, implies that p1 ≤ p2. QED
• where α1 = λ1c and α2 = λ2c,
• and λ2 is substituted by kλ1.
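The inequality p1 ≤ p2 can also be checked by simulation. The sketch below is my own illustration (not from the slides): it assumes each heuristic compares a doubly exponential estimate of the optimal cost (centered at 0) with a doubly exponential estimate of a sub-optimal cost (centered at c > 0, same rate), and counts how often the sub-optimal one looks better. The parameter values are arbitrary; the only modeling assumption carried over is λ1 > λ2.

```python
import numpy as np

def wrong_decision_prob(lam, c=1.0, n=1_000_000, seed=0):
    """Estimate P(sub-optimal estimate looks better than the optimal estimate)
    when both cost estimates are doubly exponential with the same rate lam,
    the optimal cost centered at 0 and the sub-optimal cost at c > 0."""
    rng = np.random.default_rng(seed)
    x_opt = rng.laplace(loc=0.0, scale=1.0 / lam, size=n)
    x_sub = rng.laplace(loc=c, scale=1.0 / lam, size=n)
    return np.mean(x_sub < x_opt)   # heuristic wrongly prefers the sub-optimal solution

# Illustrative parameters: H1 is the more accurate estimator (larger rate).
lam1, lam2 = 2.0, 0.5
p1 = wrong_decision_prob(lam1)
p2 = wrong_decision_prob(lam2)
print(p1, p2, p1 <= p2)   # expected to print ... True
```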
Second Theorem
• F(α1, k) can also be written in terms of α1 and k.
• Suppose that α1 ≥ 0 and 0 ≤ k ≤ 1;
• then G(α1, k) ≥ 0, and
• there are two solutions for G(α1, k) = 0.
• Proof: by taking partial derivatives and solving.
Graphical Analysis (Histograms)
• R-ACM vs. Eq-width, R-ACM vs. Eq-depth, T-ACM vs. Eq-width, T-ACM vs. Eq-depth: G >> 0, i.e., p1 << p2.
• R-ACM vs. T-ACM, Eq-width vs. Eq-depth: G ≈ 0, i.e., p1 ≈ p2.
• Minimum at α1 = 0 and 0 ≤ k ≤ 1.
[Figure: plots of G(α1, k) for each pair of histogram methods]
Analysis: Normal Distributions
• No closed-form integration is possible for the normal pdf.
• It is shown numerically that p1 ≤ p2.
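For intuition, the same comparison can be illustrated for normal cost estimates (again a sketch with invented parameters, not the authors' numerical procedure). Under the simplifying assumption that a wrong decision means "the sub-optimal estimate looks better than the optimal one" and both estimates of a heuristic share a standard deviation, this probability has a closed form via the normal CDF; this is a simplified quantity, not the integral analyzed on the slide, which has no closed form.

```python
from math import sqrt, erf

def normal_wrong_decision_prob(sigma, c=1.0):
    """P(sub-optimal estimate < optimal estimate) when both estimates are normal
    with standard deviation sigma, means 0 (optimal) and c (sub-optimal).
    The difference is N(c, 2*sigma^2), so the probability is Phi(-c / (sigma*sqrt(2)))."""
    z = -c / (sigma * sqrt(2))
    return 0.5 * (1.0 + erf(z / sqrt(2)))   # standard normal CDF at z

sigma1, sigma2 = 0.5, 2.0    # H1's estimate has the smaller variance
print(normal_wrong_decision_prob(sigma1), normal_wrong_decision_prob(sigma2))
```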
λ Estimation for Histograms
• λ is estimated from the data, where N is the # of samples.
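The slide's specific estimator is not reproduced here; as a generic sketch, the maximum-likelihood estimate of the rate of a doubly exponential sample is the reciprocal of the mean absolute deviation from the sample median. The histogram-specific estimator used in the original work may differ.

```python
import numpy as np

def estimate_lambda(samples):
    """Generic MLE for the rate of a doubly exponential (Laplace) sample:
    lambda_hat = N / sum(|x_i - median|). The histogram-specific estimator
    on the original slide may differ from this."""
    samples = np.asarray(samples, dtype=float)
    mu_hat = np.median(samples)
    return len(samples) / np.sum(np.abs(samples - mu_hat))

rng = np.random.default_rng(1)
true_lam = 1.5
data = rng.laplace(loc=0.0, scale=1.0 / true_lam, size=10_000)
print(estimate_lambda(data))   # should be close to 1.5
```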
Similarities of R-ACM and d-Exp
[Figure: distribution estimated for the R-ACM vs. the true doubly exponential (d-Exp) distribution]
Simulation Details
• Simulations performed in query optimization:
• 4 independent runs per simulation,
• 100 random databases per run, i.e., 400 per simulation,
• 6 relations,
• 6 attributes per relation,
• 100 tuples per relation.
• Four independent runs on 100 databases: R-ACM vs. traditional histograms, using 11 bins and 50 values.
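To picture the experimental setup, here is a purely hypothetical sketch of the generation loop, following only the counts stated above (4 runs, 100 databases per run, 6 relations, 6 attributes, 100 tuples, 50 distinct values); the histogram construction and QEP comparison are left as a stub because those details are not given on the slide.

```python
import random

def random_database(n_relations=6, n_attributes=6, n_tuples=100, rng=None):
    """One random database: each relation is a list of tuples of attribute values.
    The value range (50 distinct values) matches the slide; everything else is arbitrary."""
    rng = rng or random.Random()
    return [
        [tuple(rng.randint(1, 50) for _ in range(n_attributes)) for _ in range(n_tuples)]
        for _ in range(n_relations)
    ]

def compare_histograms(db):
    """Placeholder: build R-ACM, equi-width and equi-depth histograms (11 bins) for db,
    run the query optimizer with each, and report which yields the better QEP.
    The actual procedure is not specified on the slide."""
    raise NotImplementedError

# Experimental loop: 4 independent runs of 100 random databases each (400 total).
for run in range(4):
    rng = random.Random(run)
    for _ in range(100):
        db = random_database(rng=rng)
        # compare_histograms(db)   # would tally the winners per method
```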
Empirical Results
[Chart: # of times in which R-ACM, Eq-width, and Eq-depth, respectively, yield the better QEP]
Conclusions
• Applied PR techniques to solve the problem of relating heuristic-function accuracy and solution optimality.
• Used a reasonable model of accuracy (the doubly exponential distribution).
• Shown analytically how a more accurate heuristic function leads to superior solutions.
• Shown the results numerically for normal distributions.
• Shown that R-ACM yields better QEPs more often than Equi-width and Equi-depth.
• Empirical results on randomly generated databases also show the superiority of R-ACM.
• Graphically demonstrated the validity of our model.