The Design and Analysis of Algorithms Chapter 10: Iterative Improvement Simplex Method
Iterative Improvement • Introduction • Linear Programming • The Simplex Method • Standard Form of LP Problem • Basic Feasible Solutions • Outline of the Simplex Method • Example • Notes on the Simplex Method • Improvements
Introduction Algorithm design technique for solving optimization problems • Start with a feasible solution • Repeat the following step until no improvement can be found: • change the current feasible solution to a feasible solution with a better value of the objective function • Return the last feasible solution as optimal
Introduction • Note: Typically, a change in a current solution is “small” (local search) • Major difficulty: Local optimum vs. global optimum
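To make the scheme concrete, here is a minimal sketch in Python; the names initial_solution, neighbors, and objective are illustrative placeholders supplied by the caller, not part of any specific algorithm in this chapter.
```python
# A minimal sketch of the iterative-improvement scheme above (maximization).
# `initial_solution`, `neighbors`, and `objective` are hypothetical callables;
# the "small" change is whatever `neighbors` proposes (local search).

def iterative_improvement(initial_solution, neighbors, objective):
    current = initial_solution
    while True:
        # Look for any neighboring feasible solution with a better objective value.
        improved = None
        for candidate in neighbors(current):
            if objective(candidate) > objective(current):
                improved = candidate
                break
        if improved is None:
            return current   # no improving change found: local (possibly global) optimum
        current = improved
```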
Important Examples • Simplex method • Ford-Fulkerson algorithm for maximum flow problem • Maximum matching of graph vertices • Gale-Shapley algorithm for the stable marriage problem
Linear Programming • A linear programming (LP) problem is to optimize a linear function of several variables subject to linear constraints: maximize (or minimize) c1x1 + ... + cnxn subject to ai1x1 + ... + ainxn ≤ (or ≥ or =) bi, i = 1, ..., m, x1 ≥ 0, ..., xn ≥ 0 The function z = c1x1 + ... + cnxn is called the objective function; the constraints x1 ≥ 0, ..., xn ≥ 0 are called non-negativity constraints
Example maximize 3x + 5y subject to x + y ≤ 4 x + 3y ≤ 6 x ≥ 0, y ≥ 0 Feasible region is the set of points defined by the constraints
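As a quick check of this example (separate from the simplex tableau developed below), one can feed it to a general-purpose LP solver; the sketch below assumes SciPy is available and negates the objective because scipy.optimize.linprog minimizes.
```python
# Solving the small example with SciPy's LP solver (a sanity check, not the
# simplex method developed later in this chapter).
from scipy.optimize import linprog

c = [-3, -5]                  # maximize 3x + 5y  <=>  minimize -3x - 5y
A_ub = [[1, 1], [1, 3]]       # x + y <= 4  and  x + 3y <= 6
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)        # expected optimum: x = 3, y = 1, z = 14
```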
Geometric solution maximize 3x + 5y subject to x + y ≤ 4 x + 3y ≤ 6 x ≥ 0, y ≥ 0 Extreme Point Theorem Any LP problem with a nonempty bounded feasible region has an optimal solution; moreover, an optimal solution can always be found at an extreme point of the problem's feasible region.
Possible outcomes in solving an LP problem • has a finite optimal solution, which may not be unique • unbounded: the objective function of maximization (minimization) LP problem is unbounded from above (below) on its feasible region • infeasible: there are no points satisfying all the constraints, i.e. the constraints are contradictory
The Simplex Method • The simplex method is the classic method for solving LP problems, one of the most important algorithms ever invented • Invented by George Dantzig in 1947 (Stanford University) • Based on the iterative improvement idea: Generates a sequence of adjacent points of the problem’s feasible region with improving values of the objective function until no further improvement is possible
Standard form of LP problem • must be a maximization problem • all constraints (except the non-negativity constraints) must be in the form of linear equations • all the variables must be required to be nonnegative • Thus, the general linear programming problem in standard form with m constraints and n unknowns (n ≥ m) is • maximize c1x1 + ... + cnxn • subject to ai1x1 + ... + ainxn = bi , i = 1, ..., m, • x1 ≥ 0, ... , xn ≥ 0 • Every LP problem can be represented in such a form
Example maximize 3x + 5y subject to x + y ≤ 4, x + 3y ≤ 6, x ≥ 0, y ≥ 0 becomes, in standard form, maximize 3x + 5y + 0u + 0v subject to x + y + u = 4, x + 3y + v = 6, x ≥ 0, y ≥ 0, u ≥ 0, v ≥ 0 Variables u and v, transforming inequality constraints into equality constraints, are called slack variables
Basic feasible solutions A basic solution to a system of m linear equations in n unknowns (n ≥ m) is obtained by setting n – m variables to 0 and solving the resulting system to get the values of the other m variables. The variables set to 0 are called nonbasic; the variables obtained by solving the system are called basic. A basic solution is called feasible if all its (basic) variables are nonnegative. Example x + y + u = 4 x + 3y + v = 6 (0, 0, 4, 6) is a basic feasible solution (x, y are nonbasic; u, v are basic)
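For illustration only (the simplex method never enumerates all of them), the following sketch lists every basic solution of the example system and marks which ones are feasible; the variable names mirror the slide.
```python
# Enumerate the basic solutions of  x + y + u = 4,  x + 3y + v = 6
# by choosing m = 2 basic variables out of n = 4, setting the rest to 0,
# and solving the resulting 2x2 system.
from itertools import combinations
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
names = ["x", "y", "u", "v"]
m, n = A.shape

for basic in combinations(range(n), m):        # indices of the basic variables
    B = A[:, basic]
    if abs(np.linalg.det(B)) < 1e-9:
        continue                               # singular basis: no basic solution here
    values = np.linalg.solve(B, b)
    solution = np.zeros(n)
    solution[list(basic)] = values
    status = "feasible" if np.all(values >= -1e-9) else "infeasible"
    print([names[i] for i in basic], solution, status)
```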
Simplex Tableau maximize z = 3x + 5y + 0u + 0v subject to x + y + u = 4 x + 3y + v = 6 x≥0, y≥0, u≥0, v≥0
Outline of the Simplex Method Step 0 [Initialization] Present a given LP problem in standard form and set up initial tableau. Step 1 [Optimality test] If all entries in the objective row are nonnegative — stop: the tableau represents an optimal solution. Step 2 [Find entering variable] Select (the most) negative entry in the objective row. Mark its column to indicate the entering variable and the pivot column.
Outline of the Simplex Method • Step 3 [Find departing variable] • For each positive entry in the pivot column, calculate the θ-ratio by dividing that row's entry in the rightmost column by its entry in the pivot column. • (If there are no positive entries in the pivot column — stop: the problem is unbounded.) • Find the row with the smallest θ-ratio, mark this row to indicate the departing variable and the pivot row. • Step 4 [Form the next tableau] • Divide all the entries in the pivot row by its entry in the pivot column. • Subtract from each of the other rows, including the objective row, the new pivot row multiplied by the entry in the pivot column of the row in question. • Replace the label of the pivot row by the variable's name of the pivot column and go back to Step 1.
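The steps above translate almost line by line into a compact tableau implementation. The sketch below is a minimal illustration, assuming the problem is already in standard form with the slack columns forming an identity matrix (so the initial basic feasible solution is immediate); it is not a production solver and does not guard against cycling.
```python
import numpy as np

def simplex(c, A, b):
    """Tableau simplex for: maximize c.x  subject to  A x = b, x >= 0,
    assuming the last m columns of A form an identity matrix (slack
    variables with zero cost), so the initial basic feasible solution
    is immediate."""
    m, n = A.shape
    # Step 0: build the tableau with constraint rows [A | b] and objective row [-c | 0].
    tableau = np.zeros((m + 1, n + 1))
    tableau[:m, :n] = A
    tableau[:m, -1] = b
    tableau[-1, :n] = -np.asarray(c, dtype=float)
    basis = list(range(n - m, n))            # indices of the current basic variables

    while True:
        obj = tableau[-1, :n]
        if np.all(obj >= -1e-9):             # Step 1: optimality test
            break
        pivot_col = int(np.argmin(obj))      # Step 2: entering variable (most negative entry)
        col = tableau[:m, pivot_col]
        if np.all(col <= 1e-9):
            raise ValueError("problem is unbounded")
        # Step 3: departing variable via the smallest theta-ratio.
        ratios = [tableau[i, -1] / col[i] if col[i] > 1e-9 else np.inf for i in range(m)]
        pivot_row = int(np.argmin(ratios))
        # Step 4: scale the pivot row, then eliminate the pivot column elsewhere.
        tableau[pivot_row] /= tableau[pivot_row, pivot_col]
        for i in range(m + 1):
            if i != pivot_row:
                tableau[i] -= tableau[i, pivot_col] * tableau[pivot_row]
        basis[pivot_row] = pivot_col

    x = np.zeros(n)
    for i, j in enumerate(basis):
        x[j] = tableau[i, -1]
    return x, tableau[-1, -1]                # optimal solution and objective value
```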
Example of Simplex Method maximize z = 3x + 5y + 0u + 0v subject to x + y + u = 4 x + 3y + v = 6 x≥0, y≥0, u≥0, v≥0 basic feasible sol. (0, 0, 4, 6), z = 0 → basic feasible sol. (0, 2, 2, 0), z = 10 → basic feasible sol. (3, 1, 0, 0), z = 14
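Assuming the simplex sketch from the previous slide is in scope, running it on this example reproduces the final basic feasible solution (3, 1, 0, 0) with z = 14.
```python
import numpy as np

c = [3, 5, 0, 0]                          # objective coefficients (slacks cost 0)
A = np.array([[1.0, 1.0, 1.0, 0.0],       # x + y + u = 4
              [1.0, 3.0, 0.0, 1.0]])      # x + 3y + v = 6
b = np.array([4.0, 6.0])

x, z = simplex(c, A, b)
print(x, z)                               # expected: [3. 1. 0. 0.] 14.0
```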
Notes on the Simplex Method • Finding an initial basic feasible solution may pose a problem • Theoretical possibility of cycling • Typical number of iterations is between m and 3m, where m is the number of equality constraints in the standard form. Number of operations per iteration: O(nm) • Worst-case efficiency is exponential
Improvements • L. G. Khachian introduced the ellipsoid method (1979), which seemed to overcome some of the simplex method's limitations; it runs in O(n^6) time. Disadvantage: it runs with the same complexity on all problems • Narendra K. Karmarkar of AT&T Bell Laboratories proposed in 1984 a new, very efficient interior-point algorithm running in O(n^3.5) time. In empirical tests it performs competitively with the simplex method.
Definitions • Matching: a set of edges such that no two edges share a vertex • Free Vertex: a vertex not incident to any edge of the matching
Definitions • Maximum Matching: matching with the largest number of edges
Definition • Note that a maximum matching is not necessarily unique.
Intuition • Let the top set of vertices be men • Let the bottom set of vertices be women • Suppose each edge represents a pair of man and woman who like each other • Maximum matching tries to maximize the number of couples!
Alternating Path • A path that alternates between matching and non-matching edges. Examples (vertex labels a, b, c, d, e and f, g, h, i, j refer to the two sides of the bipartite graph in the figure): d-h-e: alternating path a-f-b-h-d-i: alternating path that starts and ends with free vertices (an augmenting path) f-b-h-e: not an alternating path e-j: alternating path that starts and ends with free vertices (an augmenting path)
Idea • “Flip” augmenting path to get better matching • Note: After flipping, the number of matched edges will increase by 1!
Idea • Theorem (Berge 1957): A matching M in G is maximum iff there is no augmenting path with respect to M Proof: • (⇒) If there is an augmenting path, M is clearly not maximum. (Flip matching and non-matching edges in that path to get a “better” matching!)
Idea of Algorithm • Start with an arbitrary matching • While we still can find an augmenting path • Find the augmenting path P • Flip the edges in P
Labelling Algorithm • Start with arbitrary matching
Labelling Algorithm • Pick a free vertex in the bottom
Labelling Algorithm • Run BFS
Labelling Algorithm • Alternate unmatched/matched edges
Labelling Algorithm • Until an augmenting path is found
Repeat • Pick another free vertex in the bottom
Repeat • Run BFS
Repeat • Flip
Answer • Since we cannot find any augmenting path, stop!
Overall algorithm • Start with an arbitrary matching (e.g., empty matching) • Repeat forever • For all free vertices in the bottom, • do BFS to find augmenting paths • If found, then flip the edges • If none can be found, stop and report the maximum matching.
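The sketch below is one possible implementation of this overall algorithm, using BFS to search for an augmenting path from each free bottom vertex; the adjacency-list format and the function names are illustrative choices, not fixed by the slides.
```python
from collections import deque

def max_bipartite_matching(adj, n_left, n_right):
    """adj[u] lists the top ("right") vertices adjacent to bottom ("left") vertex u.
    Returns match_left, where match_left[u] is u's partner or None."""
    match_left = [None] * n_left
    match_right = [None] * n_right

    def augment(start):
        # BFS over alternating paths from the free bottom vertex `start`.
        parent = {}                      # top vertex -> bottom vertex it was reached from
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in parent:
                    continue
                parent[v] = u
                if match_right[v] is None:
                    # Free top vertex reached: flip the augmenting path back to `start`.
                    while v is not None:
                        u = parent[v]
                        prev = match_left[u]          # matched edge that gets flipped out
                        match_left[u], match_right[v] = v, u
                        v = prev
                    return True
                queue.append(match_right[v])          # continue along the matched edge
        return False

    for u in range(n_left):
        if match_left[u] is None:
            augment(u)
    return match_left

# Tiny example: bottom vertices 0..3 with the given neighbours on top.
adj = [[0, 1], [0], [1, 2], [2, 3]]
print(max_bipartite_matching(adj, 4, 4))   # e.g. [1, 0, 2, 3] (a perfect matching)
```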
Time analysis • We can find at most |V| augmenting paths • To find an augmenting path, we use BFS; time required = O(|V| + |E|) • Total time: O(|V|^2 + |V| |E|)
Improvement • We can try to find augmenting paths in parallel for all free nodes in every iteration. • Using such an approach (the Hopcroft-Karp algorithm), the time complexity is improved to O(|V|^0.5 |E|)
The Stable Marriage Problem • Consider the following information: • a collection of eligible men, each with a rank-ordered list of the women as prospective wives • a collection of eligible women, each with a rank-ordered list of the men as prospective husbands • Assume that: • the numbers of men and women are equal • all men rank all women, and vice versa • rank-ordered lists are in decreasing order of preference
The Stable Marriage Problem (cont’d) • We define a marriage arrangement to be a pairing of the men and women such that: • each man is paired with one woman and • every person appears in exactly one pair. • We say that a marriage arrangement is stable if there does not exist a woman X and a man Y such that: • X prefers Y over X's husband, and • Y prefers X over Y's wife. • Note: If such a pair existed, they would want to run off together, creating an unstable situation.
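The definition of stability translates directly into a check for such a "blocking pair"; the sketch below assumes a hypothetical data layout with dictionaries of partners and preference ranks (smaller rank = more preferred), which is not prescribed by the slides.
```python
def is_stable(wife_of, husband_of, man_rank, woman_rank):
    """wife_of[y] is man y's wife, husband_of[x] is woman x's husband;
    man_rank[y][x] is how man y ranks woman x (0 = most preferred),
    woman_rank[x][y] is defined analogously."""
    for y, y_wife in wife_of.items():              # every man Y and his wife
        for x in woman_rank:                       # every woman X
            if x == y_wife:
                continue
            y_prefers_x = man_rank[y][x] < man_rank[y][y_wife]
            x_prefers_y = woman_rank[x][y] < woman_rank[x][husband_of[x]]
            if y_prefers_x and x_prefers_y:
                return False                       # (X, Y) form a blocking pair
    return True
```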
The Stable Marriage Problem (cont’d) • The problem is: • “Given the preference rankings of the men and women, construct a stable marriage arrangement.”
The Stable Marriage Problem • The method for solving this problem works something like this: • The first man will propose to the first woman on his list. • She, having no better offer at this point, will accept. • The second man will then propose to his first choice, and so on. • Eventually it may happen that a man proposes to a woman who already has a partner.
The Stable Marriage Problem (cont’d) • She will compare the new offer to her current partner and will accept whoever is higher on her list. • The man she rejects will then go back to his list and propose to his second choice, third choice, and so forth until he comes to a woman who accepts his offer. • If this woman already had a partner, her old partner gets rejected and he in turn starts proposing to women further down his list. • Eventually everything gets sorted out.
The Stable Marriage Problem (cont’d) • Algorithm: • Each person starts with no people "canceled" from his or her list. • People will be canceled from lists as the algorithm progresses. • For each man m, do propose(m), as defined below.
The Stable Marriage Problem (cont’d) • propose(m): • Let W be the first uncanceled woman on m's preference list. • Do: refuse(W, m), as defined below. • refuse(W, m): • Let m' be W's current partner (if any). • If W prefers m' to m, then she rejects m, in which case: • cancel m off W's list and W off m's list; • do: propose(m). (Now m must propose to someone else.) • Otherwise, W accepts m as her new partner, in which case: • cancel m' off W's list and W off the list of m'; • do: propose(m'). (Now m' must propose to someone else.)
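An iterative rendering of this propose/refuse procedure is the classic man-proposing Gale-Shapley algorithm; the sketch below is a minimal version with dictionary-based preference lists as an assumed input format, where "canceling" is replaced by simply advancing each man's pointer down his list.
```python
from collections import deque

def stable_marriage(men_prefs, women_prefs):
    """Man-proposing Gale-Shapley. men_prefs[m] lists the women in m's
    decreasing order of preference; women_prefs[w] likewise for the men.
    Returns a dictionary mapping each man to his wife."""
    # Rank table so a woman can compare two suitors in O(1).
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}   # next woman on m's (uncanceled) list
    partner = {}                              # woman -> current partner
    free_men = deque(men_prefs)

    while free_men:
        m = free_men.popleft()
        w = men_prefs[m][next_choice[m]]      # propose(m): first uncanceled woman
        next_choice[m] += 1
        if w not in partner:
            partner[w] = m                    # she has no better offer: accept
        elif rank[w][m] < rank[w][partner[w]]:
            free_men.append(partner[w])       # refuse: her old partner is rejected
            partner[w] = m
        else:
            free_men.append(m)                # refuse: m must propose to someone else
    return {m: w for w, m in partner.items()}

# Tiny example with two men and two women.
men = {"A": ["X", "Y"], "B": ["Y", "X"]}
women = {"X": ["B", "A"], "Y": ["A", "B"]}
print(stable_marriage(men, women))   # -> {'A': 'X', 'B': 'Y'}
```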