
Minimum cost flows: Introduction and basic algorithms


Presentation Transcript


  1. Minimum cost flows: Introduction and basic algorithms
  Ben Klein – 11/03/13
  Outline: Introduction, Applications, Optimality Conditions, Primal Dual in LP, Algorithms

  2. Max flow – A reminder
  • The graph: G=(V,E) is a directed graph.
  • Capacity u(v,w) > 0 for every edge (v,w) ∈ E.
  • Two distinguished vertices s and t.
  [Figure: an example network with source s, sink t, intermediate nodes u, v, w, z and capacities on the edges.]

  3. Max flow – A reminder
  • The flow: a flow f is a function on the edges.
  • A feasible flow is a flow that satisfies: 0 ≤ f(v,w) ≤ u(v,w) for every edge (v,w) ∈ E, and Σ_w f(v,w) − Σ_w f(w,v) = 0 for every node v other than s and t.
  • The value of a flow f is |f| = Σ_w f(s,w) − Σ_w f(w,s).
  • The max flow problem is to find a feasible flow f with maximum value.

  4. Minimum Cost Flows - Definition
  • The graph: G=(V,E) is a directed graph.
  • Capacity u(v,w) for every edge (v,w) ∈ E.
  • Balances: for every node v ∈ V we have a number b(v); b(v) > 0 means v is a supply node, b(v) < 0 a demand node.
  • Cost c(v,w) for every edge (v,w) ∈ E.
  [Figure: an example network on nodes v1, ..., v5 with balances 5, 0, 0, -3, -2 at the nodes and a "capacity, cost" label on every arc.]

  5. Minimum Cost Flows - Definition
  • Assumptions:
  • All arc costs are nonnegative. This is no loss of generality, due to a known transformation which converts a min cost flow problem with negative costs into one with nonnegative costs.
  • If (v,w) ∈ E then (w,v) ∉ E, i.e. the network has no antiparallel arcs.

  6. Minimum Cost Flows - Definition
  • Illustration of the second assumption: the arcs (v1,v2) and (v2,v1) form a forbidden antiparallel pair.
  [Figure: nodes v1 and v2 with arc (v1,v2) labeled 4,1 and arc (v2,v1) labeled 2,3.]

  7. Minimum Cost Flows - Definition
  • The antiparallel pair is removed by splitting an arc through a new intermediate node, so that its capacity and cost are preserved.
  [Figure: the pair between v1 and v2 is split through new nodes u and w; one half of each split arc keeps the original label (4,1 and 2,3) and the other half gets cost zero (4,0 and 2,0).]

  8. Minimum Cost Flows - Definition
  • The flow: a flow x is a function on the edges.
  • A feasible flow is a flow that satisfies: 0 ≤ x(v,w) ≤ u(v,w) for every edge (v,w) ∈ E, and Σ_w x(v,w) − Σ_w x(w,v) = b(v) for every node v ∈ V.
  • The cost of a feasible flow x is c(x) = Σ_{(v,w)∈E} c(v,w)·x(v,w).
  • The min cost flow problem is to find a feasible flow x with the minimum cost.
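A minimal Python sketch (not from the original slides) of these two definitions, assuming the network is stored as a dict (v, w) -> (capacity, cost) and balances as a dict v -> b(v):

    def is_feasible(arcs, balances, x):
        """Check the two feasibility conditions of slide 8.

        arcs:     dict (v, w) -> (capacity u(v,w), cost c(v,w))
        balances: dict v -> b(v)
        x:        dict (v, w) -> flow value x(v,w)
        """
        # Capacity constraints: 0 <= x(v,w) <= u(v,w).
        for (v, w), (cap, _cost) in arcs.items():
            if not 0 <= x.get((v, w), 0) <= cap:
                return False
        # Balance constraints: outflow(v) - inflow(v) = b(v).
        for v, b in balances.items():
            out_v = sum(val for (a, _), val in x.items() if a == v)
            in_v = sum(val for (_, d), val in x.items() if d == v)
            if out_v - in_v != b:
                return False
        return True

    def flow_cost(arcs, x):
        """Cost of a flow: sum of c(v,w) * x(v,w) over all arcs."""
        return sum(cost * x.get((v, w), 0)
                   for (v, w), (_cap, cost) in arcs.items())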

  9. Max flow – A reminder
  For a feasible flow x, G(x) is the residual network that corresponds to the flow x. We replace each arc (v,w) by two arcs, (v,w) and (w,v): the arc (v,w) has cost c(v,w) and residual capacity r(v,w) = u(v,w) − x(v,w), and the arc (w,v) has cost −c(v,w) and residual capacity r(w,v) = x(v,w). The residual network consists only of arcs with positive residual capacity.
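A sketch (not from the slides) of building G(x) in Python, using the same dict representation as above; the no-antiparallel-arcs assumption of slide 5 keeps the residual arc keys unambiguous:

    def residual_network(arcs, flow):
        """Residual network G(x) as a dict (u, v) -> (residual capacity, cost).

        arcs: dict (v, w) -> (capacity u(v,w), cost c(v,w))
        flow: dict (v, w) -> x(v,w), a feasible flow on the same arcs
        """
        residual = {}
        for (v, w), (cap, cost) in arcs.items():
            x = flow.get((v, w), 0)
            if cap - x > 0:              # forward arc: cost c, capacity u - x
                residual[(v, w)] = (cap - x, cost)
            if x > 0:                    # backward arc: cost -c, capacity x
                residual[(w, v)] = (x, -cost)
        return residual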

  10. Minimum Cost Flows - Definition
  • As a linear program:
  min Σ_{(v,w)∈E} c(v,w)·x(v,w)
  s.t. Σ_w x(v,w) − Σ_w x(w,v) = b(v) for every v ∈ V
       0 ≤ x(v,w) ≤ u(v,w) for every (v,w) ∈ E

  11. Minimum Cost Flows – Finding a Feasible Solution
  • How to find a feasible solution?
  • A simple observation: summing the balance constraints over all nodes, every arc's flow appears once with a plus sign and once with a minus sign, so Σ_{v∈V} b(v) = 0 is a necessary condition for feasibility.
  • In that case the total supply equals the total demand: Σ_{v: b(v)>0} b(v) = −Σ_{v: b(v)<0} b(v).

  13. Minimum Cost Flows – Finding a Feasible Solution
  • How to find a feasible solution?
  • Σ_{v∈V} b(v) = 0 is not enough: the arc capacities may still make it impossible to route the required amounts.
  [Figure: a network with balances 5, 0, 0, -3, -2 (summing to 0) whose arc capacities do not admit a feasible flow.]

  14. Minimum Cost Flows – Finding a Feasible Solution
  • How to find a feasible solution?
  • A feasible flow is a flow that satisfies: 0 ≤ x(v,w) ≤ u(v,w) and Σ_w x(v,w) − Σ_w x(w,v) = b(v).
  • A simple observation: feasibility does not depend on the cost values.
  [Figure: the example network of slide 4 with "capacity, cost" labels on the arcs.]

  15. Minimum Cost Flows – Finding a Feasible Solution
  • Since feasibility does not depend on the costs, we can drop them.
  [Figure: the same network with only the capacities shown on the arcs.]

  16. Minimum Cost Flows – Finding a Feasible Solution
  • How to find a feasible solution? Reduction to a max flow problem:
  • Add vertices s and t.
  • For every v such that b(v) > 0: add an edge (s,v) such that u(s,v) = b(v).
  • For every v such that b(v) < 0: add an edge (v,t) such that u(v,t) = −b(v).
  • There exists a feasible solution iff all the edges from s are saturated by a max flow.
  [Figure: the example network with the added source s (an arc of capacity 5 to the supply node) and sink t (arcs of capacities 3 and 2 from the demand nodes).]

  17. Minimum Cost Flows – Finding a Feasible Solution
  • Solve a max flow problem from s to t in the auxiliary network; if all edges leaving s are saturated, the restriction of that flow to the original arcs is a feasible flow for the original balances.
  [Figure: the auxiliary network of slide 16 with only the capacities shown.]
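A sketch of this reduction (not part of the deck), using the networkx library if it is available; the concrete capacities in the example are hypothetical, in the spirit of the 5-node network of slide 15:

    import networkx as nx

    def feasible_flow(arcs, balances):
        """Feasibility check of slides 16-17 via one max flow computation.

        arcs:     dict (v, w) -> capacity u(v,w)
        balances: dict v -> b(v)
        Returns a dict (v, w) -> flow on the original arcs, or None if infeasible.
        """
        G = nx.DiGraph()
        for (v, w), cap in arcs.items():
            G.add_edge(v, w, capacity=cap)
        supply = 0
        for v, b in balances.items():
            if b > 0:                    # supply node: arc (s, v) with capacity b(v)
                G.add_edge('s', v, capacity=b)
                supply += b
            elif b < 0:                  # demand node: arc (v, t) with capacity -b(v)
                G.add_edge(v, 't', capacity=-b)
        value, flow = nx.maximum_flow(G, 's', 't')
        if value < supply:               # some arc leaving s is not saturated
            return None
        return {(v, w): flow[v][w] for (v, w) in arcs}

    # Hypothetical capacities on the 5-node example network:
    arcs = {('v1', 'v2'): 4, ('v1', 'v3'): 5, ('v2', 'v4'): 3,
            ('v3', 'v4'): 1, ('v3', 'v5'): 3}
    balances = {'v1': 5, 'v2': 0, 'v3': 0, 'v4': -3, 'v5': -2}
    print(feasible_flow(arcs, balances))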

  18. Minimum Cost Maximum Flow
  A variation of the min cost flow problem is to find a flow of maximum value which has the lowest cost among all maximum flows. How do we solve it?
  [Figure: an s-t example network with two labels on each arc.]

  19. Minimum Cost Maximum Flow
  Forget about the costs. Use your favorite max flow algorithm to find the value of the max flow, f*.
  [Figure: the same network with only the capacities shown.]

  20. Minimum Cost Maximum Flow
  Reduction to min cost flow. Define b(s) = f*, b(t) = −f*, and b(v) = 0 for every other node, keeping the original capacities and costs. A minimum cost flow in this network is a minimum cost maximum flow in the original one.
  [Figure: the network with balances f* at s, −f* at t and 0 elsewhere.]
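Outside the deck: networkx (if available) also exposes this variation directly; a sketch on a hypothetical instance in the spirit of slides 18 to 20, with arcs carrying "capacity" and "weight" attributes:

    import networkx as nx

    G = nx.DiGraph()
    G.add_edge('s', 'u', capacity=4, weight=0)
    G.add_edge('s', 'v', capacity=2, weight=1)
    G.add_edge('u', 'w', capacity=3, weight=1)
    G.add_edge('u', 'v', capacity=1, weight=1)
    G.add_edge('v', 'z', capacity=2, weight=2)
    G.add_edge('w', 't', capacity=5, weight=3)
    G.add_edge('z', 't', capacity=2, weight=1)

    # Equivalent to the two-step reduction above: compute f*, then solve a
    # min cost flow problem with b(s) = f* and b(t) = -f*.
    flow = nx.max_flow_min_cost(G, 's', 't')
    print(nx.cost_of_flow(G, flow))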

  21. Minimum Cost Flow – Application #1
  We have T tasks and M machines (T ≤ M). We shall assume that every machine executes at most one task and every task runs on exactly one machine. Let c_{i,j} be the cost (or running time) of running task t_i on machine m_j. We want to find the assignment A* of tasks to machines with the minimum total cost (or total running time).

  22. Minimum Cost Flow – Application #1
  Reduction to min cost flow: a source node A with b(A) = T, a sink node B with b(B) = −T, a node for every task and every machine; arcs (A, t_i) and (m_j, B) with cost 0, and arcs (t_i, m_j) with cost c_{i,j}. All the capacities are 1.
  [Figure: the network for T = 2 tasks and M = 3 machines, with b(A) = 2 and b(B) = −2.]
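A sketch of this construction (not from the slides) with networkx, for a hypothetical cost matrix; note that networkx uses a node attribute "demand", which is the negative of the balance b(v) used here:

    import networkx as nx

    def assign_tasks(costs):
        """Assignment via min cost flow, mirroring the network of slide 22.

        costs: costs[i][j] = cost of running task i on machine j (T x M, T <= M).
        Returns a list of (task, machine) pairs.
        """
        T, M = len(costs), len(costs[0])
        G = nx.DiGraph()
        G.add_node('A', demand=-T)              # b(A) = T  -> demand = -T
        G.add_node('B', demand=T)               # b(B) = -T -> demand =  T
        for i in range(T):
            G.add_edge('A', ('t', i), capacity=1, weight=0)
            for j in range(M):
                G.add_edge(('t', i), ('m', j), capacity=1, weight=costs[i][j])
        for j in range(M):
            G.add_edge(('m', j), 'B', capacity=1, weight=0)
        flow = nx.min_cost_flow(G)              # dict of dicts: flow[u][v]
        return [(i, j) for i in range(T) for j in range(M)
                if flow[('t', i)][('m', j)] == 1]

    # Hypothetical cost matrix: 2 tasks, 3 machines.
    print(assign_tasks([[4, 2, 8],
                        [3, 7, 5]]))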

  23. Minimum Cost Flow – Application #2
  We have a bus that can accommodate P passengers at a given time. The bus stops at L stations (first at station #1, then at station #2, and finally at station #L). For every station i, we know in advance how many passengers are willing to take the bus from station i to station j (j > i); we denote this number by p_{i,j}. The price of a single ticket from station i to station j is c_{i,j}. Our goal is to find a plan which maximizes the total profit.

  24. Minimum Cost Flow – Application #2
  If one desires to find the maximum, M, of a function g(x), one can instead find the minimum, m, of the function −g(x), and then M = −m. This lets us state the profit maximization as a minimization problem.

  25-27. Minimum Cost Flow – Application #2
  • As a linear program (the maximization written as minimizing the negated profit):
  • Variables: x_{i,j} = number of passengers carried from station i to station j, for 1 ≤ i < j ≤ L.
  min −Σ_{i<j} c_{i,j}·x_{i,j}
  s.t. Σ_{i ≤ k < j} x_{i,j} ≤ P for every leg k = 1, ..., L−1 (at most P passengers are on board between stations k and k+1)
       0 ≤ x_{i,j} ≤ p_{i,j} for every i < j

  28. Minimum Cost Flow – Application #2
  The corresponding min cost flow network for L = 4 stations:
  • A node k for every station, chained by arcs (k, k+1) with capacity P and cost 0; station j has balance b(j) = −Σ_{i<j} p_{i,j}.
  • A node "i-j" for every pair i < j with balance b(i-j) = p_{i,j}.
  • An arc from node i-j to station i with capacity ∞ and cost −c_{i,j}: these passengers board the bus and contribute profit c_{i,j} each.
  • A red arc from node i-j to station j with capacity ∞ and cost 0: these passengers are turned away.
  [Figure: the pair nodes 1-2, 1-3, 1-4, 2-3, 2-4, 3-4 above the station chain 1, 2, 3, 4 with the P,0 arcs between consecutive stations.]
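Not in the deck: the LP of slides 25-27 is small enough to hand directly to scipy.optimize.linprog (assuming scipy is available); the values of P, p_{i,j} and c_{i,j} below are hypothetical, for L = 4 stations:

    from scipy.optimize import linprog

    P = 10                                    # bus capacity (hypothetical)
    pairs = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
    p = {(1, 2): 6, (1, 3): 4, (1, 4): 5, (2, 3): 3, (2, 4): 2, (3, 4): 7}  # demands
    c = {(1, 2): 2, (1, 3): 3, (1, 4): 5, (2, 3): 2, (2, 4): 4, (3, 4): 2}  # ticket prices

    # Maximize sum c_ij * x_ij  <=>  minimize -sum c_ij * x_ij (slide 24).
    obj = [-c[ij] for ij in pairs]

    # One capacity constraint per leg k: passengers on board between k and k+1.
    A_ub = [[1 if i <= k < j else 0 for (i, j) in pairs] for k in range(1, 4)]
    b_ub = [P] * 3

    bounds = [(0, p[ij]) for ij in pairs]     # 0 <= x_ij <= p_ij

    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print(dict(zip(pairs, res.x)), "profit =", -res.fun)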

  29. Minimum Cost Flow – Optimality Conditions
  • We’ll consider three different optimality conditions:
  • Negative cycle
  • Reduced cost
  • Complementary slackness

  30. MCF - Optimality Conditions – Negative Cycle
  Theorem: A feasible solution x* is an optimal solution of the MCF problem if and only if it satisfies the negative cycle optimality conditions: namely, the residual network G(x*) contains no negative cost (directed) cycle.

  31. MCF - Optimality Conditions – Negative Cycle
  Proof: if x is feasible and there is a negative cycle in G(x), then x is not an optimal solution.
  Suppose that x is a feasible flow and that G(x) contains a negative cycle C. Then by augmenting flow along C (by the minimum residual capacity of its arcs) we get a new solution x' which is feasible and has a lower cost. Therefore x is not an optimal solution to the MCF problem.
  [Figure: a residual network on nodes a, b, c, d whose arc costs 4, −5, 1, −2 form a negative-cost directed cycle.]

  32. MCF - Optimality Conditions – Negative Cycle
  Proof: if x is feasible and there is no negative cycle in G(x), then x is an optimal solution.

  33. MCF - Optimality Conditions – Negative Cycle
  Help is needed:
  Theorem (Augmenting Cycle Theorem): Let x and y be any two feasible solutions of a network flow problem. Then the cost of x equals the cost of y plus the cost of the flow on at most m directed cycles in G(y).
  Now we are ready. Let x be a feasible flow such that there is no negative cycle in G(x), and let x* be an optimal flow. Since x* is optimal, c·x* ≤ c·x. But x and x* are both feasible solutions of the network, so by the augmenting cycle theorem c·x* equals c·x plus the cost of flow on at most m directed cycles in G(x); since G(x) contains no negative cycle, that additional cost is nonnegative, so c·x* ≥ c·x. Therefore c·x = c·x* and x is an optimal solution.
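Not from the slides: a Bellman-Ford sketch for testing the negative cycle optimality condition of slide 30, assuming the residual network G(x) is given as a dict (u, v) -> residual cost (for example, the cost components of the residual_network sketch after slide 9):

    def find_negative_cycle(nodes, residual_costs):
        """Bellman-Ford negative-cycle detection in a residual network G(x).

        nodes:          iterable of node names
        residual_costs: dict (u, v) -> cost of the residual arc (u, v)
        Returns a list of nodes on a negative-cost directed cycle, or None.
        """
        # Starting every distance at 0 acts like a virtual source with
        # zero-cost arcs to all nodes, so any negative cycle is detected.
        dist = {v: 0 for v in nodes}
        pred = {v: None for v in nodes}
        last_relaxed = None
        for _ in range(len(dist)):
            last_relaxed = None
            for (u, v), cost in residual_costs.items():
                if dist[u] + cost < dist[v]:
                    dist[v] = dist[u] + cost
                    pred[v] = u
                    last_relaxed = v
            if last_relaxed is None:     # a full pass with no improvement
                return None
        if last_relaxed is None:         # degenerate empty network
            return None
        # A relaxation on the n-th pass means a negative cycle exists.
        v = last_relaxed
        for _ in range(len(dist)):       # walk back until we are inside the cycle
            v = pred[v]
        cycle, u = [v], pred[v]
        while u != v:
            cycle.append(u)
            u = pred[u]
        cycle.reverse()
        return cycle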

  34. MCF - Optimality Conditions – Reduced cost
  Node potentials: let π: V → R be a labeling of the nodes in V. We shall call π(v) the potential of the node v. For a given labeling π, we define the reduced cost of an arc (u,v) as:
  c^π(u,v) = c(u,v) − π(u) + π(v)

  35. MCF - Optimality Conditions – Reduced cost
  Node potentials, Property 1: for any directed path P from node u to node v,
  Σ_{(i,j)∈P} c^π(i,j) = Σ_{(i,j)∈P} c(i,j) − π(u) + π(v),
  since the potentials of the intermediate nodes telescope.
  [Figure: a path v0 → v1 → ... → v(k−1) → vk.]

  36. MCF - Optimality Conditions – Reduced cost
  Node potentials, Property 2: for any directed cycle C,
  Σ_{(i,j)∈C} c^π(i,j) = Σ_{(i,j)∈C} c(i,j).
  Using Property 1 with u = v: the start and end node coincide, so the potential terms cancel.
  [Figure: a cycle v0 → v1 → ... → v(k−1) → v0.]

  37. MCF - Optimality Conditions – Reduced cost
  Theorem: A feasible solution x* is an optimal solution of the minimum cost flow problem if and only if some set of node potentials π satisfies the following reduced cost optimality conditions:
  c^π(u,v) ≥ 0 for every arc (u,v) of the residual network G(x*).
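A small sketch (not from the slides) of checking this condition, assuming the residual arcs are given as a dict (u, v) -> cost, for example the cost components of the residual_network sketch after slide 9:

    def reduced_cost(cost, pi, u, v):
        """Reduced cost of slide 34: c_pi(u,v) = c(u,v) - pi(u) + pi(v)."""
        return cost[(u, v)] - pi[u] + pi[v]

    def satisfies_reduced_cost_optimality(residual_costs, pi):
        """True iff c_pi(u,v) >= 0 for every arc (u,v) of the residual network."""
        return all(reduced_cost(residual_costs, pi, u, v) >= 0
                   for (u, v) in residual_costs)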

  38. MCF - Optimality Conditions – Reduced cost
  Proof: if for a feasible solution x* some set of node potentials π satisfies c^π(u,v) ≥ 0 for every arc (u,v) of G(x*), then x* is an optimal solution.
  Let C be a cycle in G(x*). By Property 2, Σ_{(u,v)∈C} c(u,v) = Σ_{(u,v)∈C} c^π(u,v) ≥ 0. Therefore there is no negative cycle in G(x*), so x* is optimal according to the negative cycle optimality conditions.

  39. MCF - Optimality Conditions – Reduced cost
  Proof: if x* is an optimal solution, then some set of node potentials π satisfies c^π(u,v) ≥ 0 for every arc (u,v) of G(x*).
  x* is an optimal solution and therefore there is no negative cycle in G(x*). No negative cycle means shortest path distances are well defined in G(x*). We shall choose an arbitrary node s and let d(v) denote the shortest path distance from s to v in G(x*). Therefore d(v) ≤ d(u) + c(u,v) for every arc (u,v) of G(x*). We shall denote π(v) = −d(v). Then:
  c^π(u,v) = c(u,v) − π(u) + π(v) = c(u,v) + d(u) − d(v) ≥ 0.
  [Figure: node s with its shortest paths to u and to v.]

  40. MCF - Optimality Conditions – Complementary Slackness
  • Theorem: A feasible solution x* is an optimal solution of the MCF problem if and only if for some set of node potentials π, the reduced costs and flow values satisfy the following complementary slackness conditions for every edge (u,v) in the network:
  • If c^π(u,v) > 0 then x*(u,v) = 0
  • If 0 < x*(u,v) < u(u,v) then c^π(u,v) = 0
  • If c^π(u,v) < 0 then x*(u,v) = u(u,v)
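A sketch (not from the deck) of checking the three conditions on the original arcs, using the same dict representation as in the earlier sketches:

    def satisfies_complementary_slackness(arcs, x, pi):
        """Check the three conditions of slide 40 for every original arc (u, v).

        arcs: dict (u, v) -> (capacity u(u,v), cost c(u,v))
        x:    dict (u, v) -> flow value x(u,v)
        pi:   dict v -> potential pi(v)
        """
        for (u, v), (cap, cost) in arcs.items():
            rc = cost - pi[u] + pi[v]          # reduced cost c_pi(u, v)
            flow = x.get((u, v), 0)
            if rc > 0 and flow != 0:           # condition 1
                return False
            if 0 < flow < cap and rc != 0:     # condition 2
                return False
            if rc < 0 and flow != cap:         # condition 3
                return False
        return True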

  41. MCF - Optimality Conditions – Complementary Slackness
  • A feasible solution x* is optimal if some set of node potentials π and the flow values satisfy the complementary slackness conditions above for every edge (u,v) in the network.
  • Proof: we shall prove that every arc (v,w) of G(x*) has a nonnegative reduced cost; then x* is optimal by the reduced cost optimality conditions.
  • By contradiction, let (v,w) be an arc of G(x*) with c^π(v,w) < 0.
  • If (v,w) is a forward arc, then according to the third condition x*(v,w) = u(v,w), so (v,w) is not in G(x*). If (v,w) is a backward arc, then c^π(w,v) = −c^π(v,w) > 0, so by the first condition x*(w,v) = 0 and again (v,w) is not in G(x*). Contradiction.

  42. MCF - Optimality Conditions – Complementary Slackness
  • If a feasible solution x* is optimal, then some set of node potentials π and the flow values satisfy the complementary slackness conditions above for every edge (u,v) in the network.
  • Proof: let x* be an optimal solution. By the reduced cost optimality conditions there exist potentials π such that every arc (v,w) of G(x*) has c^π(v,w) ≥ 0.
  • Case 1: assume that c^π(u,v) > 0 and x*(u,v) > 0. Then (v,u) is in G(x*) and c^π(v,u) = −c^π(u,v) < 0, a contradiction. Hence x*(u,v) = 0.

  43. MCF - Optimality Conditions – Complementary Slackness
  • Proof, continued: x* is optimal, so there exist potentials π such that every arc of G(x*) has a nonnegative reduced cost.
  • Case 2: assume that 0 < x*(u,v) < u(u,v). Then the residual network contains both (u,v) and (v,u). If c^π(u,v) ≠ 0, then one of (u,v) and (v,u) has a negative reduced cost, a contradiction. Hence c^π(u,v) = 0.

  44. MCF - Optimality Conditions – Complementary Slackness
  • Proof, continued:
  • Case 3: assume that c^π(u,v) < 0 and x*(u,v) < u(u,v). Then (u,v) is in the residual network and has a negative reduced cost, a contradiction. Hence x*(u,v) = u(u,v).

  45. Primal Dual in LP

  46. Finding an Upper Bound on OPT
  min 9x1 + 2x2 + 6x3
  s.t. 4x1 - x2 + 2x3 ≥ 22
       x1 + x2 + x3 ≥ 10
       x1, x2, x3 ≥ 0
  How can one find an upper bound on OPT? Any feasible solution to this problem is an upper bound! Look at (5,2,3) as an example:
  4*5 - 2 + 2*3 = 24 ≥ 22
  5 + 2 + 3 = 10 ≥ 10
  Therefore 9*5 + 2*2 + 6*3 = 67 is an upper bound on OPT.

  47. Finding a Lower Bound on OPT
  min 9x1 + 2x2 + 6x3
  s.t. 4x1 - x2 + 2x3 ≥ 22
       x1 + x2 + x3 ≥ 10
       x1, x2, x3 ≥ 0
  How can one find a lower bound on OPT? We can “play” with the constraints:
  1) 9x1 + 2x2 + 6x3 ≥ 4x1 - x2 + 2x3 ≥ 22
  2) 9x1 + 2x2 + 6x3 ≥ 1*(4x1 - x2 + 2x3) + 3*(x1 + x2 + x3) = 7x1 + 2x2 + 5x3 ≥ 22 + 30 = 52
  An optimal solution to our problem is (1,0,9) with value 63.

  48. Finding a Lower Bound on OPT - Generalization
  min 9x1 + 2x2 + 6x3
  s.t. 4x1 - x2 + 2x3 ≥ 22   (y1)
       x1 + x2 + x3 ≥ 10     (y2)
       x1, x2, x3 ≥ 0
  For which y = (y1, y2) can we say that 9x1 + 2x2 + 6x3 ≥ y1(4x1 - x2 + 2x3) + y2(x1 + x2 + x3) ≥ 22y1 + 10y2?
  Since 9x1 + 2x2 + 6x3 ≥ y1(4x1 - x2 + 2x3) + y2(x1 + x2 + x3) = (4y1 + y2)x1 + (-y1 + y2)x2 + (2y1 + y2)x3, this holds for any y such that
  4y1 + y2 ≤ 9
  -y1 + y2 ≤ 2
  2y1 + y2 ≤ 6
  y1, y2 ≥ 0
  And of course we want the greatest lower bound, so we want to maximize 22y1 + 10y2.

  49. The Dual Problem
  So while searching for a good lower bound, we created a new LP problem. We will call the original problem the Primal and the new one the Dual.
  The Primal:
  min 9x1 + 2x2 + 6x3
  s.t. 4x1 - x2 + 2x3 ≥ 22
       x1 + x2 + x3 ≥ 10
       x1, x2, x3 ≥ 0
  The Dual:
  max 22y1 + 10y2
  s.t. 4y1 + y2 ≤ 9
       -y1 + y2 ≤ 2
       2y1 + y2 ≤ 6
       y1, y2 ≥ 0
  In general:
  Primal: min c^T x s.t. Ax ≥ b, x ≥ 0
  Dual:   max b^T y s.t. A^T y ≤ c, y ≥ 0
  The dual of the dual is the primal.
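Not part of the deck: the primal and dual above are small enough to hand to scipy.optimize.linprog (assuming scipy is available); both should report the optimal value 63 claimed on slide 47.

    from scipy.optimize import linprog

    # Primal: min 9x1 + 2x2 + 6x3 s.t. 4x1 - x2 + 2x3 >= 22, x1 + x2 + x3 >= 10, x >= 0.
    # linprog only takes <= constraints, so the >= rows are negated.
    primal = linprog(c=[9, 2, 6],
                     A_ub=[[-4, 1, -2], [-1, -1, -1]],
                     b_ub=[-22, -10])

    # Dual: max 22y1 + 10y2 s.t. 4y1 + y2 <= 9, -y1 + y2 <= 2, 2y1 + y2 <= 6, y >= 0.
    # Maximization is written as minimizing the negated objective.
    dual = linprog(c=[-22, -10],
                   A_ub=[[4, 1], [-1, 1], [2, 1]],
                   b_ub=[9, 2, 6])

    print(primal.fun, -dual.fun)   # both optimal values should equal 63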

  50. Weak Duality - Theorem
  Note: in this lecture, the Primal is a min problem and the Dual is a max problem. We will refer to the Primal as (P) and to the Dual as (D).
  (P): min c^T x s.t. Ax ≥ b, x ≥ 0
  (D): max b^T y s.t. A^T y ≤ c, y ≥ 0
  Theorem (Weak Duality): for any x feasible in (P) and y feasible in (D) we have b^T y ≤ c^T x.
  Proof: y is feasible for the dual, so A^T y ≤ c and y ≥ 0; x is feasible for the primal, so Ax ≥ b and x ≥ 0. Therefore b^T y ≤ (Ax)^T y = x^T (A^T y) ≤ x^T c = c^T x.
