
Linear Programming


Presentation Transcript


  1. Linear Programming The basis of the whole field of operations research

  2. Overview • Classes of Optimization Problems • Linear Programming • The Simplex Algorithm

  3. Classes of Optimization Problems
  Optimization
  • unconstrained
  • constrained
    • linear / non-linear / quadratic / ...
    • real-valued / integer

  4. Solution Spaces A solution space or feasible region is the set of all points in the domain that satisfy all problem constraints. The most important distinction is between convex and non-convex solution spaces: convexity means that any interpolation between feasible points yields only feasible points.
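As a plain-Python sketch of the convexity property for a linearly constrained region: every interpolation between two feasible points is again feasible. The constraint data anticipates the production example used later in these slides; function and variable names are my own.

```python
# A numeric sanity check (plain Python): for a region {x : A x <= b, x >= 0},
# every interpolation between two feasible points stays feasible.
A = [[2, 2, 1], [1, 2, 3], [2, 1, 1]]
b = [30, 25, 20]

def feasible(x, eps=1e-9):
    """True iff A x <= b and x >= 0 (within a small tolerance)."""
    return (all(sum(a_ij * x_j for a_ij, x_j in zip(row, x)) <= b_i + eps
                for row, b_i in zip(A, b))
            and all(x_j >= -eps for x_j in x))

x, y = [1, 2, 3], [2, 4, 5]          # two feasible points
assert feasible(x) and feasible(y)
for k in range(11):                   # lam = 0.0, 0.1, ..., 1.0
    lam = k / 10
    z = [lam * xi + (1 - lam) * yi for xi, yi in zip(x, y)]
    assert feasible(z)                # convexity: interpolation stays inside
print("all interpolated points feasible")
```

A non-convex region (e.g. with an integrality constraint) would fail this check for some pair of points.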

  5. Local and Global Optima [Figure: plot of an objective function for a non-convex problem with local maxima at a and b; only the one at b is a global maximum.] Convex problems are generally easier to solve because of the following theorem: any local extremum of a maximization problem on a convex feasible region (with a concave objective function) is a global extremum.

  6. Convex Optimization Problems Convex problems are far easier (i.e. computationally cheaper) to solve. We will therefore first look at linear programming over the real numbers.

  7. Adequate Minimum-Cost Diet [Table: nutrition values in units per dollar of cost (1945)]
  • One of the first automated linear optimization problems
  • G. Stigler. “The Cost of Subsistence”. Journal of Farm Economics, 1945: “... there does not appear to be any direct method of finding the minimum ...”
  • G. Dantzig. “Linear Programming and Extensions”, Princeton Univ. Press, 1963.
  • solved manually with desk calculators (1947): 120 person-days
  • solved with Simplex on an IBM 701: 4 minutes

  8. Linear Programming

  9. Gauss-Jordan Elimination Let us first review how a system of linear equations is solved, namely by Gauss-Jordan elimination. Example:
  (1) 3x + 5y – z = 15
  (2) 7x – 2y + z = 1
  (3) y + z = 0
  By (3), z = –y. Substituting into (1) and (2):
  (1.1) 3x + 5y + y = 15
  (1.2) 7x – 2y – y = 1
  By (1.1), y = 5/2 – 1/2·x. Substituting into (1.2):
  (1.2.1) 17/2·x = 17/2, i.e. x = 1, and hence y = 2, z = –2.
  More elegantly, the same steps are performed on the augmented coefficient matrix.
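The matrix form of these steps can be sketched in code. A minimal Gauss-Jordan solver (plain Python with partial pivoting; the function name and structure are my own), applied to the slide's example system:

```python
# Gauss-Jordan elimination with partial pivoting, as a plain-Python sketch.
def gauss_jordan(A, b):
    """Solve A x = b by reducing the augmented matrix [A | b]."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        # pick the row with the largest pivot to improve numeric stability
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]               # normalize pivot row
        for r in range(n):                             # eliminate column elsewhere
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

# The slide's example: 3x+5y-z=15, 7x-2y+z=1, y+z=0
x, y, z = gauss_jordan([[3, 5, -1], [7, -2, 1], [0, 1, 1]], [15, 1, 0])
print(round(x, 9), round(y, 9), round(z, 9))  # 1.0 2.0 -2.0
```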

  10. Application Example A company wants to optimize its production plan for three products B/C/D with the following resource requirements and limits:
  maximize profit f(x*) = 3.0 x[b] + 1.0 x[c] + 3.0 x[d]
  subject to c(x*) = 2.0 x[b] + 2.0 x[c] + 1.0 x[d] <= 30.0
  and 1.0 x[b] + 2.0 x[c] + 3.0 x[d] <= 25.0
  and 2.0 x[b] + 1.0 x[c] + 1.0 x[d] <= 20.0
  and x[b] >= 0 and x[c] >= 0 and x[d] >= 0
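Before developing the simplex method, the optimum of this small example can be verified by brute force: since an optimum of a linear program lies at a corner of the feasible region, it suffices to enumerate all intersections of three constraint boundaries. This is a sanity check only, not the simplex method; all names are my own.

```python
from itertools import combinations

# Brute-force vertex enumeration for the production example.  Every vertex
# lies on the boundaries of 3 of the 6 inequalities (3 resources + 3 signs).
rows = [([2.0, 2.0, 1.0], 30.0),    # resource f
        ([1.0, 2.0, 3.0], 25.0),    # resource l
        ([2.0, 1.0, 1.0], 20.0),    # resource m
        ([-1.0, 0.0, 0.0], 0.0),    # x[b] >= 0
        ([0.0, -1.0, 0.0], 0.0),    # x[c] >= 0
        ([0.0, 0.0, -1.0], 0.0)]    # x[d] >= 0
profit = [3.0, 1.0, 3.0]

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

best = None
for trio in combinations(rows, 3):
    A = [r[0] for r in trio]
    rhs = [r[1] for r in trio]
    d = det3(A)
    if abs(d) < 1e-9:
        continue                    # the three boundaries do not meet in a point
    x = []                          # Cramer's rule for the intersection point
    for col in range(3):
        Ac = [row[:] for row in A]
        for i in range(3):
            Ac[i][col] = rhs[i]
        x.append(det3(Ac) / d)
    if all(sum(a * v for a, v in zip(r[0], x)) <= r[1] + 1e-9 for r in rows):
        z = sum(p * v for p, v in zip(profit, x))
        if best is None or z > best[0]:
            best = (z, x)

print(round(best[0], 6), [round(v, 6) + 0.0 for v in best[1]])  # 39.0 [7.0, 0.0, 6.0]
```

Vertex enumeration is exponential in general; the point of simplex is to visit only a promising sequence of these corners.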

  11. Slack Variables
  • transform to standard form by introducing slack variables
  • the “slack” measures the unused part of a resource (i.e. how “tight” a constraint is)
  s[f] == 30 – 2 x[b] – 2 x[c] – x[d]
  s[l] == 25 – x[b] – 2 x[c] – 3 x[d]
  s[m] == 20 – 2 x[b] – x[c] – x[d]
  Note: all slack variables must always be nonnegative, s[_] >= 0. In standard form the x[i] must also be nonnegative.
  e.g. {x[b] -> 1, x[c] -> 2, x[d] -> 3} yields s[f] == 21, s[l] == 11, s[m] == 13
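The slack computation can be sketched directly, using the resource limits 30 / 25 / 20 from the application example (the dictionary keys are my own shorthand):

```python
# Slacks for the production example: resource limits 30 / 25 / 20.
def slacks(xb, xc, xd):
    return {"s_f": 30 - 2 * xb - 2 * xc - xd,
            "s_l": 25 - xb - 2 * xc - 3 * xd,
            "s_m": 20 - 2 * xb - xc - xd}

plan = slacks(1, 2, 3)
print(plan)  # {'s_f': 21, 's_l': 11, 's_m': 13}
assert all(v >= 0 for v in plan.values())   # all slacks nonnegative: feasible
```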

  12. Basic Feasible Solution A basic feasible solution is a point in the feasible region (i.e. a valuation of the problem variables) that fulfils all problem constraints, i.e. a point at which
  • all explicit problem constraints are fulfilled and
  • all slack variables are nonnegative
  z == 3 x[b] + x[c] + 3 x[d]
  s[f] == 30 – 2 x[b] – 2 x[c] – x[d]
  s[l] == 25 – x[b] – 2 x[c] – 3 x[d]
  s[m] == 20 – 2 x[b] – x[c] – x[d]
  trivial solution: {x[b] -> 0, x[c] -> 0, x[d] -> 0}, z = 0
  better: {x[b] -> 1, x[c] -> 2, x[d] -> 3}, z = 14
  even better: {x[b] -> 2, x[c] -> 4, x[d] -> 5}, z = 25
  impossible: {x[b] -> 3, x[c] -> 4, x[d] -> 5}, since s[l] < 0

  13. Simplex Algorithm: Idea • The solution to a linear programming problem can be found • by iterative improvement: • (1) take any feasible solution • (2) check whether any resources are left • (3) check whether exchanging some “activity” (product) • improves the solution • (4) if so, exchange & repeat from step (2) • otherwise the optimum is reached.

  14. Simplex: Geometric Interpretation [Figure: the feasible region with a tight constraint s[i] = 0 along one boundary]
  • Observe that an optimum of the objective function can always be found at a corner point of the simplex:
  • The objective function is linear.
  • Imagine the objective plane z = f(x,y) = ax + by.
  • At all other points some resource has reserves that can be utilized by producing more, thus increasing profit.

  15. How to start? We will return to the problem of finding a good initial solution later. In the example case (and in many practical cases) the trivial solution (x* = 0*) is feasible, but this is not always true. For now it seems reasonable to start with a solution in which only a single activity is chosen. This should be the most profitable activity (here product B) and it should be as large as possible, i.e. up to the point where the most limiting constraint becomes tight (here s[m]).
  productionPlan = {x[b] -> 10, x[c] -> 0, x[d] -> 0};
  z == 30, s[f] == 10, s[l] == 15, s[m] == 0
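The most limiting constraint for a single activity can be found with a ratio test, dividing each resource limit by that activity's coefficient in the corresponding row. A sketch for x[b] (dictionary keys are my own shorthand):

```python
# Ratio test: how much of the most profitable activity x[b] fits before a
# resource runs out?  ratio = limit / coefficient of x[b] in that row.
limits = {"s_f": (30, 2), "s_l": (25, 1), "s_m": (20, 2)}
ratios = {name: lim / coef for name, (lim, coef) in limits.items()}
tightest = min(ratios, key=ratios.get)
print(tightest, ratios[tightest])  # s_m 10.0
```

The smallest ratio (10 units of B, limited by s[m]) matches the starting plan above.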

  16. Economic Position The row of the tableau that specifies the limiting constraint is called the pivot row:
  s[m] == 20 – 2 x[b] – x[c] – x[d]
  Solving the pivot row for the current activity yields
  x[b] == 10 – 1/2 x[c] – 1/2 x[d] – 1/2 s[m]
  This describes the current activity in terms of other possible activities and available resources. Substituting it into the objective function yields
  z == 30 – 1/2 x[c] + 3/2 x[d] – 3/2 s[m]
  This is the “economic position” at the current tentative production plan.

  17. Shadow Prices In the economic position z == 30 – 1/2 x[c] + 3/2 x[d] – 3/2 s[m], the intercept is 30, the shadow price of s[m] is 3/2, and the opportunity cost of x[d] is 3/2.
  • The intercept gives the profit for the current production plan.
  • The shadow price gives the amount by which the profit could be increased if this resource constraint could be relaxed (i.e. the break-even price for buying more of this resource).
  • The opportunity cost gives the potential increase of profit from increasing the respective activity after accounting for the necessary reduction in the current activity (e.g. one unit less of B enables us to produce two more units of D, each with the same price ($3) as B, unless another constraint becomes tight).
  This interpretation is only valid while the current limiting constraint remains binding (i.e. in particular for the current tentative production plan)!

  18. Solution Revision Obviously, the opportunity costs should guide our revision of the production plan if we want to maximize profit. Linearity of the problem implies that we should substitute as much of the activity with the highest opportunity cost (here D) as the resource constraints permit. To find the most limiting constraint we substitute the pivot row (solved for the current activity x[b]) into the constraints:
  s[f] == 10 – x[c] + s[m]
  s[l] == 15 – 3/2 x[c] – 5/2 x[d] + 1/2 s[m]

  19. Stopping Condition By comparison of the coefficients we can see that s[l] will be the limiting constraint for x[d]: we can introduce 6 units of D (in exchange for 3 units of B) before s[l] becomes tight. The new production plan is thus
  {x[b] -> 7, x[c] -> 0, x[d] -> 6}
  z == 39, s[f] == 10, s[l] == 0, s[m] == 0
  Is this optimal? We have to repeat the above process until no improvement is possible. What is a reasonable stopping condition? When all opportunity costs are less than or equal to zero (<= 0), the current solution must be optimal and we can stop.

  20. Pivoting
  1. identify the best substitute x[j] from the highest opportunity cost
  2. identify the limiting resource s[i]
  3. select the pivot row according to (2)
  4. solve the pivot row for the variable x[j]
  5. substitute (4) into all other equations
  6a. terminate if all opportunity costs are negative or zero
  6b. otherwise go to step 1
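The pivoting loop above can be sketched end-to-end. Below is a minimal tableau simplex for max c·x subject to Ax <= b, x >= 0 with b >= 0 (an illustrative implementation of my own, not the slides' code; the entering variable is chosen by lowest index, in the spirit of an anti-cycling rule), run on the production example:

```python
# Minimal tableau simplex for  max c.x  s.t.  A x <= b, x >= 0  (b >= 0).
def simplex(A, b, c):
    m, n = len(A), len(c)
    # tableau rows: [A | I | b]; objective row: [-c | 0 | 0] (reduced costs)
    T = [A[i][:] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-ci for ci in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))            # the slacks start in the basis
    while True:
        # entering column: lowest index with a negative reduced cost
        col = next((j for j in range(n + m) if T[-1][j] < -1e-9), None)
        if col is None:
            break                            # all opportunity costs <= 0: optimal
        # leaving row: minimum ratio test (step 2 above)
        ratios = [(T[i][-1] / T[i][col], basis[i], i)
                  for i in range(m) if T[i][col] > 1e-9]
        if not ratios:
            raise ValueError("problem is unbounded")
        _, _, row = min(ratios)
        basis[row] = col
        p = T[row][col]
        T[row] = [v / p for v in T[row]]     # solve pivot row for x[col]
        for r in range(m + 1):               # substitute into all other rows
            if r != row and T[r][col] != 0.0:
                f = T[r][col]
                T[r] = [v - f * w for v, w in zip(T[r], T[row])]
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return T[-1][-1], x                      # optimal value, optimal point

z, x = simplex([[2.0, 2.0, 1.0], [1.0, 2.0, 3.0], [2.0, 1.0, 1.0]],
               [30.0, 25.0, 20.0], [3.0, 1.0, 3.0])
print(round(z, 6), [round(v, 6) for v in x])  # 39.0 [7.0, 0.0, 6.0]
```

The first pivot brings x[b] into the basis for s[m], the second brings x[d] in for s[l], reproducing the hand computation of the previous slides.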

  21. z == 3 x[b] + x[c] + 3 x[d]
  s[f] == 30 – 2 x[b] – 2 x[c] – x[d]
  s[l] == 25 – x[b] – 2 x[c] – 3 x[d]
  s[m] == 20 – 2 x[b] – x[c] – x[d]
  Pivoting on x[b] / s[m] yields:
  z == 30 – 1/2 x[c] + 3/2 x[d] – 3/2 s[m]
  x[b] == 10 – 1/2 x[c] – 1/2 x[d] – 1/2 s[m]
  s[f] == 10 – x[c] + s[m]
  s[l] == 15 – 3/2 x[c] – 5/2 x[d] + 1/2 s[m]
  Pivoting on x[d] / s[l] yields the optimum solution:
  z == 39 – 7/5 x[c] – 3/5 s[l] – 6/5 s[m]
  x[b] == 7 – 1/5 x[c] + 1/5 s[l] – 3/5 s[m]
  s[f] == 10 – x[c] + s[m]
  x[d] == 6 – 3/5 x[c] – 2/5 s[l] + 1/5 s[m]

  22. Terminology In linear programming, the problem representation by a set of equations, as we have used it, is called a dictionary. The more compact representation by a matrix of coefficients used in actual implementations is called the tableau. The left-hand side variables (i.e. the variables that occur in only one equation) are called basic; all other variables are called non-basic. The final dictionary/tableau yields the optimum solution by setting all non-basic variables to zero.

  23. Potential Problems
  • Optimum non-existent, problem unbounded (e.g. max (x+y) s.t. x – y <= 0): abort if there is no limiting resource for some pivot.
  • Optimum non-existent, problem infeasible: relax the constraints & check feasibility.
  • Optimum not found, non-termination due to cycling: in practice almost irrelevant; can be avoided by a clever choice of pivot elements. See: V. Chvátal. Linear Programming. W.H. Freeman, 1983.

  24. Degenerate Solutions & Cycling A basic feasible solution is called degenerate if at least one of the basic variables takes the value 0. => the next pivot does not necessarily change the solution (possibly only variables with value 0 are swapped) => Cycling can occur (worst case) • Bland’s Anti-Cycling Rule • Number the variables. • In case of ties, let the lowest numbered variable enter the basis. • If there is a tie for choosing the exit variable, use the lowest numbered variable.

  25. Relaxing a Linear System A linear problem is infeasible if the constraints are too restrictive.
  z == x[1] – x[2] + x[3]
  s[1] == 4 – 2 x[1] + x[2] – 2 x[3]
  s[2] == -5 – 2 x[1] + 3 x[2] – x[3]
  s[3] == -1 + x[1] – x[2] + 2 x[3]
  The trivial solution is infeasible. Is the problem feasible at all? To answer this question we add a so-called artificial variable a[0] to the system, such that the relaxed system is always feasible. The original system is feasible iff there is a solution to the relaxed system with a[0] = 0:
  z == –a[0]
  s[1] == 4 + a[0] – 2 x[1] + x[2] – 2 x[3]
  s[2] == -5 + a[0] – 2 x[1] + 3 x[2] – x[3]   (the most stringent constraint)
  s[3] == -1 + a[0] + x[1] – x[2] + 2 x[3]

  26. Checking Feasibility By pivoting on the artificial variable a[0] (and the row with the most negative intercept) we get a basic feasible solution for the relaxed system:
  z == -5 – s[2] – 2 x[1] + 3 x[2] – x[3]
  s[1] == 9 + s[2] – 2 x[2] – x[3]
  a[0] == 5 + s[2] + 2 x[1] – 3 x[2] + x[3]
  s[3] == 4 + s[2] + 3 x[1] – 4 x[2] + 3 x[3]
  Applying the Simplex method to this tableau yields the final tableau. Therefore the original problem is feasible with x* = {0, 11/5, 8/5}.
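The claimed feasible point can be checked against the original constraints in exact rational arithmetic (a verification sketch, not part of the algorithm):

```python
from fractions import Fraction as F

# Verify x* = (0, 11/5, 8/5) against the original constraints of the example.
x1, x2, x3 = F(0), F(11, 5), F(8, 5)
s1 = 4 - 2 * x1 + x2 - 2 * x3
s2 = -5 - 2 * x1 + 3 * x2 - x3
s3 = -1 + x1 - x2 + 2 * x3
print(s1, s2, s3)  # 3 0 0 -- all slacks nonnegative, so the point is feasible
assert min(s1, s2, s3) >= 0
```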

  27. Artificial Variables More generally, for any linear problem we can introduce artificial variables a[i] that make the relaxed problem trivially feasible; the constraints are rewritten accordingly. Note, however, that because of the slack variables s[i], here we would not need all of the a[i].

  28. Two Phase Simplex We can now answer the question of how to obtain a basic feasible solution for a linear programming problem. We simply perform two phases of Simplex optimization:
  (1) Relax the original problem with artificial variables a[i]; obtain the trivial initial solution for the relaxed problem by substituting for the a[i] in the objective function -sum(a[i]); drive sum(a[i]) to 0 by maximizing -sum(a[i]) using Simplex.
  (2a) If the optimal value of this objective function is < 0, the original problem is infeasible.
  (2b) Otherwise set a[i] = 0 in the final tableau of the relaxed problem and substitute the resulting equations of the final tableau from phase I into the original objective function.
  (2c) Combine the revised objective function with the equations from phase I and perform Simplex on the tableau thus obtained.

  29. Example For the example problem, we start phase II with the tableau (originally: x[1] – x[2] + x[3]). For this tableau Simplex generates the solution.

  30. Redundant Constraints It can happen that not all artificial variables are parametric (non-basic) after phase I, so we cannot simply set them to zero. Let the equation for such a variable a[j] be given; its intercept must be 0, otherwise the problem is infeasible. Two possibilities:
  • All variables on the RHS are artificial: delete this equation and the artificial variable a[j]. This happens if one of the original constraints was redundant.
  • At least one non-artificial variable x[i] has a non-zero coefficient: pivot out a[j] (x[i] enters the basis).
  Repeat until all artificial variables are zero.

  31. Minimization Of course, Simplex can also handle minimization problems, because min f(x) is the same as -max(-f(x)). Other inequalities and equalities are also easily translated to standard form: a >= inequality is negated into a <= inequality, and an equality is split into a pair of inequalities (<= and >=). This way of handling equations enlarges the tableau significantly.
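The min/max identity is easy to demonstrate on any finite set of candidate points (the points and cost function below are illustrative, not from the slides):

```python
# min f == -max(-f): a toy demonstration on a finite candidate set.
pts = [(0, 0), (1, 2), (2, 1), (3, 3)]

def f(p):
    return 2 * p[0] + 3 * p[1]   # a linear cost to be minimized

assert min(f(p) for p in pts) == -max(-f(p) for p in pts)
print(min(f(p) for p in pts))  # 0
```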

  32. Complexity • Linear Programming is known to be polynomial • (by reduction to LSI - linear strict inequalities) • Simplex is exponential in the worst-case • (construct a polytope with exponentially many vertices • such that Simplex may trace all vertices) • but “well-behaved” in practice!

  33. Summary • Today we have looked at • Classes of Optimization Problems • Linear Programming • The Simplex Algorithm

  34. Exercise 1: Manual simplex Execute the simplex manually on the following example

  35. Exercise 2: Solve LPs in MiniZinc Model and solve the example of the slides using MiniZinc
