L21 Numerical Methods, part 1 • Homework • Review • Search problem • Line search methods • Summary Test 4 Wed (Problem 8.95)
H20 cont'd a. Increase cost "by" $0.16: f_new = $53,238, an $838 increase b. Reduce mill A capacity to 200 logs/day: changes nothing c. Reduce mill B capacity to 270 logs/day: increases cost by $750; the new optimal solution is x1 = 0, x2 = 30, x3 = 200, x4 = 70
Sensitivity Analyses how sensitive are: a. the optimal value (i.e., f(x)) and b. the optimal solution (i.e., x) … to the parameters (i.e., the assumptions) in our model?
Model parameters Consider your Abc's, i.e., A, b, and c
Simplex Lagrange Multipliers Find the multipliers in the final tableau (right-hand side) Know this!
Let's minimize f even further Increase or decrease the constraint limits e_i to reduce f(x)
Is there more to Optimization? • Simplex is great, but… • Many problems are nonlinear • Many of these cannot be "linearized" We need other methods!
General Optimization Algorithms: • Subproblem A: Which direction to head next? • Subproblem B: How far to go in that direction?
Magnitude and direction Let u be a unit vector of length 1, parallel to the search direction d α = magnitude, or step size (i.e., a scalar) Unit vector = direction (i.e., a vector)
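The slide's decomposition into a scalar step size and a unit direction can be written as the standard iterate update (symbols follow the slide's u, d, and α):

```latex
\mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} + \alpha_k\,\mathbf{u}^{(k)},
\qquad
\mathbf{u}^{(k)} = \frac{\mathbf{d}^{(k)}}{\lVert \mathbf{d}^{(k)} \rVert},
\qquad \alpha_k > 0
```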
We are here. Which direction should we head? Figure 10.2 Conceptual diagram for iterative steps of an optimization method.
Minimize f(x): Let's go downhill! The step size α is a scalar. Descent condition: the direction d must satisfy ∇f(x)·d < 0
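The descent condition can be checked directly. A minimal sketch in plain Python (the example function f(x) = x1² + x2² is an illustrative assumption, not from the slides):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_descent_direction(grad, d):
    # Descent condition: d is a descent direction at x when
    # grad_f(x) . d < 0, i.e. f decreases for a small enough step along d.
    return dot(grad, d) < 0.0

# Hypothetical example: f(x) = x1^2 + x2^2, so grad f(1, 1) = (2, 2).
g = [2.0, 2.0]
print(is_descent_direction(g, [-2.0, -2.0]))  # steepest descent -> True
print(is_descent_direction(g, [2.0, 2.0]))    # uphill -> False
```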
Dot Product At what angle does the dot product become most negative? At 180°: maximum descent, i.e., the steepest-descent direction d = −∇f(x)
Desirable Direction Descent is guaranteed!
Step Size? How big should we make α? Can we step too "far," i.e., can our step size be chosen so big that we step over the minimum?
Nonunimodal functions Unimodal if we stay in the locale? Figure 10.5 Nonunimodal function f(α).
Monotonic Decreasing Functions (continuous)
Unimodal functions: monotonic decreasing then monotonic increasing (a single minimum), or monotonic increasing then monotonic decreasing (a single maximum) Figure 10.4 Unimodal function f(α).
Some Step Size Methods • "Analytical": search direction = (−) gradient (i.e., line search); form the line-search function f(α); find f'(α) = 0 • Region elimination ("interval reducing"): equal interval; alternate equal interval; golden section
Analytical Step Size The slope of the line search is f'(α) Figure 10.3 Graph of f(α) versus α.
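For a quadratic objective, solving f'(α) = 0 gives the step size in closed form. A hedged sketch (the quadratic f(x) = ½xᵀAx − bᵀx and the numbers are illustrative assumptions):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mat_vec(A, v):
    return [dot(row, v) for row in A]

def analytical_step(A, b, x, d):
    # For f(x) = 1/2 x^T A x - b^T x, the line-search function
    # f(alpha) = f(x + alpha d) is minimized where
    # f'(alpha) = grad_f(x + alpha d) . d = 0, which gives
    # alpha = -(g . d) / (d^T A d), with g = A x - b.
    g = [gi - bi for gi, bi in zip(mat_vec(A, x), b)]
    return -dot(g, d) / dot(d, mat_vec(A, d))

# Hypothetical example: f(x) = x1^2 + x2^2 (A = 2I, b = 0), start at (1, 1).
A = [[2.0, 0.0], [0.0, 2.0]]
b = [0.0, 0.0]
x = [1.0, 1.0]
d = [-gi for gi in mat_vec(A, x)]   # steepest-descent direction (-gradient)
alpha = analytical_step(A, b, x, d)
print(alpha)                         # 0.5, which steps exactly to the minimum (0, 0)
```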
Alternative Analytical Step Size At the optimal α, the new gradient must be orthogonal to the search direction d
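The condition f'(α) = ∇f(x + αd)·d = 0 is exactly this orthogonality. A quick numeric check, reusing the hypothetical example f(x) = x1² + x2² (an assumption for illustration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def grad(x):
    # gradient of the hypothetical example f(x) = x1^2 + x2^2
    return [2.0 * xi for xi in x]

x = [1.0, 1.0]
d = [-gi for gi in grad(x)]                # steepest-descent direction
alpha = 0.5                                # exact line-search minimizer for this f
x_new = [xi + alpha * di for xi, di in zip(x, d)]
# The new gradient is orthogonal to d, so this dot product is zero:
print(dot(grad(x_new), d))
```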
"Interval-reducing" region elimination: a bounding phase, then an interval-reduction phase. Figure 10.6 Equal-interval search process. (a) Phase I: initial bracketing of minimum. (b) Phase II: reducing the interval of uncertainty.
Successive Equal-Interval Algorithm The "interval of uncertainty"
Successive Equal-Interval Search • Very robust • Works for continuous and discrete functions • But requires lots of f(x) evaluations!
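The two phases above can be sketched in plain Python. This is a minimal illustration of the successive equal-interval idea (the test function, interval, and 10× reduction factor are assumptions, not the textbook's exact algorithm):

```python
def equal_interval_search(f, a, b, delta, tol=1e-4):
    """Successive equal-interval search for a unimodal f on [a, b].
    Phase I: sample f every `delta` until f starts increasing, which
    brackets the minimum in the last two intervals.
    Phase II: repeat with a smaller delta on the reduced interval
    until the interval of uncertainty is below tol."""
    while (b - a) > tol:
        x = a
        f_prev = f(x)
        while x + delta <= b:
            f_next = f(x + delta)
            if f_next > f_prev:          # f turned upward: minimum bracketed
                break
            x += delta
            f_prev = f_next
        a, b = max(a, x - delta), min(b, x + delta)
        delta /= 10.0                    # finer sampling on the reduced interval
    return 0.5 * (a + b)

# Hypothetical example: minimize f(x) = (x - 2)^2 on [0, 5].
print(equal_interval_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0, 0.5))  # ~2.0
```

Note the cost: every pass re-samples the whole reduced interval, which is why the slide flags "lots of f(x) evaluations."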
Alternate equal interval Figure 10.7 Graphic of an alternate equal-interval solution process.
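Golden-section search, the last region-elimination method in the list, improves on equal-interval sampling by discarding a fixed fraction of the interval each iteration and reusing one interior function value, so only one new f-evaluation is needed per step. A sketch under the same illustrative assumptions as before:

```python
def golden_section_search(f, a, b, tol=1e-6):
    """Golden-section search for a unimodal f on [a, b]: keep two
    interior points placed by the golden ratio; each iteration drops
    one end segment and reuses one interior f value."""
    inv_phi = (5 ** 0.5 - 1) / 2           # ~0.618, golden-ratio conjugate
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while (b - a) > tol:
        if f1 < f2:                        # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)                     # only one new evaluation
        else:                              # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

# Hypothetical example: minimize f(x) = (x - 2)^2 on [0, 5].
print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))  # ~2.0
```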
Summary • Sensitivity analyses add value to your solutions • Sensitivity is as simple as Abc's • The constraint-variation sensitivity theorem can answer simple resource-limit questions • General optimization algorithms have two subproblems: search direction and step size • In a local neighborhood, assume unimodality! • The descent condition assures a correct direction • Step-size methods: analytical, region elimination