Lecture 5, Chapter 3: Improving Search
Local Improvement
Hill Climbing
Local Search
Neighborhood Search
Def 3.3 P 83 Improving searches are algorithms that
begin at a feasible point and move along a search
path of feasible points with improving objective
value.
Hill Climbing
[Figure: contour plot of a feasible region with contours at 100, 200, 300, and 400, and the maximum marked.]
Neighborhood
The neighborhood of a point is all nearby points. (I haven't defined "nearby" yet.)
Local & Global Optimum
Def 3.5 P 85: A local optimum is a point that is feasible and, within some small neighborhood of it, no feasible point is better.
Def 3.7 P 85: A global optimum is a point that is feasible and no other feasible point is better.
Example
Point   Elevation
X       10K
Y       15.3K
Z       12.8K

[Figure: the points X, Y, Z, and W marked on an elevation profile.]

Any local optima?
Any global optima?
What is W?
Improving Search
Def 3.12 P 88: Improving searches move from point X1 to point X2 by taking a step of size λ in the direction ΔX:
X2 = X1 + λ(ΔX)
where λ is the step size and ΔX is the direction.
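For instance (numbers chosen purely for illustration, not from the lecture), a step from X1 = (1,1) with step size λ = 0.5 in direction ΔX = (2,0) gives

X2 = X1 + λ(ΔX) = (1,1) + 0.5·(2,0) = (1,1) + (1,0) = (2,1).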
Simplex Algorithm
[Figure: simplex path along the boundary of the feasible region, visiting extreme points 1, 2, 3, 4 in order; the objective value improves at each step.]
Interior Point Algorithm
[Figure: interior point path through the interior of the feasible region, visiting points 1, 2, 3, 4 in order; the objective value improves at each step.]
Improving Direction
Def 3.13 P 90: An improving direction is one in which the objective function improves for every sufficiently small step size.

[Figure: contour plot with points A, B, and C marked.]

What are the improving directions at A, B, and C?
Feasible Direction
A feasible direction is one for which, for every sufficiently small step size λ, the new point X + λd is feasible.

[Figure: point A on the boundary of the feasible region.]

At the point A, is the direction (0,1) feasible? Is (1,0) feasible?
Algorithm 3A: Continuous Improving Search
0. Initialization. Let X0 be a feasible point and set t to 0.
1. Check for local optimum. If no improving feasible direction exists at Xt, then stop with Xt as a local optimum.
2. Move direction. Let d be an improving feasible direction at Xt.
3. Step size. Let s be the largest possible step size that still leads to a better feasible solution.
4. Make your move. Set Xt+1 = Xt + (s)(d), set t to t+1, and go to 1.
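A minimal Python sketch of Algorithm 3A for unconstrained minimization (all names here are mine). It uses d = -G(x) as the improving direction, per Def 3.23 later in the lecture, and a simple step-halving rule instead of the "largest possible step size" of step 3:

```python
import numpy as np

def improving_search(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of Algorithm 3A for unconstrained minimization."""
    x = np.asarray(x0, dtype=float)     # 0. initialization: X0, t = 0
    for _ in range(max_iter):
        d = -grad(x)                    # 2. improving direction (Def 3.23)
        if np.linalg.norm(d) < tol:     # 1. no improving direction exists:
            break                       #    stop, x is a local optimum
        s = 1.0                         # 3. step size: halve until the
        while f(x + s * d) >= f(x):     #    move actually improves f
            s *= 0.5
            if s < 1e-12:
                return x
        x = x + s * d                   # 4. make the move: X(t+1) = Xt + s*d
    return x

# Example: minimize f(x,y) = (x-2)^2 + (y-2)^2 starting from (0,0)
f = lambda v: (v[0] - 2)**2 + (v[1] - 2)**2
grad = lambda v: np.array([2 * (v[0] - 2), 2 * (v[1] - 2)])
print(improving_search(f, grad, [0.0, 0.0]))   # converges to (2, 2)
```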
Gradient
Def 3.19 P 99 The gradient of a function evaluated
at a point is the vector of partial derivatives
evaluated at the point.
Def 3.20 P 101 The gradient of a function evaluated
at a point is the direction of maximum increase.
Let f(X,Y) = X + 2Y. The gradient of f at the point (4,0) is [1,2]. Why? Because ∂f/∂X = 1 and ∂f/∂Y = 2 at every point.
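A quick numerical sanity check of Def 3.19 (a generic finite-difference sketch; the function and names are illustrative):

```python
import numpy as np

def numeric_gradient(f, point, h=1e-6):
    """Approximate the vector of partial derivatives (Def 3.19)."""
    point = np.asarray(point, dtype=float)
    g = np.zeros_like(point)
    for i in range(len(point)):
        step = np.zeros_like(point)
        step[i] = h
        g[i] = (f(point + step) - f(point - step)) / (2 * h)
    return g

f = lambda v: v[0] + 2 * v[1]           # f(X, Y) = X + 2Y
print(numeric_gradient(f, [4.0, 0.0]))  # approximately [1. 2.]
```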
Linear Functions
For linear functions the gradient is constant at all
points.
f(X,Y) = 50X+75Y
What is the direction of max increase? (50,75), or any positive multiple of it: (10,15), (2,3), and (1,1.5) all point the same way.
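A quick check (illustrative sketch) that those vectors all name the same direction, since normalizing each gives the same unit vector:

```python
import numpy as np

for v in ([50, 75], [10, 15], [2, 3], [1, 1.5]):
    v = np.asarray(v, dtype=float)
    print(v / np.linalg.norm(v))   # same unit vector every time: [0.5547 0.8321]
```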
Nonlinear Function
For f(X,Y) = (X-2)² + (Y-4)²:
The contours of f are circles centered at (2,4); f increases the farther you move from the center.
The gradient of f is [2(X-2), 2(Y-4)], which evaluated at (4,4) is [4,0].
Draw it.
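One way to "draw it" with matplotlib (the plot range and contour levels are arbitrary choices of mine):

```python
import numpy as np
import matplotlib.pyplot as plt

# Contours of f(X,Y) = (X-2)^2 + (Y-4)^2 are circles centered at (2, 4)
X, Y = np.meshgrid(np.linspace(-1, 7, 200), np.linspace(0, 8, 200))
plt.contour(X, Y, (X - 2)**2 + (Y - 4)**2, levels=[1, 4, 9, 16])

# Gradient at (4, 4) is [4, 0]: an arrow pointing straight away from the center
plt.quiver(4, 4, 4, 0, angles='xy', scale_units='xy', scale=1, color='red')
plt.plot(2, 4, marker='*')           # the center (2, 4)
plt.gca().set_aspect('equal')
plt.show()
```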
Important Result
Def 3.21 P 102 Let G(z) (1xn) be the gradient of f(x)
evaluated at the point z. Let d (nx1) be a direction.
G(z)d < 0 implies f decreases in the direction d
G(z)d = 0 implies f stays the same (to first order) in the direction d
G(z)d > 0 implies f increases in the direction d
G(z)d is (1xn)(nx1) = 1x1 (a scalar)
This lets us determine whether a direction is an improving direction.
Example: f(x,y) = (x-2)² + (y-2)²
G(x,y) = [2(x-2), 2(y-2)], so G(0,0) = [-4,-4]

d         G(0,0)·d   Change in f
[1,1]     -8         Decreases
[1,0]     -4         Decreases
[1,-1]     0         No change
[-1,-1]    8         Increases

[Figure: contours of f around the minimum (2,2), where f(2,2) = 0; at the starting point, f(0,0) = 8.]
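A short sketch reproducing the table's sign test (variable names are my own):

```python
import numpy as np

grad = lambda v: np.array([2 * (v[0] - 2), 2 * (v[1] - 2)])
G = grad([0, 0])                              # [-4, -4]

for d in ([1, 1], [1, 0], [1, -1], [-1, -1]):
    s = G @ np.array(d)                       # the scalar G(z)d from Def 3.21
    verdict = "decreases" if s < 0 else "increases" if s > 0 else "no change"
    print(d, s, verdict)
```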
Improving Directions
Def 3.23 P 104: Let G(z) (1xn) be the gradient of f(x) at the point z. If G(z) ≠ 0, then d = G(z)ᵀ (nx1) is an improving direction for the maximization problem, and -G(z)ᵀ is an improving direction for the minimization problem. (Why: G(z)·G(z)ᵀ = ‖G(z)‖² > 0, so by Def 3.21 f increases in that direction.)
Example: f(x,y) = (x-2)² + (y-2)², so G(z) = [2(x-2), 2(y-2)]

z       G(z)
[0,0]   [-4,-4]
[2,0]   [0,-4]
[0,2]   [-4,0]
[3,3]   [2,2]
[3,0]   [2,-4]

[Figure: gradient directions at the points (0,0), (0,2), and (2,0), plotted around the minimum at (2,2).]
Feasible Directions
The question:
Is a given direction d feasible at the point z?
To answer this we must examine the constraints:
Three terms mean the same thing: a constraint can be
active at z
tight at z
binding at z
Equality constraints are always tight!
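A tiny sketch of detecting which inequality constraints are tight at a point, using the example constraints from the next slides written as rows of a @ x ≥ b (the encoding is my own):

```python
import numpy as np

# x - y >= 0, x >= 0, y >= 0, each as a row a with right-hand side b
A = np.array([[1, -1], [1, 0], [0, 1]])
b = np.array([0, 0, 0])

def binding_at(z, tol=1e-9):
    """Indices of constraints with a @ z == b (active / tight / binding)."""
    return [i for i in range(len(A)) if abs(A[i] @ z - b[i]) <= tol]

print(binding_at(np.array([0.0, 0.0])))   # [0, 1, 2]: all three are binding
```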
What are the feasible directions at z = [0,0]?
Minimize f(x,y) = (x-2)² + (y-2)²
Subject to x - y ≥ 0, x ≥ 0, 0 ≤ y ≤ 4

Direction   Feasible (Y/N)?
[0,1]
[-1,-1]
[1,0]
[1,1]
What are the feasible directions at z = [0,0]?
Minimize f(x,y) = (x-2)² + (y-2)²
Subject to x - y ≥ 0, x ≥ 0, y ≥ 0

Direction   Feasible (Y/N)?
[0,1]       No
[-1,-1]     No
[1,0]       Yes
[1,1]       Yes
Binding Constraints
At z = [0,0], the constraints x - y ≥ 0, x ≥ 0, and y ≥ 0 are all binding.
If d = [d1,d2] is a feasible direction at [0,0], then there must be some λ > 0 such that the point [0,0] + λ[d1,d2] satisfies all binding constraints:
1st constraint: λ(d1 - d2) ≥ 0, i.e. d1 - d2 ≥ 0
2nd constraint: λd1 ≥ 0, i.e. d1 ≥ 0
3rd constraint: λd2 ≥ 0, i.e. d2 ≥ 0
Feasible Directions
The conditions: d1 - d2 ≥ 0 (1st constraint), d1 ≥ 0 (2nd constraint), d2 ≥ 0 (3rd constraint).
Try d = [1,1]: 0 ≥ 0, λ ≥ 0, λ ≥ 0; all hold for any λ, so λ can be infinite. Feasible.
Try d = [1,0]: λ ≥ 0, λ ≥ 0, 0 ≥ 0; λ can be infinite. Feasible.
Try d = [0,1]: -λ ≥ 0, 0 ≥ 0, λ ≥ 0; this forces λ = 0, hence the direction is not feasible.
Try d = [1,-1]: 2λ ≥ 0, λ ≥ 0, -λ ≥ 0; this forces λ = 0, hence the direction is not feasible.
Try d = [-1,0]: -λ ≥ 0, -λ ≥ 0, 0 ≥ 0; this forces λ = 0, hence the direction is not feasible.
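This check can be mechanized: for linear ≥ constraints, d is feasible at [0,0] exactly when a·d ≥ 0 for every binding constraint row a. A sketch under that assumption:

```python
import numpy as np

# Binding constraints at z = [0, 0], each as a row a of a @ d >= 0:
# x - y >= 0, x >= 0, y >= 0
binding = np.array([[1, -1], [1, 0], [0, 1]])

def is_feasible_direction(d):
    """Some step λ > 0 stays feasible iff a @ d >= 0 for every binding row."""
    return bool(np.all(binding @ np.asarray(d) >= 0))

for d in ([1, 1], [1, 0], [0, 1], [1, -1], [-1, 0]):
    print(d, is_feasible_direction(d))   # True, True, False, False, False
```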
To Obtain Local Optimum
Improving search needs improving feasible directions; it stops when none remains, and that stopping point is a local optimum.
Tractable models (the best models for applications) are those where a local optimum is a global optimum (see page 109).
Def 3.26 (Unimodal objective function): whenever x1 and x2 are feasible and f(x2) is better than f(x1), the direction d = x2 - x1 is an improving direction at x1.
Unimodal
[Figure: a unimodal function and a function that is not unimodal.]
Linear Functions Are Unimodal
Def 3.28 P 112 If the objective function is unimodal,
then every unconstrained local optimum is an
unconstrained global optimum.
Convex Sets
Def 3.29: A set is convex if, whenever x and y are in the set, all points on the line segment between x and y are also in the set.

[Figure: two convex sets and two non-convex sets.]
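In symbols (a standard formalization, not verbatim from the slides): a set S is convex iff

$$\lambda x + (1-\lambda)y \in S \quad \text{for all } x, y \in S \text{ and all } \lambda \in [0,1].$$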
Discrete Sets
Def 3.30: Discrete sets (with more than one point) are not convex.
Linear Constraints
Def 3.32 If all constraints are linear, then the
feasible set of points is a convex set.
[Figure: a polyhedral feasible region formed by linear constraints; it is a convex set.]
Local Optimum = Global Optimum
Def 3.34: If the objective function is unimodal and the constraint set is convex, then every local optimum is a global optimum.
Improving search works for this type of problem.