Linear Programming (Optimization)
Chapter 1. Introduction
• Mathematical programming (optimization) problem:

    min/max f(x)
    subject to g_i(x) ≤ 0, i = 1, …, m,
               ( h_j(x) = 0, j = 1, …, p )
               ( x ∈ S ⊆ R^n )
    f, g_i, h_j : R^n → R

• If f, g_i, h_j are linear (affine) functions → linear programming problem.
  If the solution set (or some of the variables) is restricted to integer points → integer programming problem.
  If f, g_i, h_j (or some of them) are nonlinear functions → nonlinear programming problem.
  If we minimize f(x), the functions f(x) and g_i(x) are convex, and there are no equality constraints → convex programming problem.
Linear Programming 2015
1
• Linear programming: the problem of optimizing (maximizing or minimizing) a linear (objective) function subject to linear inequality (and equality) constraints.

• General form:

    {max, min} c'x
    subject to a_i'x ≥ b_i, i ∈ M1
               a_i'x ≤ b_i, i ∈ M2
               a_i'x = b_i, i ∈ M3
               x_j ≥ 0, j ∈ N1,  x_j ≤ 0, j ∈ N2
    c, a_i, x ∈ R^n

  (There may be variables unrestricted in sign.)

• Inner product of two column vectors x, y ∈ R^n: x'y = Σ_{i=1}^n x_i y_i.
  If x'y = 0 with x, y ≠ 0, then x and y are said to be orthogonal. In 3-D, the angle between the two vectors is 90 degrees.
  (Vectors are column vectors unless specified otherwise.)
• The big difference from systems of linear equations is the existence of an objective function and of linear inequalities (instead of equalities).
• Much deeper theoretical results, and much wider applicability, than systems of linear equations.
• Terminology:
    x_1, x_2, …, x_n : (decision) variables
    b_i : right-hand side
    a_i'x { ≥, ≤, = } b_i : i-th constraint
    x_j { ≥, ≤ } 0 : nonnegativity (nonpositivity) constraint
    c'x : objective function
• Other terminology: feasible solution, feasible set (region), free (unrestricted) variable, optimal (feasible) solution, optimal cost, unbounded.
Important submatrix multiplications

• Interpretation of constraints: view them as submatrix multiplications.
  A : m × n matrix with rows a_1', …, a_m' and columns A_1, …, A_n:

        [ − a_1' − ]       [  |         |  ]
    A = [    ⋮     ]   =   [ A_1  ⋯  A_n ]
        [ − a_m' − ]       [  |         |  ]

  Ax = Σ_{j=1}^n A_j x_j = Σ_{i=1}^m (a_i'x) e_i, where e_i is the i-th unit vector.
  y'A = Σ_{i=1}^m y_i a_i' = Σ_{j=1}^n (y'A_j) e_j'.

  Denote the constraints as Ax { ≥, ≤, = } b.
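The two expansions above (Ax as a combination of the columns of A, and as row inner products stacked with unit vectors) can be checked numerically; a small sketch in numpy, with a made-up matrix and vectors:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # m = 2, n = 3
x = np.array([1.0, 0.0, 2.0])
y = np.array([1.0, -1.0])

# Ax as a linear combination of the columns A_j
col_comb = sum(A[:, j] * x[j] for j in range(A.shape[1]))

# Ax as the sum over rows: (a_i'x) e_i
row_comb = sum((A[i, :] @ x) * np.eye(A.shape[0])[i] for i in range(A.shape[0]))

print(np.allclose(A @ x, col_comb), np.allclose(A @ x, row_comb))

# y'A as a linear combination of the rows a_i'
row_side = sum(y[i] * A[i, :] for i in range(A.shape[0]))
print(np.allclose(y @ A, row_side))
```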
• Any LP can be expressed as min c'x, Ax ≥ b:
    max c'x → min (−c'x), then take the negative of the optimal cost
    a_i'x ≤ b_i → −a_i'x ≥ −b_i
    a_i'x = b_i → a_i'x ≥ b_i, −a_i'x ≥ −b_i
  Nonnegativities (nonpositivities) are special cases of inequalities, but they will be handled separately in the algorithms and in duality theory.
  The feasible solution set of an LP can always be expressed as Ax ≥ b (or Ax ≤ b); such a set is called a polyhedron: a set which can be described as the solution set of finitely many linear inequalities.
• We may sometimes use the max c'x, Ax ≤ b form (especially when we study polyhedra).
Standard form problems

• Standard form: min c'x, Ax = b, x ≥ 0
  Ax = Σ_{j=1}^n A_j x_j = Σ_{i=1}^m (a_i'x) e_i
  Two viewpoints:
  - Find the optimal (nonnegative) weights in a nonnegative linear combination of the columns of A that yields the vector b.
  - Find an optimal solution that satisfies the linear equations and nonnegativity.
• Reduction to standard form:
    free (unrestricted) variable x_j → x_j⁺ − x_j⁻, with x_j⁺, x_j⁻ ≥ 0
    Σ_j a_ij x_j ≤ b_i → Σ_j a_ij x_j + s_i = b_i, s_i ≥ 0 (slack variable)
    Σ_j a_ij x_j ≥ b_i → Σ_j a_ij x_j − s_i = b_i, s_i ≥ 0 (surplus variable)
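The reduction can be sketched with scipy (the small LP below is made up for illustration): the same problem, min c'x subject to Ax ≥ b with free variables, is solved once directly and once after splitting x = x⁺ − x⁻ and adding surplus variables; both give the same optimal cost.

```python
import numpy as np
from scipy.optimize import linprog

# min x1 + x2  s.t.  x1 + 2 x2 >= 3,  2 x1 + x2 >= 3,  x free
c = np.array([1.0, 1.0])
A = np.array([[1.0, 2.0], [2.0, 1.0]])
b = np.array([3.0, 3.0])

# General form: linprog takes A_ub x <= b_ub, so negate Ax >= b
res1 = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2)

# Standard form: variables (x+, x-, s), with A x+ - A x- - s = b, all >= 0
c_std = np.concatenate([c, -c, np.zeros(2)])
A_eq = np.hstack([A, -A, -np.eye(2)])
res2 = linprog(c_std, A_eq=A_eq, b_eq=b)   # default bounds are (0, None)

print(res1.fun, res2.fun)   # both 2.0, attained at x = (1, 1)
```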
• Any (practical) algorithm can solve the LP problem in equality form only (except for the nonnegativity constraints).
• A modified form of the simplex method can solve the problem with free variables directly (without using the difference of two variables). It gives a more sensible interpretation of the behavior of the algorithm.
1.2 Formulation examples

• See other examples in the text.
• Minimum cost network flow problem
  Directed network G = (N, A), (|N| = n)
  arc capacity u_ij, (i, j) ∈ A; unit flow cost c_ij, (i, j) ∈ A
  b_i : net supply at node i (b_i > 0: supply node, b_i < 0: demand node). (We may assume Σ_{i∈N} b_i = 0.)
  Find the minimum cost transportation plan that satisfies the supply and demand at each node and the arc capacities.
    minimize Σ_{(i,j)∈A} c_ij x_ij
    subject to Σ_{j:(i,j)∈A} x_ij − Σ_{j:(j,i)∈A} x_ji = b_i, i = 1, …, n
               (out-flow − in-flow = net flow at node i; flow conservation constraints)
               (some people use in-flow − out-flow = net flow)
               x_ij ≤ u_ij, (i, j) ∈ A
               x_ij ≥ 0, (i, j) ∈ A
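The formulation above can be sketched with scipy's linprog on a made-up three-node instance (all data hypothetical): the rows of A_eq are the flow-conservation constraints, and the arc capacities become variable upper bounds.

```python
from scipy.optimize import linprog

# Arcs of a 3-node network: (tail, head, cost, capacity)
arcs = [(0, 1, 1.0, 2.0), (0, 2, 3.0, 5.0), (1, 2, 1.0, 5.0)]
b = [4.0, 0.0, -4.0]          # net supply at each node (sums to 0)
n = len(b)

cost = [c for (_, _, c, _) in arcs]
# Node-arc incidence: +1 if the arc leaves node i, -1 if it enters node i
A_eq = [[(1.0 if t == i else 0.0) - (1.0 if h == i else 0.0)
         for (t, h, _, _) in arcs] for i in range(n)]
bounds = [(0.0, u) for (_, _, _, u) in arcs]

res = linprog(cost, A_eq=A_eq, b_eq=b, bounds=bounds)
print(res.x, res.fun)   # flows (2, 2, 2), total cost 10.0
```

Two units take the cheap path 0→1→2 until arc (0,1) hits its capacity; the remaining two units use the direct arc (0,2).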
• Choosing paths in a communication network ((fractional) multicommodity flow problem)
• Multicommodity flow problem: several commodities share the network. For each commodity, it is a min cost network flow problem, but the commodities must share the capacities of the arcs. A generalization of the min cost network flow problem, with many applications in communication and distribution/transportation systems.
  - Several-commodities case
  - Actually one commodity, but with multiple origin-destination pairs of nodes (telecom, logistics, …). Each origin-destination pair represents a commodity.
• Given a telecommunication network (directed) with arc set A, arc capacity u_ij bits/sec, (i, j) ∈ A, unit flow cost c_ij per bit, (i, j) ∈ A, and demand b^kl bits/sec for traffic from node k to node l.
  Data can be sent along more than one path.
  Find paths that route the demands with minimum cost.
Decision variables:
    x_ij^kl : amount of data with origin k and destination l that traverses link (i, j) ∈ A

Let b_i^kl = b^kl if i = k; −b^kl if i = l; 0 otherwise.

• Formulation (flow-based formulation)

    minimize Σ_{(i,j)∈A} Σ_k Σ_l c_ij x_ij^kl
    subject to Σ_{j:(i,j)∈A} x_ij^kl − Σ_{j:(j,i)∈A} x_ji^kl = b_i^kl, i, k, l = 1, …, n
               (out-flow − in-flow = net flow at node i for the
               commodity from node k to node l)
               Σ_k Σ_l x_ij^kl ≤ u_ij, (i, j) ∈ A
               (the sum over all commodities must not exceed the
               capacity of link (i, j))
               x_ij^kl ≥ 0, (i, j) ∈ A, k, l = 1, …, n
• Alternative formulation (path-based formulation)
  Let K : set of origin-destination pairs (commodities)
      b^k : demand of commodity k ∈ K
      P(k) : set of all possible paths for sending commodity k ∈ K
      P(k; e) : set of paths in P(k) that traverse arc e ∈ A
      E(p) : set of links contained in path p
  Decision variables:
      y_p^k : fraction of commodity k sent on path p

    minimize Σ_{k∈K} Σ_{p∈P(k)} w_p^k y_p^k
    subject to Σ_{p∈P(k)} y_p^k = 1, for all k ∈ K
               Σ_{k∈K} Σ_{p∈P(k;e)} b^k y_p^k ≤ u_e, for all e ∈ A
               0 ≤ y_p^k ≤ 1, for all p ∈ P(k), k ∈ K,
    where w_p^k = b^k Σ_{e∈E(p)} c_e.

• If y_p^k ∈ {0, 1}, it is a single-path routing problem (path selection problem, integer multicommodity flow problem).
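A minimal sketch of the path-based formulation with scipy, for one hypothetical commodity with demand 10 and two candidate paths (per-bit path costs 1 and 2, with bottleneck capacities 6 and 10; all numbers made up):

```python
from scipy.optimize import linprog

demand = 10.0
path_cost = [1.0, 2.0]                  # sum of c_e over E(p) for each path
w = [demand * pc for pc in path_cost]   # w_p^k = b^k * path cost

# One capacity constraint per arc; here each path has its own bottleneck arc
A_ub = [[demand, 0.0],                  # 10 * y1 <= 6
        [0.0, demand]]                  # 10 * y2 <= 10
b_ub = [6.0, 10.0]
A_eq = [[1.0, 1.0]]                     # path fractions sum to 1
b_eq = [1.0]

res = linprog(w, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * 2)
print(res.x, res.fun)   # y = (0.6, 0.4), cost 14.0
```

The cheap path carries as much as its bottleneck allows (60% of the demand); the remainder overflows onto the expensive path, which is exactly the fractional-routing behavior the formulation permits.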
• The path-based formulation has a smaller number of constraints but an enormous number of variables. It can be solved easily by the column generation technique (later). The integer version is more difficult to solve.
• Extensions: network design - also determine the number and type of facilities to be installed on the links (and/or nodes) together with the routing of traffic.
• Variations: integer flow; bifurcation of traffic may not be allowed; determine capacities and routing considering the rerouting of traffic in case of network failure; robust network design (data uncertainty); ...
• Pattern classification (linear classifier)
  Given m objects with feature vectors a_i ∈ R^n, i = 1, …, m.
  The objects belong to one of two classes, and we know the class to which each sample object belongs.
  We want to design a criterion that determines the class of a new object from its feature vector.
  Want to find a vector (x, x_{n+1}) ∈ R^{n+1} with x ∈ R^n such that, if i ∈ S, then a_i'x ≥ x_{n+1}, and if i ∉ S, then a_i'x < x_{n+1} (if this is possible).
• Find a feasible solution (x, x_{n+1}) that satisfies
    a_i'x ≥ x_{n+1}, i ∈ S
    a_i'x < x_{n+1}, i ∉ S
  for all sample objects i.
  Is this a linear programming problem?
  (There is no objective function, and there are strict inequalities in the constraints.)
• Is strict inequality allowed in LP?
  Consider min x, x > 0 → there is no minimum point; only the infimum of the objective value exists.
• If the system has a feasible solution (x, x_{n+1}), we can make the difference between the left-hand-side and right-hand-side values as large as we want by using the scaled solution M(x, x_{n+1}) for M > 0 large. Hence, if the system has a solution, there exists a solution that makes the difference at least 1.
  Remedy: use
    a_i'x ≥ x_{n+1}, i ∈ S
    a_i'x ≤ x_{n+1} − 1, i ∉ S
• An important problem in data mining, with applications in target marketing, bankruptcy prediction, medical diagnosis, process monitoring, …
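The remedy above is a pure feasibility problem, so it can be sketched as an LP with a zero objective (the 2-D sample points below are made up): find (x, x_{n+1}) with a_i'x ≥ x_{n+1} on class S and a_i'x ≤ x_{n+1} − 1 off it.

```python
import numpy as np
from scipy.optimize import linprog

S = np.array([[2.0, 2.0], [3.0, 1.0]])      # class S samples
T = np.array([[0.0, 0.0], [1.0, 0.0]])      # samples not in S

# Variables v = (x1, x2, x_{n+1}), all free.
# i in S:      -a_i'x + x_{n+1} <= 0
# i not in S:   a_i'x - x_{n+1} <= -1
A_ub = np.vstack([np.hstack([-S, np.ones((len(S), 1))]),
                  np.hstack([T, -np.ones((len(T), 1))])])
b_ub = np.concatenate([np.zeros(len(S)), -np.ones(len(T))])

res = linprog(np.zeros(3), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * 3)
x, t = res.x[:2], res.x[2]
print(res.status == 0,
      (S @ x >= t - 1e-9).all(),
      (T @ x <= t - 1 + 1e-9).all())
```

Any feasible point the solver returns is a valid separating hyperplane; which one comes back depends on the solver, since no objective distinguishes among them.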
• Variations
  - What if there are many choices of hyperplanes? Is there any reasonable criterion?
  - What if there is no hyperplane separating the two classes?
  - Do we have to use only one hyperplane?
  - Is the use of nonlinear functions possible? How do we solve the resulting problems?
    • SVM (support vector machine), convex optimization
  - More than two classes?
1.3 Piecewise linear convex objective functions

• Some problems involving nonlinear functions can be modeled as LPs.
• Def: A function f : R^n → R is called a convex function if for all x, y ∈ R^n and all λ ∈ [0, 1],
    f(λx + (1 − λ)y) ≤ λf(x) + (1 − λ)f(y).
  (The domain may be restricted.)
  f is called concave if −f is convex.
  (Picture: the line segment joining (x, f(x)) and (y, f(y)) in R^{n+1} is not below the graph of f.)
• Def: Let x, y ∈ R^n and λ1, λ2 ≥ 0 with λ1 + λ2 = 1. Then λ1 x + λ2 y is said to be a convex combination of x, y.
  More generally, Σ_{i=1}^k λ_i x^i, where Σ_{i=1}^k λ_i = 1 and λ_i ≥ 0, i = 1, …, k, is a convex combination of the points x^1, …, x^k.
• Def: A set S ⊆ R^n is convex if for any x, y ∈ S we have λ1 x + λ2 y ∈ S for any λ1, λ2 ≥ 0 with λ1 + λ2 = 1.
  Picture:
    λ1 x + λ2 y = λ1 x + (1 − λ1) y, 0 ≤ λ1 ≤ 1
                = y + λ1 (x − y), 0 ≤ λ1 ≤ 1
  (the line segment joining x and y lies in S; λ1 = 1 gives x, λ1 = 0 gives y)
• If we take λ1 x + λ2 y with λ1 + λ2 = 1 (without requiring λ1, λ2 ≥ 0), it is called an affine combination of x and y.
  Picture:
    λ1 x + λ2 y = λ1 x + (1 − λ1) y = y + λ1 (x − y), λ1 arbitrary
  (the line passing through the points x and y)
(Picture of a convex function: for points (x, f(x)) and (y, f(y)) on the graph in R^{n+1}, the chord point (λx + (1 − λ)y, λf(x) + (1 − λ)f(y)) lies on or above the graph value f(λx + (1 − λ)y).)
• Relation between convex functions and convex sets
• Def: For f : R^n → R, define the epigraph of f as epi(f) = { (x, α) ∈ R^{n+1} : α ≥ f(x) }.
• The previous definition of a convex function is equivalent to epi(f) being a convex set. When dealing with convex functions, we frequently consider epi(f) to exploit the properties of convex sets.
• Consider operations on functions that preserve convexity, and operations on sets that preserve convexity.
• Example: Consider f(x) = max_{i=1,…,m} (c_i'x + d_i), c_i ∈ R^n, d_i ∈ R
  (a maximum of affine functions, called a piecewise linear convex function).
  (Picture: the graph of f is the upper envelope of the lines c_1'x + d_1, c_2'x + d_2, c_3'x + d_3.)
• Thm: Let f_1, …, f_m : R^n → R be convex functions. Then f(x) = max_{i=1,…,m} f_i(x) is also convex.
  pf) f(λx + (1 − λ)y) = max_{i=1,…,m} f_i(λx + (1 − λ)y)
                       ≤ max_{i=1,…,m} ( λf_i(x) + (1 − λ)f_i(y) )
                       ≤ max_{i=1,…,m} λf_i(x) + max_{i=1,…,m} (1 − λ)f_i(y)
                       = λf(x) + (1 − λ)f(y). ∎
• Min of piecewise linear convex functions:

    minimize max_{i=1,…,m} (c_i'x + d_i)
    subject to Ax ≥ b

  is equivalent to

    minimize z
    subject to z ≥ c_i'x + d_i, i = 1, …, m
               Ax ≥ b
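The epigraph reformulation can be sketched with scipy on a made-up 1-D instance: minimize max(x, −x) = |x| subject to x ≥ 1, using the variables (x, z).

```python
from scipy.optimize import linprog

# Pieces c_i x + d_i: (1)x + 0 and (-1)x + 0, i.e. f(x) = |x|
pieces = [(1.0, 0.0), (-1.0, 0.0)]

# Variables (x, z): minimize z  s.t.  c_i x - z <= -d_i,  -x <= -1
c = [0.0, 1.0]
A_ub = [[ci, -1.0] for (ci, _) in pieces] + [[-1.0, 0.0]]
b_ub = [-di for (_, di) in pieces] + [-1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
print(res.x, res.fun)   # x = 1, z = 1: min |x| over x >= 1 is 1
```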
• Q: What can we do about finding the maximum of a piecewise linear convex function?
  What about the maximum of a piecewise linear concave function (which can be expressed as the minimum of affine functions)?
  The minimum of a piecewise linear concave function?
• A convex function has the nice property that a local minimum point is a global minimum point (when the domain is R^n or a convex set). (HW later)
  Hence finding the minimum of a convex function over a convex set is usually easy, but finding the maximum of a convex function is difficult: basically, we need to examine all local maximum points.
  Similarly, finding the maximum of a concave function is easy, but finding the minimum of a concave function is difficult.
• Suppose we have f(x) ≤ h in the constraints, where f(x) is a piecewise linear convex function f(x) = max_{i=1,…,m} (c_i'x + d_i).
  → c_i'x + d_i ≤ h, i = 1, …, m
  Q: What about the constraint f(x) ≥ h? Can it be modeled as an LP?
• Def: Let f : R^n → R be a convex function and α ∈ R. The set C = {x : f(x) ≤ α} is called a level set of f.
• The level set of a convex function is a convex set. (HW)
  The solution set of an LP is convex (easy) → a non-convex solution set cannot be modeled as an LP.
Problems involving absolute values

• minimize Σ_{i=1}^n c_i |x_i|
  subject to Ax ≥ b
  (assume c_i ≥ 0)

  More direct formulations than the piecewise linear convex function approach are possible:

  (1) min Σ_{i=1}^n c_i z_i
      subject to Ax ≥ b
                 x_i ≤ z_i, i = 1, …, n
                 −x_i ≤ z_i, i = 1, …, n

  (2) min Σ_{i=1}^n c_i (x_i⁺ + x_i⁻)
      subject to Ax⁺ − Ax⁻ ≥ b
                 x⁺, x⁻ ≥ 0
      (We want x_i⁺ = x_i if x_i ≥ 0, x_i⁻ = −x_i if x_i < 0, and x_i⁺ x_i⁻ = 0, i.e., at most one of x_i⁺, x_i⁻ is positive in an optimal solution; c_i ≥ 0 guarantees that.)
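Formulation (1) can be sketched with scipy on a made-up instance: minimize |x_1| + |x_2| subject to x_1 + x_2 ≥ 1.

```python
from scipy.optimize import linprog

# Variables (x1, x2, z1, z2): minimize z1 + z2
c = [0.0, 0.0, 1.0, 1.0]
A_ub = [[-1.0, -1.0,  0.0,  0.0],   # -(x1 + x2) <= -1
        [ 1.0,  0.0, -1.0,  0.0],   #  x1 - z1 <= 0
        [-1.0,  0.0, -1.0,  0.0],   # -x1 - z1 <= 0
        [ 0.0,  1.0,  0.0, -1.0],   #  x2 - z2 <= 0
        [ 0.0, -1.0,  0.0, -1.0]]   # -x2 - z2 <= 0
b_ub = [-1.0, 0.0, 0.0, 0.0, 0.0]
bounds = [(None, None)] * 2 + [(0.0, None)] * 2

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.fun)   # 1.0: any point with x1 + x2 = 1, x >= 0 is optimal
```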
Data Fitting

• Regression analysis using the absolute value function
  Given m data points (a_i, b_i), i = 1, …, m, with a_i ∈ R^n, b_i ∈ R.
  We want to find x ∈ R^n that predicts the result b from a via the function b = a'x.
  Want x that minimizes the maximum prediction error |b_i − a_i'x| over all i:

    minimize z
    subject to b_i − a_i'x ≤ z, i = 1, …, m
               −b_i + a_i'x ≤ z, i = 1, …, m

  (We want an approximate solution x of a_i'x = b_i, i = 1, …, m (Ax = b):
   min_x max_i |b_i − a_i'x|.)
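The minimax fit can be sketched with scipy. The data below are made up and exactly collinear (b = 2a + 1), so the optimal maximum error should be 0 and the fit should recover slope 2 and intercept 1.

```python
import numpy as np
from scipy.optimize import linprog

a = np.array([0.0, 1.0, 2.0])
b = 2.0 * a + 1.0                           # exactly linear data
A = np.column_stack([a, np.ones_like(a)])   # features (a_i, 1): slope, intercept

m, n = A.shape
# Variables (x, z): minimize z  s.t.  -A x - z <= -b,  A x - z <= b
c = np.zeros(n + 1)
c[-1] = 1.0
A_ub = np.vstack([np.hstack([-A, -np.ones((m, 1))]),
                  np.hstack([ A, -np.ones((m, 1))])])
b_ub = np.concatenate([-b, b])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * (n + 1))
print(res.x[:n], res.fun)   # x close to (2, 1), maximum error close to 0
```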
• Alternative criterion:

    minimize Σ_{i=1}^m |b_i − a_i'x|

  is modeled as

    minimize z_1 + … + z_m
    subject to b_i − a_i'x ≤ z_i, i = 1, …, m
               −b_i + a_i'x ≤ z_i, i = 1, …, m

• L_p norm of x ∈ R^n: ||x||_p ≡ ( Σ_{i=1}^n |x_i|^p )^{1/p}
• A quadratic error function cannot be modeled as an LP, but it can be handled by calculus methods (closed-form solution).
• Special case of a piecewise linear objective function: the separable piecewise linear objective function.
  A function f : R^n → R is called separable if f(x) = f_1(x_1) + f_2(x_2) + … + f_n(x_n).
  (Figure: a convex piecewise linear f_i(x_i) with breakpoints a_1 < a_2 < a_3 on the x_i-axis and slopes c_1 < c_2 < c_3 < c_4 on the successive segments x_{1i}, x_{2i}, x_{3i}, x_{4i}; an approximation of a nonlinear function.)
• Express the variable x_i in the constraints as x_i ≡ x_{1i} + x_{2i} + x_{3i} + x_{4i}, where
    0 ≤ x_{1i} ≤ a_1, 0 ≤ x_{2i} ≤ a_2 − a_1, 0 ≤ x_{3i} ≤ a_3 − a_2, 0 ≤ x_{4i}.
  In the objective function, use
    min c_1 x_{1i} + c_2 x_{2i} + c_3 x_{3i} + c_4 x_{4i}.
  Since we solve a min problem and the slopes are increasing, it is guaranteed that x_{ki} > 0 in an optimal solution implies that the x_{ji}, j < k, are at their upper bounds.
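A minimal sketch with scipy (made-up data): a convex piecewise linear cost with slope 1 up to the breakpoint a_1 = 1 and slope 2 beyond, minimized subject to x ≥ 1.5. Because the slopes increase, the cheap segment variable fills up to its upper bound first.

```python
from scipy.optimize import linprog

# x = x1 + x2 with 0 <= x1 <= 1 (slope 1) and 0 <= x2 (slope 2)
c = [1.0, 2.0]
A_ub = [[-1.0, -1.0]]        # -(x1 + x2) <= -1.5, i.e. x >= 1.5
b_ub = [-1.5]
bounds = [(0.0, 1.0), (0.0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)   # x1 = 1.0 (at its upper bound), x2 = 0.5, cost 2.0
```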
1.4 Graphical representation and solution

• Let a ∈ R^n, b ∈ R.
  Geometric intuition for the solution sets of:
    {x : a'x = 0}, {x : a'x ≤ 0}, {x : a'x ≥ 0},
    {x : a'x = b}, {x : a'x ≤ b}, {x : a'x ≥ b}
• Geometry in 2-D
  (Figure: the vector a drawn from the origin is normal to the line {x : a'x = 0}; the halfplane {x : a'x ≥ 0} lies on the side of the line that a points into, and {x : a'x ≤ 0} on the opposite side.)
• Let z be a (any) point satisfying a'x = b. Then
    {x : a'x = b} = {x : a'x = a'z} = {x : a'(x − z) = 0}.
  Hence x − z = y, where y is any solution of a'y = 0, i.e., x = y + z: the hyperplane {x : a'x = b} is {x : a'x = 0} translated by z.
  Similarly for {x : a'x ≤ b} and {x : a'x ≥ b}.
  (Figure: the parallel lines {x : a'x = 0} through the origin and {x : a'x = b} through z, both normal to a; {x : a'x ≥ b} lies on the side of the latter that a points into, {x : a'x ≤ b} on the other side.)
• min c_1 x_1 + c_2 x_2
  s.t. −x_1 + x_2 ≤ 1, x_1 ≥ 0, x_2 ≥ 0
  (Figure: the feasible region in the (x_1, x_2)-plane, the candidate objective vectors c = (1, 0), c = (−1, −1), c = (1, 1), c = (0, 1), and the level lines {x : x_1 + x_2 = 0} and {x : x_1 + x_2 = z}.)
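A sketch with scipy of the example above, solving the same feasible region for each of the four objective vectors in the figure; for c = (−1, −1) the problem is unbounded (scipy's HiGHS-based linprog reports this as status 3):

```python
from scipy.optimize import linprog

A_ub = [[-1.0, 1.0]]   # -x1 + x2 <= 1
b_ub = [1.0]           # x1 >= 0, x2 >= 0 are linprog's default bounds

results = {}
for c in [(1.0, 0.0), (-1.0, -1.0), (1.0, 1.0), (0.0, 1.0)]:
    results[c] = linprog(list(c), A_ub=A_ub, b_ub=b_ub)
    if results[c].status == 3:
        print(c, "unbounded")
    else:
        print(c, "optimal cost", results[c].fun)
```

The three bounded objectives all attain their minimum 0 at the origin; moving in the direction −c = (1, 1) never leaves the feasible region, which is why c = (−1, −1) is unbounded.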
• Representing a complex solution set in 2-D:
  {n variables, m equations (the row vectors of the A matrix in Ax = b are linearly independent), nonnegativity, and n − m = 2}
  (Figure: the feasible set drawn inside the two-dimensional affine set {x : Ax = b}, with the lines x_1 = 0, x_2 = 0, x_3 = 0 as its boundaries.)
• See text sections 1.5 and 1.6 for more background.