Transcript slides
A Faster Algorithm for Linear Programming
and the Maximum Flow Problem I
Yin Tat Lee
(MIT, Simons)
Joint work with Aaron Sidford
THE PROBLEM
Linear Programming
Consider the linear program (LP)
  min_{Ax ≥ b} c^T x
where A is an m × n matrix.
• m is the number of constraints.
• n is the number of variables.
[Figures: a polytope with n = 2, m = 6; a smooth set with n = 2, m = ∞]
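As a concrete instance of this form (an assumed toy example, not from the talk), a small LP with m = 3 constraints and n = 2 variables can be checked with SciPy's `linprog`. Note that `linprog` expects upper-bound constraints and defaults to x ≥ 0, so Ax ≥ b is passed as −Ax ≤ −b with free variable bounds:

```python
import numpy as np
from scipy.optimize import linprog

# min c^T x  subject to  Ax >= b:
#   x1 >= 0,  x2 >= 0,  x1 + x2 >= 1   (optimal value 1 on the segment x1 + x2 = 1)
c = np.array([1.0, 1.0])
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])

# linprog uses A_ub x <= b_ub; flip signs, and free the variables so that
# only Ax >= b constrains the problem.
res = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2, method="highs")
print(res.fun)  # optimal value: 1.0
```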
Previous Results
• All of them are iterative methods:
  – Start with some initial point x.
  – While x is not optimal: improve x.
• Time = (# of iters) × (cost per iter)
• This talk focuses on the # of iters.
We call a step efficient if it
• takes polynomial time, and
• doesn't use an LP solver.
m = # of constraints, n = # of variables
(log(ε^{-1}) factors are omitted)

Previous Results (Selected)

Year | Author                | # of iter  | Cost per iter        | Efficient steps
1947 | Dantzig               | 2^n        | Pivot                | Yes
1979 | Khachiyan             | n^2        | Update the ellipsoid | Yes
1984 | Karmarkar             | m          | Solve linear systems | Yes
1986 | Renegar               | √m         | Solve linear systems | Yes
1989 | Vaidya                | (mn)^{1/4} | Matrix inverse       | Yes
1994 | Nesterov, Nemirovskii | √n         | Compute volume       | No
2013 | Lee, Sidford          | √n         | Solve linear systems | Yes

Remark: In 2013, Mądry showed how to obtain m^{3/7} iters for certain LPs!
Outline
LP AND CENTER
A general framework
We can solve a linear program by maintaining a center:
• Somehow, get a "center" first.
• Put the cost constraint there and move it.
• Say we can move an ε portion closer each time.
• After O(ε^{-1}) steps, we are done.
Why center?
What if we don't try to maintain a center?
• It is just like the simplex method:
  "It is good now." "Still good." … "Oh, it touches. What to do?"
• Avoid bad decisions by using global information!
A general framework
Formally, we have (say OPT = 0):
• t = 2^2014. Find the center of {c^T x ≤ t, Ax ≥ b}.
• While t is large:
  – t ← t(1 − ε) for some fixed ε > 0
  – Update the center of {c^T x ≤ t, Ax ≥ b}
This is called an interior point method.
Finding the initial point is easy:
  min_{a_i^T x + s ≥ b_i, s ≥ 0} 2^2014 · s + c^T x
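A minimal numerical sketch of this loop, under assumptions not in the talk: the polytope is the unit box (so the huge constant 2^2014 can be replaced by a small finite starting t), the moving constraint c^T x ≤ t is folded into the barrier as one more logarithmic term, and centering is done by a few damped Newton steps:

```python
import numpy as np

# Polytope Ax >= b: the unit box 0 <= x <= 1 (m = 4 constraints, n = 2).
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([0.0, 0.0, -1.0, -1.0])
c = np.array([1.0, 1.0])  # optimum of min c^T x is 0, attained at the corner (0,0)

def newton_center(x, t, steps=5):
    """Approximately minimize -ln(t - c^T x) - sum_i ln((Ax - b)_i)."""
    for _ in range(steps):
        gap = t - c @ x
        s = A @ x - b
        g = c / gap - A.T @ (1.0 / s)                      # gradient
        H = np.outer(c, c) / gap**2 + A.T @ np.diag(1.0 / s**2) @ A  # Hessian
        dx = np.linalg.solve(H, g)
        step = 1.0
        # Backtrack so the iterate stays strictly feasible.
        while (t - c @ (x - step * dx) <= 0) or np.any(A @ (x - step * dx) - b <= 0):
            step /= 2
        x = x - step * dx
    return x

x = np.array([0.5, 0.5])   # strictly feasible start, c^T x = 1 < t
t = 4.0
for _ in range(300):       # t <- t(1 - eps), then update the center
    t *= 1 - 0.05
    x = newton_center(x, t)
print(c @ x)               # driven toward the optimal value 0
```

With ε = 0.05 the shrink per step is deliberately conservative; the √m analysis later in the talk is exactly about how large ε may be while Newton re-centering still works in O(1) steps.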
A general way to define a center
Let p be a smooth convex function on Ω such that
• p(x) → +∞ as x → ∂Ω.
For example, the standard log barrier:
  p_s(x) = −Σ_i ln((Ax − b)_i) = −Σ_i ln(s_i)
Center = argmin_x p(x)
[Figure: a barrier function blowing up at the boundary]
QUALITY OF A CENTER
Rounding
• Assume the center is induced by some barrier function p.
• Look at the ellipsoid E induced by p at the center x.
• Call E an R-rounding if E ⊆ Ω ⊆ R·E for some R.

Self-concordant barrier
• p is a ν-self-concordant barrier function for Ω if
  – p is smooth, and
  – p gives a √ν rounding.
[Figures: p is not smooth enough; bad rounding]
Rounding Algorithm
For a general barrier function p:
• Repeat:
  – Tighten the cost constraint.
  – Maintain the rounding ellipsoid induced by p.
Why √m iterations?
Think p(x) = −Σ_i ln(s_i).
• Newton's method (using smoothness):
  Given p(y) − min_x p(x) < 0.5, we can find the center in O(1) steps.

Why √m iterations?
Let y be the old center. Using the smoothness, we have
  p_new(y) − min_x p_new(x) ≤ √m · (t − t_new)/(t − c^T y) ≤ 1/2.
So, we need
  (t − t_new)/(t − c^T y) ≤ 1/(2√m).
It takes O(√m) iters.
Why √ν iterations?
• We can reduce the gap by a 1/√ν factor each step.
Roughly speaking:
  Smoothness + √ν rounding gives a √ν · log(ε^{-1})-iteration LP solver.
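The arithmetic behind this count, spelled out: shrinking the gap by a (1 − 1/√ν) factor per step until it is below ε requires

```latex
\left(1-\tfrac{1}{\sqrt{\nu}}\right)^{k}\le \varepsilon
\quad\Longleftrightarrow\quad
k \;\ge\; \frac{\ln(1/\varepsilon)}{-\ln\!\left(1-1/\sqrt{\nu}\right)}
\;\approx\; \sqrt{\nu}\,\ln(1/\varepsilon),
```

using −ln(1 − z) ≈ z for small z.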
The quality of the analytic center can be arbitrarily bad in n!
• Recall the standard log barrier function
  p_s(x) = −Σ_{i=1}^m ln((Ax − b)_i) = −Σ_{i=1}^m ln(s_i).
• The center x = argmin_y p_s(y) is called the analytic center.
Is it tight?
• In practice, it takes ≤ 60 steps.
• Mizuno, Todd and Ye showed it is "usually" correct on the first step.
• In 2014, Mut and Terlaky showed an example that really takes
  Ω(√m log(ε^{-1})) iterations, where m is exponential in n.
UNIVERSAL BARRIER FUNCTION
Universal Barrier Function
Theorem [NN94]: For any convex set Ω ⊆ R^n,
  p(x) = −log vol((Ω − x)°)
is an O(n)-self-concordant barrier function.
A "smaller" set has a larger polar. Hence, p → ∞ as x → ∂Ω.
Note that ∇²p(x) ~ the second moment matrix of (Ω − x)°.
Kannan–Lovász–Simonovits Lemma: For any convex set Ω, the second moment matrix
  M(Ω) = ∫_Ω x x^T dx
gives an O(n) rounding of Ω.
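For a concrete instance of the second-moment matrix (an assumed example, not from the talk): for the square Ω = [−1, 1]², M(Ω) = ∫_Ω x xᵀ dx = diag(4/3, 4/3), which a Monte Carlo estimate reproduces:

```python
import numpy as np

# Second moment matrix M(Omega) = integral over Omega of x x^T dx,
# estimated for the square Omega = [-1,1]^2 (area 4) by uniform sampling.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(200_000, 2))
M_hat = 4.0 * (pts.T @ pts) / len(pts)      # area * E[x x^T]

M_exact = np.diag([4.0 / 3.0, 4.0 / 3.0])   # E[x_i^2] = 1/3, cross terms vanish
print(M_hat)
```

Sampling from a box is trivial; the slide's point is precisely that sampling from a general polytope {Ax ≥ b} shifted and polarized is the expensive part.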
The cost of the Universal Barrier
• To get the second moment matrix, you need n^{O(1)} samples.
• To get 1 sample, you need to do n^{O(1)} iters of a Markov chain.
• To do 1 iter of the Markov chain, you need to implement a
  separation oracle for (Ω − x)°.
• If Ω = {Ax ≥ b}, one needs to solve an LP.
Hence, one iteration requires solving n^{O(1)} many LPs.

The problem:
Get an efficient O(n)-self-concordant barrier function.
VOLUMETRIC BARRIER FUNCTION

Volumetric Barrier Function
In 1989, Vaidya showed that
  p_v(x) = (1/2) log det(∇² p_s(x)) + √(n/m) · p_s(x),
where p_s(x) = −Σ_i ln(s_i). Why is it volumetric?
It is a (mn)^{1/2} barrier.
[Figures: volumetric barrier vs. log barrier]
Why is the Volumetric Barrier good?
  p_v(x) = (1/2) log det(∇² p_s(x)) + √(n/m) · p_s(x)
Around y, we have
  p_v(x) ~ −Σ_i (σ_i(S_y^{-1} A) + √(n/m)) log s_i(x),
where
  σ_i(B) = (B (B^T B)^{-1} B^T)_{ii}.
Example: B = [1 0; 3 0; 0 2]. Then B^T B = [1² + 3², 0; 0, 2²] = [10, 0; 0, 4],
so σ₁ = 1/10, σ₂ = 9/10, σ₃ = 1.
In general, Σ_i σ_i = n and 0 ≤ σ_i ≤ 1; if the i-th row is repeated, its σ_i is halved.
[Example: the interval [0,1] with the constraint at 0 repeated k times — the
leverage scores automatically discount the repeated rows.]
OUR BARRIER FUNCTION

Repeated Volumetric Barrier Function
  p^{(1)}(x) = (1/2) log det(∇² p_s(x)) + √(n/m) · p_s(x)
How about
  p^{(k+1)}(x) = (1/2) log det(∇² p^{(k)}(x)) + √(n/m) · p^{(k)}(x)?
Suppose p^{(k)}(x) = −Σ_i w_i^{(k)} log s_i. Around y, we have
  p^{(k+1)}(x) ~ −Σ_i ( σ_i((W^{(k)})^{1/2} S_y^{-1} A) + w_i^{(k)} ) log s_i(x).
So, we have
  w_i^{(k+1)} = σ_i((W^{(k)})^{1/2} S_y^{-1} A) + w_i^{(k)}.
What is that?
We call p^{(∞)}(x) = −Σ_i w_i^{(∞)}(x) log s_i, where w^{(∞)} satisfies
  w_i^{(∞)}(x) = σ_i((W^{(∞)})^{1/2} S_x^{-1} A).
What is that weight?
  w_i^{(∞)}(x) = σ_i((W^{(∞)})^{1/2} S_x^{-1} A)
• Let λ_i = σ_i(W^{1/2} S_x^{-1} A)/w_i.
• If λ_i ≤ 1 for all i, the ellipsoid is inside.
• The w^{(∞)} represents the John ellipsoid of {‖S_x^{-1} A y‖_∞ ≤ 1}.
Our condition (John ellipsoid): λ_i = 1 if w_i ≠ 0.
Repeated Volumetric Barrier Function
• Recall
  p^{(∞)}(x) ~ ln det(∇² p^{(∞)}(x)) ~ ln det(A^T S_x^{-1} W^{(∞)} S_x^{-1} A).
We get
  p^{(∞)} ~ −ln vol( JohnEllipsoid( Ω ∩ (2x − Ω) ) ).
(Symmetrize, then find the John ellipsoid.)

The barrier function is not perfect!
• The path is only piecewise smooth, because the ellipsoid may not touch every
  constraint.
• p^{(∞)} = max_{Σw_i = n, w ≥ 0} ln det(A^T S^{-1} W S^{-1} A)
Our Barrier Function
• Standard log barrier:
  p_s(x) = −Σ_i log s_i
• Volumetric barrier:
  p_v(x) = (1/2) log det(∇² p_s(x)) + √(n/m) · p_s(x)
• John ellipsoid barrier:
  p^{(∞)}(x) = max_{Σw_i = n, w ≥ 0} ln det(A^T S^{-1} W S^{-1} A)
• Regularized John ellipsoid barrier (1):
  max_{w ≥ 0} ln det(A^T S^{-1} W^{1 − log^{-1}(m/n)} S^{-1} A) + (n/m) Σ_i (log w_i − w_i)
• Regularized John ellipsoid barrier (2):
  max_{w ≥ 0} ln det(A^T S^{-1} W S^{-1} A) − Σ_i w_i ln w_i − (n/m) Σ_i ln s_i
ℓ_p Lewis Weight
We call w the ℓ_p Lewis weight for B if
  w_i = σ_i(W^{1/2 − 1/p} B).
Equivalently, it is the maximizer of
  max_{Σw_i = n, w ≥ 0} ln det(B^T W^{1 − 2/p} B).
Thanks to Cohen and Peng, we know:
• Let C be Õ(n^{max(1, p/2)}) rows sampled from B according to w_i; then
  ‖Cz‖_p ~ ‖Bz‖_p for all z.
• M = B^T W^{1 − 2/p} B is the maximizer of
  max log det(M) subject to {x^T M x ≤ 1} ⊆ {‖Bx‖_p ≤ 1},
  i.e., the maximum ellipsoid inscribed in the polytope.
• For p = ∞, {x : x^T M x ≤ 1} is the John ellipsoid for {‖Bx‖_∞ ≤ 1}.
Computing ℓ_p Lewis Weights
• Cohen and Peng showed how to compute them when p < 4.
• The repeated volumetric barrier: w_i^{(1)} = n/m,
  w_i^{(k+1)} = σ_i((W^{(k)})^{1/2} B) + w_i^{(k)}.
• After renormalization, w^{(log m)} gives the "ℓ_∞" Lewis weight:
  w_i(x) ~ σ_i(W^{1/2} B).
• Cohen, L., Peng and Sidford show that, in fact, a similar algorithm finds a
  constant-approximate ℓ_p Lewis weight for p > 2 in Õ(1) iterations.
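A sketch of the recurrence above on an assumed small matrix: accumulate w ← w + σ(√W B), then renormalize so the weights sum to n. Since σ is invariant to scaling W by a constant, the normalized limit should satisfy the ℓ_∞ fixed point w_i ≈ σ_i(W^{1/2} B):

```python
import numpy as np

def leverage_scores(B):
    """sigma_i(B) = (B (B^T B)^{-1} B^T)_{ii}."""
    G = B.T @ B
    return np.einsum('ij,ij->i', B, np.linalg.solve(G, B.T).T)

def linf_lewis_weights(B, iters=2000):
    """Repeated-volumetric iteration: w^(k+1) = sigma(sqrt(W^(k)) B) + w^(k),
    renormalized at the end (an averaged fixed-point iteration)."""
    m, n = B.shape
    w = np.full(m, n / m)              # w^(1) = n/m
    for _ in range(iters):
        w = w + leverage_scores(np.sqrt(w)[:, None] * B)
    return w * (n / w.sum())           # renormalize so the weights sum to n

B = np.array([[1.0, 0.0], [3.0, 0.0], [0.0, 2.0],
              [1.0, 1.0], [0.0, 1.0], [2.0, 1.0]])
w = linf_lewis_weights(B)
residual = np.abs(w - leverage_scores(np.sqrt(w)[:, None] * B)).max()
print(w.sum(), residual)   # the sum is exactly n = 2; the residual is small
```

This naive averaging is only a toy illustration of the fixed point; the fast Õ(1)-iteration computation referenced on the slide is the Cohen–Lee–Peng–Sidford algorithm, not this loop.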
CONCLUSION
Our Barrier
Given any polytope {Ax ≥ b}, let
  p(x) = max_{w ≥ 0} ln det(A^T S^{-1} W S^{-1} A) − Σ_i w_i ln w_i − (n/m) Σ_i ln s_i.
Theorem: The barrier function p gives an O(√n log(ε^{-1}))-iteration algorithm
for LPs of the form
  min_{Ax ≥ b} c^T x.
Algorithm:
• While not done:
  – Move the cost constraint.
  – Maintain the regularized John ellipsoid.
However…
• My goal is to design a general LP algorithm fast enough to beat the best
  max-flow algorithm!
• We obtained: solve
  min_{Ax ≥ b} c^T x
in Õ(√n log(ε^{-1})) iterations, where each iteration computes (A^T D_k A)^{-1}
for some diagonal matrix D_k.