Fixed-size Minimax for Committee Elections:
Approximation and Local Search Heuristics

COMSOC ’06, 6 December 2006

Rob LeGrand, Washington University in St. Louis, [email protected]
Evangelos Markakis, University of Toronto, [email protected]
Aranyak Mehta, IBM Almaden Research Center, [email protected]

Electing a committee from approval ballots

n = 5 candidates, m = 6 ballots:
11110
00011   (approves of candidates 4 and 5)
01111
00111
10111
00001

• What’s the best committee of size k = 2?

Sum of Hamming distances

Example committee (k = 2 winners): 11000
11110 → 2
00011 → 4
01111 → 4
00111 → 5
10111 → 4
00001 → 3
sum = 22

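An illustrative sketch, not from the slides: the scoring functions used throughout this talk in Python, with ballots and committees encoded as 0/1 strings as above.

BALLOTS = ["11110", "00011", "01111", "00111", "10111", "00001"]

def hamming(a: str, b: str) -> int:
    """Number of positions where two 0/1 strings differ."""
    return sum(x != y for x, y in zip(a, b))

def sumscore(committee: str, ballots=BALLOTS) -> int:
    """Sum of Hamming distances from the committee to every ballot."""
    return sum(hamming(committee, b) for b in ballots)

def maxscore(committee: str, ballots=BALLOTS) -> int:
    """Largest Hamming distance from the committee to any ballot."""
    return max(hamming(committee, b) for b in ballots)

print(sumscore("11000"), maxscore("11000"))  # 22 5 (the slide shows sum = 22)
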
Fixed-size minisum

k = 2 winners: 00011
11110 → 4
00011 → 0
01111 → 2
00111 → 1
10111 → 2
00001 → 1
sum = 10

• Minisum elects winner set with smallest sumscore
• Easy to compute (pick candidates with most approvals)

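A sketch of the minisum rule just described, assuming the same 0/1-string encoding (the function name is mine, not from the slides): count approvals per candidate and keep the k most approved.

BALLOTS = ["11110", "00011", "01111", "00111", "10111", "00001"]

def fixed_size_minisum(ballots, k):
    n = len(ballots[0])
    approvals = [sum(b[i] == "1" for b in ballots) for i in range(n)]
    # k most-approved candidates; ties broken arbitrarily (by candidate index here)
    winners = set(sorted(range(n), key=lambda i: -approvals[i])[:k])
    return "".join("1" if i in winners else "0" for i in range(n))

print(fixed_size_minisum(BALLOTS, 2))  # 00011, matching the slide
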
Maximum Hamming distance

k = 2 winners: 00011
11110 → 4
00011 → 0
01111 → 2
00111 → 1
10111 → 2
00001 → 1
sum = 10, max = 4

Fixed-size minimax
[Brams, Kilgour & Sanver, ’04]

k = 2 winners: 00110
11110 → 2
00011 → 2
01111 → 2
00111 → 1
10111 → 2
00001 → 3
sum = 12, max = 3

• Minimax elects winner set with smallest maxscore
• Harder to compute?

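For comparison, a brute-force exact FSM baseline (not an algorithm from this talk): try every size-k committee and keep the one with the smallest maxscore. This is exponential in k, so it is only a sketch for tiny instances like the running example.

from itertools import combinations

BALLOTS = ["11110", "00011", "01111", "00111", "10111", "00001"]

def exact_fsm(ballots, k):
    n = len(ballots[0])
    best_committee, best_score = None, None
    for winners in combinations(range(n), k):
        c = "".join("1" if i in winners else "0" for i in range(n))
        score = max(sum(x != y for x, y in zip(c, b)) for b in ballots)
        if best_score is None or score < best_score:
            best_committee, best_score = c, score
    return best_committee, best_score

print(exact_fsm(BALLOTS, 2))  # ('00110', 3), matching the slide
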
Complexity

Endogenous minimax = EM = BSM(0, n):      NP-hard [Frances & Litman, ’97]
Bounded-size minimax = BSM(k1, k2):       NP-hard (generalization of EM)
Fixed-size minimax = FSM(k) = BSM(k, k):  ?

Complexity

Endogenous minimax = EM = BSM(0, n):      NP-hard [Frances & Litman, ’97]
Bounded-size minimax = BSM(k1, k2):       NP-hard (generalization of EM)
Fixed-size minimax = FSM(k) = BSM(k, k):  NP-hard (this paper)

Approximability

Endogenous minimax = EM = BSM(0, n):      has a PTAS* [Li, Ma & Wang, ’99]
Bounded-size minimax = BSM(k1, k2):       no known PTAS; no known constant-factor approx.
Fixed-size minimax = FSM(k) = BSM(k, k):  no known PTAS; no known constant-factor approx.

* Polynomial-Time Approximation Scheme: algorithm with approx. ratio 1 + ε
  that runs in time polynomial in the input and exponential in 1/ε

Approximability

Endogenous minimax = EM = BSM(0, n):      has a PTAS* [Li, Ma & Wang, ’99]
Bounded-size minimax = BSM(k1, k2):       no known PTAS; has a 3-approx. (this paper)
Fixed-size minimax = FSM(k) = BSM(k, k):  no known PTAS; has a 3-approx. (this paper)

* Polynomial-Time Approximation Scheme: algorithm with approx. ratio 1 + ε
  that runs in time polynomial in the input and exponential in 1/ε

Approximating FSM

k = 2 winners
Ballots: 11110, 00011, 01111, 00111, 10111, 00001
Choose a ballot arbitrarily: here, 00111

Approximating FSM

k = 2 winners
Ballots: 11110, 00011, 01111, 00111, 10111, 00001
Choose a ballot arbitrarily: here, 00111
Coerce it to size k: 00101
Outcome = k-completed ballot

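A sketch of the k-completion step just described: take any ballot and flip bits until it has exactly k ones. Which bits get flipped is a tie-breaking choice of mine; the slides leave it arbitrary.

def k_complete(ballot: str, k: int) -> str:
    bits = list(ballot)
    ones = [i for i, b in enumerate(bits) if b == "1"]
    zeros = [i for i, b in enumerate(bits) if b == "0"]
    if len(ones) > k:                       # too many approvals: un-approve some
        for i in ones[k:]:
            bits[i] = "0"
    else:                                   # too few approvals: approve some more
        for i in zeros[:k - len(ones)]:
            bits[i] = "1"
    return "".join(bits)

print(k_complete("00111", 2))  # 00110 here; 00101 on the slide is another valid k-completion
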
Approximation ratio ≤ 3

Optimal FSM set: 00110          (OPT = optimal maxscore)
Each ballot is at Hamming distance ≤ OPT from the optimal FSM set:
11110 → 2,  00011 → 2,  00111 → 1,  00001 → 3,  10111 → 2,  01111 → 2

Approximation ratio ≤ 3

Optimal FSM set: 00110          (OPT = optimal maxscore)
Chosen ballot: 00111
Each ballot is at distance ≤ OPT from the optimal FSM set:
11110 → 2,  00011 → 2,  00111 → 1,  00001 → 3,  10111 → 2,  01111 → 2
The chosen ballot is also at distance ≤ OPT from the optimal FSM set (here 1).

Approximation ratio ≤ 3

Optimal FSM set: 00110          (OPT = optimal maxscore)
Chosen ballot: 00111            (distance 1 from the optimal FSM set, ≤ OPT)
k-completed ballot: 00011       (distance 1 from the chosen ballot, ≤ OPT)
Each ballot is at distance ≤ OPT from the optimal FSM set:
11110 → 2,  00011 → 2,  00111 → 1,  00001 → 3,  10111 → 2,  01111 → 2
So each ballot is at distance ≤ 3·OPT from the k-completed ballot
(by triangle inequality).

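Written out, with v the chosen ballot, c its k-completion, S* an optimal FSM committee and OPT = maxscore(S*), the bound is the following; the last step (on d(v, c)) is spelled out here, while the slide only labels that edge ≤ OPT:

\begin{align*}
d(b, c) &\le d(b, S^*) + d(S^*, v) + d(v, c) && \text{(triangle inequality, for every ballot } b\text{)} \\
        &\le \mathrm{OPT} + \mathrm{OPT} + \mathrm{OPT} = 3\,\mathrm{OPT},
\end{align*}

since $d(b, S^*) \le \mathrm{OPT}$ and $d(S^*, v) \le \mathrm{OPT}$ by the definition of OPT, and $d(v, c) = |\mathrm{weight}(v) - k| \le d(v, S^*) \le \mathrm{OPT}$ because $S^*$, like $c$, has weight exactly $k$.
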
Better in practice?

• So far, we can guarantee a winner set no more than 3 times as bad as the optimal.
  – Nice in theory . . .
• How can we do better in practice?
  – Try local search

Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k

Current c = 01001 (maxscore 4)

Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k
2. In c, swap up to r 0-bits with 1-bits in such a way that minimizes the maxscore of the result

Neighborhood of c = 01001 (r = 1), with maxscores:
11000 → 5,  01100 → 4,  10001 → 4,  01001 → 4,  01010 → 4,  00101 → 4,  00011 → 4

Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k
2. In c, swap up to r 0-bits with 1-bits in such a way that minimizes the maxscore of the result

New c = 01010 (maxscore 4)

Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k
2. In c, swap up to r 0-bits with 1-bits in such a way that minimizes the maxscore of the result
3. Repeat step 2 until maxscore(c) is unchanged n times
4. Take c as the solution

Neighborhood of c = 01010 (r = 1), with maxscores:
11000 → 5,  01100 → 4,  10010 → 4,  01010 → 4,  01001 → 4,  00110 → 3,  00011 → 4

Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k
2. In c, swap up to r 0-bits with 1-bits in such a way that minimizes the maxscore of the result
3. Repeat step 2 until maxscore(c) is unchanged n times
4. Take c as the solution

New c = 00110 (maxscore 3)

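A minimal sketch of steps 1–4 above for radius r = 1 (single swaps). The slides do not say how ties among equally good neighbors are broken; ties are broken at random here, which is what lets the search wander across maxscore plateaus as in the example.

import random

BALLOTS = ["11110", "00011", "01111", "00111", "10111", "00001"]

def maxscore(c, ballots):
    return max(sum(x != y for x, y in zip(c, b)) for b in ballots)

def local_search_fsm(ballots, start, stalls_allowed=None):
    n = len(ballots[0])
    stalls_allowed = n if stalls_allowed is None else stalls_allowed
    c, stalls = start, 0
    while stalls < stalls_allowed:
        # c itself plus every vector obtained by swapping one 1-bit with one 0-bit
        candidates = [c]
        for i in range(n):
            for j in range(n):
                if c[i] == "1" and c[j] == "0":
                    bits = list(c)
                    bits[i], bits[j] = "0", "1"
                    candidates.append("".join(bits))
        best = min(maxscore(x, ballots) for x in candidates)
        stalls = stalls + 1 if best == maxscore(c, ballots) else 0
        c = random.choice([x for x in candidates if maxscore(x, ballots) == best])
    return c

print(local_search_fsm(BALLOTS, "01001"))  # usually ends at 00110 (maxscore 3)
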
Specific FSM heuristics

• Two parameters:
  – where to start vector c:
    1. a fixed-size-minisum solution
    2. a k-completion of a ballot (3-approx.)
    3. a random set of k candidates
    4. a k-completion of a ballot with highest maxscore
  – radius of neighborhood r: 1 and 2

Heuristic evaluation

• Real-world ballots from GTS 2003 council election
• Found exact minimax solution
• Ran each heuristic 5000 times
• Compared exact minimax solution with heuristics to find realized approximation ratios
  – example: 15/14 = 1.0714
    • maxscore of solution found = 15
    • maxscore of exact solution = 14
• We also performed experiments using ballots generated according to random distributions (see paper)

Average approx. ratios found

Starting point        radius = 1   radius = 2
fixed-size minisum    1.0012       1.0000
3-approx.             1.0017       1.0000
random set            1.0057       1.0000
highest maxscore      1.0059       1.0000

performance on GTS ’03 election data
n = 24 candidates, k = 12 winners, m = 161 ballots

Largest approx. ratios found

Starting point        radius = 1   radius = 2
fixed-size minisum    1.0714       1.0000
3-approx.             1.0714       1.0000
random set            1.0714       1.0000
highest maxscore      1.0714       1.0000

performance on GTS ’03 election data
n = 24 candidates, k = 12 winners, m = 161 ballots

Conclusions from all experiments

• All heuristics perform near-optimally
  – highest ratio found: 1.2
  – highest average ratio < 1.04
• When radius is larger, performance improves and running time increases
• The fixed-size-minisum starting point performs best overall (with our 3-approx. a close second)

Manipulating FSM

Sincere ballots: 00110, 00011, 01111, 00111, 10111, 00001
k = 2 winners: 00011
00110 → 2
00011 → 0
01111 → 2
00111 → 1
10111 → 2
00001 → 1
max = 2

• Voters are sincere
• Another optimal solution: 00101

Manipulating FSM

The first voter (sincere ballot 00110) instead reports 11110.
Reported ballots: 11110, 00011, 01111, 00111, 10111, 00001
k = 2 winners: 00110
11110 → 2
00011 → 2
01111 → 2
00111 → 1
10111 → 2
00001 → 3
max = 3   (the manipulator’s sincere ballot 00110 is at distance 0 from the outcome)

• A voter manipulates and realizes ideal outcome

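A sketch checking the manipulation example with a brute-force FSM solver like the one earlier (fine for n = 5 candidates); the profile variable names are mine.

from itertools import combinations

def exact_fsm(ballots, k):
    n = len(ballots[0])
    def maxscore(c):
        return max(sum(x != y for x, y in zip(c, b)) for b in ballots)
    committees = ("".join("1" if i in w else "0" for i in range(n))
                  for w in combinations(range(n), k))
    return min(committees, key=maxscore)

sincere     = ["00110", "00011", "01111", "00111", "10111", "00001"]
manipulated = ["11110"] + sincere[1:]        # the first voter misreports 11110

print(exact_fsm(sincere, 2))      # 00101 (maxscore 2; 00011 is the other optimum on the slide)
print(exact_fsm(manipulated, 2))  # 00110 (maxscore 3): the first voter's ideal committee
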
Nonmanipulable “FSM”?

Electing a set found using our 3-approximation for FSM gives a nonmanipulable procedure:
• For the voters whose ballots are not chosen, voting insincerely cannot affect the outcome
• For the voter whose ballot is chosen, the outcome will be one of the sets of size k closest to the voter’s wishes

Conclusions

• BSM and FSM are NP-hard
• Both can be approximated with ratio 3
• Polynomial-time local search heuristics perform well in practice
  – some retain ratio-3 guarantee
• Exact FSM can be manipulated
• Our 3-approximation for FSM is nonmanipulable

Future work

• Investigate weighted version of minimax [Brams, Kilgour & Sanver, ’06]
• What is the best approximation ratio for FSM achievable in polynomial time? (Is there a PTAS?)
• What is the nonmanipulable FSM approximation algorithm with the best ratio?

Thanks!