
Preference Elicitation in Scheduling Problems
Ulaş Bardak
Ph.D. Thesis Proposal
Committee: Jaime Carbonell, Eugene Fink, Stephen Smith, Sven Koenig (University of Southern California)
Outline
• Introduction
• Example
• Preliminary results
• Plan of work
Motivation
Improve resource planning by reducing the uncertainty of the available knowledge.
Hypothesis
By asking the questions with the highest potential to reduce uncertainty, we can improve the quality of the resource plan while minimizing the cost of elicitation.
Initial schedule
Available rooms:
Room num. | Capacity | Projector
1         | 500      | Yes
2         | 100      | No
3         | 80       | Yes

Requests:
• Invited talk, 9–10am: Needs big room
• Poster session, 9–11am: Needs a room

Initial schedule: Room 1: Talk; Room 2: Posters; Room 3: empty

Missing info and assumptions:
• Invited talk: projector need → assume it needs a projector
• Poster session: room size → assume a small room is OK; projector need → assume it needs no projector
Choice of questions
Initial schedule: Room 1: Talk; Room 2: Posters; Room 3: empty

Requests:
• Invited talk, 9–10am: Needs a large room
• Poster session, 9–11am: Needs a room

Candidate questions:
× Does the invited talk need a projector? Useless info: there are no large rooms without a projector.
× Does the poster session need a larger room? Useless info: there are no unoccupied larger rooms.
√ Does the poster session need a projector? Potentially useful info.
Improved schedule
Requests:
• Invited talk, 9–10am: Needs a large room
• Poster session, 9–11am: Needs a room

Info elicitation:
System: Does the poster session need a projector?
User: A projector may be useful, but not really necessary.

Initial schedule: Room 1: Talk; Room 2: Posters; Room 3: empty
New schedule: Room 1: Talk; Room 2: empty; Room 3: Posters (Room 3 has a projector)
Architecture
• Elicitor: choose and send questions
• Natural Lang.: ask user and get answers
• Optimizer: update resource allocation
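A minimal sketch of how these three components could interact, assuming hypothetical elicitor, front-end, and optimizer objects (none of these names come from RADAR itself):

    # Hypothetical elicitation loop; the three component interfaces are
    # assumptions made for illustration, not the actual RADAR APIs.

    def elicitation_loop(elicitor, nl_frontend, optimizer, max_questions=50):
        """Alternate between asking a question and re-optimizing the schedule."""
        schedule = optimizer.allocate()              # plan under current uncertainty
        for _ in range(max_questions):
            questions = elicitor.choose_questions(schedule, n=1)
            if not questions:
                break                                # nothing useful left to ask
            answer = nl_frontend.ask(questions[0])   # ask the user, get the answer
            elicitor.update_knowledge(questions[0], answer)
            schedule = optimizer.allocate()          # re-optimize with new knowledge
        return schedule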
Inside the Elicitor
1. Get the list of questions: each uncertain variable is a potential question.
2. For each question i, get the utilities for its possible answers: plug the possible answers into the utility function to get the change in utility.
3. Get the question score: Score(i) = Δutility(i) / cost(i)
4. Return the top N questions (see the sketch below).
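A minimal sketch of this scoring step, assuming the formula above is expected utility gain divided by elicitation cost (the exact functional form was garbled in the source and is an assumption here):

    # Sketch of question scoring; assumes Score(i) = Δutility(i) / cost(i).

    def score_question(current_utility, answer_outcomes, cost):
        """answer_outcomes: (probability, plan_utility_given_answer) pairs for
        one uncertain variable; cost: the cost of asking about that variable."""
        expected_utility = sum(p * u for p, u in answer_outcomes)
        return (expected_utility - current_utility) / cost   # Δutility / cost

    def top_questions(scored_questions, n=50):
        """scored_questions: (question, score) pairs; return the n highest-scored."""
        return sorted(scored_questions, key=lambda q: q[1], reverse=True)[:n]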
Optimizer
• Uses hill climbing to allocate resources
• Searches for an assignment of resources with the greatest expected utility (see the sketch below)
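A minimal sketch of hill climbing over resource assignments; the neighbor-generation and utility callables are placeholders, not the proposal's actual optimizer:

    # Generic hill climbing for resource allocation.

    def hill_climb(initial_plan, neighbors, expected_utility, max_steps=10_000):
        """Repeatedly move to a neighboring assignment with higher expected
        utility; stop at a local optimum (or after max_steps moves)."""
        plan, best = initial_plan, expected_utility(initial_plan)
        for _ in range(max_steps):
            improved = False
            for candidate in neighbors(plan):    # e.g., swap two sessions' rooms
                utility = expected_utility(candidate)
                if utility > best:
                    plan, best, improved = candidate, utility, True
                    break                        # take the first improving move
            if not improved:
                break                            # local optimum reached
        return plan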
Related Work
• Example critiquing [Burke et al.]: have users tweak the result set
• Collaborative filtering [Resnick], [Hill et al.]: have the user rank related items
• Similarity-based heuristics [Burke]: look at past similar-user ratings
• Focusing on targeted use [Stolze]
Related Work
• Clustering utility functions [Chajewska]
• Decision tree [Stolze and Ströbel]
• Min-max regret [Boutilier]: choose the question that reduces the maximum regret
• Auctions [Smith], [Boutilier], [Sandholm]
What is different?
• No bootstrapping
• Continuous variables
• Large number of uncertain variables
• Tight integration with the optimizer
• Integration of multiple approaches
• Dynamic elicitation costs
Example Domain
Assigning rooms to conference sessions:
• Rooms have properties.
• Sessions have preferences, constraints, and importance values.
• Each preference is a function from a room property to utility.
Example Domain
• Rooms have properties.
  Certain: Room 1 can accommodate 200 people.
  Uncertain: Room 3 has one projector: 80% chance; Room 3 has no projectors: 20% chance.
• Sessions have preferences, constraints, and importance values.
• Each preference is a function from a room property to utility.
Example Domain
• Rooms have properties.
  Room 3 has one projector: 80% chance; Room 3 has no projectors: 20% chance.
• Sessions have preferences, constraints, and importance values.
  Constraint: the invited talk cannot be before 2 p.m.
  Importance: the invited talk is more important than the poster session.
  Uncertain: invited talk very important: 40% chance; invited talk moderately important: 60% chance.
• Each preference is a function from a room property to utility.
Example Domain
• Rooms have properties.
  Capacity of Room 1 is 200.
  Room 3 has one projector: 80% chance; Room 3 has no projectors: 20% chance.
• Sessions have preferences, constraints, and importance values.
  Invited talk very important: 40% chance; invited talk moderately important: 60% chance.
• Each preference is a function from a room property to utility (see the sketch below).
  Capacity preference: 150 people is the minimum, 200 people is acceptable, 250 people is best.
  Uncertain: capacity preference is [150, 200, 250]: 40% chance; [50, 100, 150]: 60% chance.
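A minimal sketch of such a preference as a piecewise-linear utility function over room capacity; the intermediate utility of 0.5 at the "acceptable" point is an assumption made for illustration:

    # Piecewise-linear capacity preference: 0 below the minimum, rising to an
    # assumed 0.5 at 'acceptable' and 1.0 at 'best'.

    def capacity_utility(capacity, minimum=150, acceptable=200, best=250):
        if capacity < minimum:
            return 0.0
        if capacity >= best:
            return 1.0
        if capacity <= acceptable:
            return 0.5 * (capacity - minimum) / (acceptable - minimum)
        return 0.5 + 0.5 * (capacity - acceptable) / (best - acceptable)

    def expected_capacity_utility(capacity):
        """Expected utility when the preference itself is uncertain:
        [150, 200, 250] with 40% chance, [50, 100, 150] with 60% chance."""
        return (0.4 * capacity_utility(capacity, 150, 200, 250)
                + 0.6 * capacity_utility(capacity, 50, 100, 150))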
Experiments
Evaluation of RADAR:
• 15 room properties
• 88 rooms
• 84 sessions
• 2,500 variables
• 700 uncertain values
The system was asked to provide the top 50 questions.
Incremental elicitation
[Plot: schedule utility vs. number of questions asked (10–50), with three curves: Certain, Incremental, and Optimizer estimate; the y-axis ticks are at 0.58, 0.72, and 0.78.]
Completed work
• Questions based on potential reduction of uncertainty
• Empirical evaluation
• Integration with RADAR
Contributions
• Fast computation of the expected impact of potential questions
• Use of the optimizer for calculating more accurate question weights
• Use of past elicitation results to improve the elicitation process
• Unification of different elicitation strategies
Search for optimal questions
Example: uncertain room size.
[Search tree over candidate questions: the root splits the size range into 100–150 (40% chance) and 151–200 (60% chance), with h = 20 and max utility increase = 20; one child splits it into 100–160 (50%) and 160–200 (50%), with h = 10 and max utility increase = 30; another splits it into 100–130 (25%) and 130–160 (25%), with h = 15 and max utility increase = 100.]
Best-first search with the optimizer used as the heuristic function (see the sketch below).
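A minimal sketch of best-first search over candidate questions, with the optimizer-based heuristic passed in as a callable; the node representation is left abstract:

    import heapq

    # Generic best-first search; 'heuristic' stands in for the optimizer's
    # utility estimate, and 'expand' generates refined candidate questions.

    def best_first_search(root, expand, heuristic, is_goal):
        counter = 0                                   # tie-breaker for equal scores
        frontier = [(-heuristic(root), counter, root)]
        while frontier:
            _, _, node = heapq.heappop(frontier)      # most promising node first
            if is_goal(node):
                return node
            for child in expand(node):
                counter += 1
                heapq.heappush(frontier, (-heuristic(child), counter, child))
        return None                                   # no goal node found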
Elicitation rules
Encoding of elicitation heuristics:

rule Uncertain-Auditorium-Size(room)
  Conditions: type(room) = Auditorium
              mean(size(room)) ≥ 10,000
              std-dev(size(room)) ≥ 5,000
  Action:     elicit(size(room))
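One way such a rule might be encoded, shown as a minimal sketch; the dictionary-based room representation and the exact threshold semantics are assumptions:

    # Hypothetical encoding of the Uncertain-Auditorium-Size rule above.

    def uncertain_auditorium_size(room):
        """Fire when the room is an auditorium whose size estimate is both
        large and highly uncertain; the action is to elicit the actual size."""
        if (room["type"] == "Auditorium"
                and room["size_mean"] >= 10_000
                and room["size_std_dev"] >= 5_000):
            return ("elicit", "size", room["id"])     # the rule's action
        return None                                   # rule does not apply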
Learning of elicitation rules
Derive rules based on past elicitations:

Event Type   | Room Type  | Importance | Elicited Property | Mean Value | Result
Invited Talk | Auditorium | 110        | Proj.             | 0.5        | +
Posters      | Meeting R. | 115        | Size              | 250        | -
Best Paper   | Auditorium | 105        | Proj.             | 0.9        | +
Posters      | Classroom  | 90         | Proj.             | 0.5        | -
Talk         | Auditorium | 200        | Size              | 100        | +
Keynote      | Auditorium | 150        | Proj.             | 0.3        | +

rule Learned-Rule(room, event)
  Conditions: type(room) = Auditorium
              mean(projector(room)) < 1
              importance(event) ≥ 100
  Action:     elicit(projector(room))
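A minimal sketch of how such a rule could be derived from the table above, by keeping conditions whose past elicitations were usually useful; the record format and the 75% success threshold are assumptions made for illustration:

    from collections import defaultdict

    # Toy rule learner: find (room type, property) conditions under which past
    # elicitations mostly produced useful answers ('+' results).

    def learn_conditions(records, min_success=0.75):
        stats = defaultdict(lambda: [0, 0])           # condition -> [useful, total]
        for r in records:
            key = (r["room_type"], r["property"])
            stats[key][1] += 1
            if r["result"] == "+":
                stats[key][0] += 1
        return [key for key, (ok, n) in stats.items() if ok / n >= min_success]

    # Example: the Auditorium/Proj. rows above are all '+', so that condition
    # survives, mirroring the learned rule's type(room) = Auditorium test.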
Dynamic question costs
• Same cost for all questions
• Different cost for different question types
• Learning of the question costs for each type
• Learning of the question costs for each information source (see the sketch below)
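A minimal sketch of per-source cost learning as a running average of observed costs; the interface and the default-cost fallback are assumptions made for illustration:

    # Toy per-source cost model: estimate each information source's question
    # cost as the average of the costs observed so far.

    class CostModel:
        def __init__(self, default_cost=1.0):
            self.totals = {}              # source -> (cost sum, observation count)
            self.default = default_cost

        def observe(self, source, cost):
            total, count = self.totals.get(source, (0.0, 0))
            self.totals[source] = (total + cost, count + 1)

        def cost(self, source):
            total, count = self.totals.get(source, (0.0, 0))
            return total / count if count else self.default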
Experiments
Compare different approaches:
• Current system
• Search for optimal questions
• Hand-coded elicitation rules
• Learned elicitation rules
• Unified system
• Human elicitor
Measure the utility gain after each answer; also evaluate the running time.
Timeline
• Best-first search
• Syntax for rules
• Learning of rules
• Learning of costs
• Unified system
• Experiments
• Writing
Addendum
Absolute change in schedule quality
[Plot: absolute change in schedule quality (0 to 0.15) vs. question number (0–50), with questions ordered from more important to less important.]