
Preference Elicitation in Scheduling Problems

This Ph.D. thesis proposal outlines a method for reducing uncertainty in resource planning by asking the user targeted questions. By eliciting the most useful information first, the system improves the quality of the resource plan while keeping the cost of elicitation low.


Presentation Transcript


  1. Preference Elicitation in Scheduling Problems. Ulaş Bardak, Ph.D. Thesis Proposal. Committee: Jaime Carbonell, Eugene Fink, Stephen Smith, and Sven Koenig (University of Southern California).

  2. Outline • Introduction • Example • Preliminary Results • Plan of Work • Questions

  3. Motivation: Improve resource planning by reducing the uncertainty of the available knowledge.

  4. Hypothesis: By asking the questions with the highest potential to reduce uncertainty, we can improve the quality of the resource plan while minimizing the cost of elicitation.

  5. Initial schedule. Available rooms: 1, 2, and 3 (the figure places the talk and the posters on a room-by-time grid).
  • Requests:
    • Invited talk, 9–10am: needs a big room.
    • Poster session, 9–11am: needs a room.
  • Missing info:
    • Invited talk: projector need.
    • Poster session: room size, projector need.
  • Assumptions:
    • Invited talk: needs a projector.
    • Poster session: a small room is OK; needs no projector.

  6. Choice of questions.
  • Candidate questions:
    • Invited talk: needs a projector? (useless: there are no large rooms without a projector)
    • Poster session: needs a larger room? (useless: there are no unoccupied larger rooms)
    • Poster session: needs a projector? (potentially useful)
  • Requests:
    • Invited talk, 9–10am: needs a large room.
    • Poster session, 9–11am: needs a room.

  7. Improved schedule.
  • Info elicitation:
    • System: Does the poster session need a projector?
    • User: A projector may be useful, but not really necessary.
  • Requests:
    • Invited talk, 9–10am: needs a large room.
    • Poster session, 9–11am: needs a room.
  (The figure contrasts the initial room assignments with the improved schedule across Rooms 1–3.)

  8. Architecture. Three components: a natural-language module, the Optimizer, and the Elicitor. The Elicitor chooses and sends questions, the natural-language module asks the user and gets answers, and the Optimizer updates the resource allocation.
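
To make the data flow concrete, here is a minimal control-loop sketch of this architecture. The component interfaces used below (choose_questions, ask, update_knowledge, best_allocation) are illustrative assumptions, not RADAR's actual API:

    def elicitation_loop(elicitor, dialogue, optimizer, rounds=5):
        """Illustrative control loop: the elicitor chooses and sends questions,
        the natural-language module asks the user and collects answers, and
        the optimizer updates the resource allocation."""
        plan = optimizer.best_allocation()
        for _ in range(rounds):
            questions = elicitor.choose_questions(plan)   # choose and send questions
            answers = dialogue.ask(questions)             # ask user, get answers
            optimizer.update_knowledge(answers)           # reduce uncertainty
            plan = optimizer.best_allocation()            # update resource allocation
        return plan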

  9. Inside the Elicitor. Each uncertain variable is a potential question:
  • Get the list of questions.
  • For each question, get the utilities of the possible answers: plug each possible answer into the utility function to get the change in utility.
  • Get the question score.
  • Return the top N questions.
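
A minimal sketch of this scoring loop, assuming each question carries a discrete answer distribution and that the plan's utility can be re-evaluated under a hypothetical answer. All names here (Question, plan_utility, top_questions) are illustrative, not RADAR's interfaces:

    from dataclasses import dataclass

    @dataclass
    class Question:
        """One uncertain variable; `answers` maps each possible answer
        to its current probability."""
        variable: str
        answers: dict

    def score_question(question, plan_utility):
        """Expected absolute change in plan utility if the question is answered:
        plug each possible answer into the utility function and weight the
        resulting change by that answer's probability."""
        baseline = plan_utility(assumption=None)
        return sum(prob * abs(plan_utility(assumption=(question.variable, ans)) - baseline)
                   for ans, prob in question.answers.items())

    def top_questions(questions, plan_utility, n=50):
        """Return the N highest-scoring questions."""
        return sorted(questions,
                      key=lambda q: score_question(q, plan_utility),
                      reverse=True)[:n]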

  10. Optimizer • Uses hill climbing to allocate resources • Searches for an assignment of resources with the greatest expected utility
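
A generic hill-climbing skeleton consistent with this slide; the neighborhood function (for example, moving one session to a different room) and the expected-utility callback are assumptions of the sketch, not RADAR's actual code:

    import random

    def hill_climb(assignment, expected_utility, neighbors, max_steps=10_000):
        """Stochastic hill climbing: repeatedly sample a neighboring assignment
        and keep it if it raises the expected utility."""
        best, best_score = assignment, expected_utility(assignment)
        for _ in range(max_steps):
            options = neighbors(best)
            if not options:
                break
            candidate = random.choice(options)
            score = expected_utility(candidate)
            if score > best_score:
                best, best_score = candidate, score
        return best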

  11. Related Work • Example critiquing [Burke et al.]: have users tweak the result set • Collaborative filtering [Resnick], [Hill et al.]: have the user rank related items • Similarity-based heuristics [Burke]: look at similar users' past ratings • Focusing on targeted use [Stolze]

  12. Related Work • Clustering utility functions [Chajewska] • Decision trees [Stolze and Ströbel] • Min-max regret [Boutilier]: choose the question that reduces the maximum regret • Auctions [Smith], [Boutilier], [Sandholm]

  13. What is different? • No bootstrapping • Continuous variables • Large number of uncertain variables • Tight integration with the optimizer • Integration of multiple approaches • Dynamic elicitation costs

  14. Example Domain: assigning rooms to conference sessions. • Rooms have properties. • Sessions have preferences, constraints, and importance values. • Each preference is a function from a room property to utility.

  15. Example Domain: room properties may be uncertain.
  • Room 1 can accommodate 200 people.
  • Room 3 has one projector: 80% chance; Room 3 has no projectors: 20% chance.

  16. Example Domain: constraints and importance values may also be uncertain.
  • Invited talk cannot be before 2 p.m.
  • Invited talk is more important than the poster session.
  • Invited talk is very important: 40% chance; moderately important: 60% chance.

  17. Example Domain: each preference is a function from a room property to utility, and the preference itself may be uncertain.
  • Capacity of Room 1 is 200.
  • Capacity preference: 150 people is the minimum, 200 people is acceptable, 250 people is best.
  • Capacity preference is [150, 200, 250]: 40% chance; [50, 100, 150]: 60% chance.
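
Such a preference can be sketched as a piecewise-linear map from a room property to utility. In this minimal rendering, the anchor utilities 0, 0.5, and 1 at the minimum, acceptable, and best capacities are an assumption of the sketch, not given on the slide:

    def capacity_preference(minimum, acceptable, best):
        """Piecewise-linear preference from room capacity to utility.
        Assumed anchors: 0 at the minimum, 0.5 at the acceptable
        capacity, 1.0 at or above the best capacity."""
        def utility(capacity):
            if capacity < minimum:
                return 0.0
            if capacity < acceptable:
                return 0.5 * (capacity - minimum) / (acceptable - minimum)
            if capacity < best:
                return 0.5 + 0.5 * (capacity - acceptable) / (best - acceptable)
            return 1.0
        return utility

    # The slide's uncertain preference: mix the two candidate functions
    # by their probabilities to get the expected utility of a capacity.
    candidates = [(0.4, capacity_preference(150, 200, 250)),
                  (0.6, capacity_preference(50, 100, 150))]
    expected_utility = lambda c: sum(p * f(c) for p, f in candidates)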

  18. Experiments: evaluation of RADAR. • 15 room properties • 88 rooms • 84 sessions • 2,500 variables • 700 uncertain values. The system was asked to provide the top 50 questions.

  19. Incremental elicitation. [Chart: schedule utility vs. number of questions asked (10 to 50). Incremental elicitation climbs from the optimizer's initial estimate of 0.58 to about 0.72, approaching the 0.78 achieved with fully certain information.]

  20. Completed work • Questions based on potential reduction of uncertainty • Empirical evaluation • Integration with RADAR

  21. Contributions • √ Fast computation of the expected impact of potential questions • Use of the optimizer to calculate more accurate question weights • Use of past elicitation results to improve the elicitation process • Unification of different elicitation strategies

  22. Search for optimal questions: best-first search, with the optimizer used as the heuristic function. Example search over an uncertain room size:
  • 100–150: 40% chance; 151–200: 60% chance (h = 20, max utility increase = 20)
  • 100–160: 50% chance; 160–200: 50% chance (h = 10, max utility increase = 30)
  • 100–130: 25% chance; 130–160: 25% chance (h = 15, max utility increase = 100)
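
A generic best-first search skeleton consistent with this slide, keeping a frontier ordered by the heuristic (here, the optimizer's estimate of the maximum utility increase from asking about a split). The node representation and the expansion function, which would split an uncertain range into finer sub-ranges, are assumptions:

    import heapq, itertools

    def best_first_search(root, expand, heuristic, budget=1000):
        """Expand the most promising node first; heuristic(node) would call
        the optimizer to estimate the utility increase of the node's question."""
        tie = itertools.count()                     # break ties in the heap
        frontier = [(-heuristic(root), next(tie), root)]
        best, best_h = root, heuristic(root)
        for _ in range(budget):
            if not frontier:
                break
            neg_h, _, node = heapq.heappop(frontier)
            if -neg_h > best_h:
                best, best_h = node, -neg_h
            for child in expand(node):              # e.g. finer splits of the range
                heapq.heappush(frontier, (-heuristic(child), next(tie), child))
        return best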

  23. Elicitation rules: encoding of elicitation heuristics, e.g. rule Uncertain-Auditorium-Size(room).
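
The proposal names this rule but does not give its syntax; as a speculative illustration only, such a heuristic might be rendered as a predicate that fires a question when a room's size is too uncertain. Every name below is hypothetical:

    def uncertain_auditorium_size(room, max_spread=50):
        """Hypothetical elicitation rule: if the room's capacity estimate
        is too wide, propose asking for the actual capacity."""
        low, high = room.capacity_range     # assumed (low, high) attribute
        if high - low > max_spread:
            return f"What is the seating capacity of {room.name}?"
        return None                         # rule does not fire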

  24. Learning of elicitation rules: derive rules based on past elicitations, e.g. rule Learned-Rule(room, event).

  25. Dynamic question costs √ • Same cost for all questions • Different cost for different question types • Learning of the question costs for each type • Learning of the question costs for each information source √
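
One simple way to realize learned, per-type (or per-source) costs, offered here purely as an illustrative assumption: keep a running average of observed costs and divide a question's expected benefit by that estimate.

    from collections import defaultdict

    class LearnedCosts:
        """Illustrative running-average cost model per question type
        (the same scheme works per information source)."""
        def __init__(self, default_cost=1.0):
            self.total = defaultdict(float)
            self.count = defaultdict(int)
            self.default_cost = default_cost

        def observe(self, question_type, cost):
            """Record the cost actually incurred in getting an answer."""
            self.total[question_type] += cost
            self.count[question_type] += 1

        def cost(self, question_type):
            """Running-average estimate, falling back to the default."""
            if self.count[question_type] == 0:
                return self.default_cost
            return self.total[question_type] / self.count[question_type]

    def cost_adjusted_score(expected_utility_change, question_type, costs):
        """Benefit per unit of elicitation cost (higher is better)."""
        return expected_utility_change / costs.cost(question_type)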

  26. Experiments: compare different approaches • Current system • Search for optimal questions • Hand-coded elicitation rules • Learned elicitation rules • Unified system • Human elicitor. Measure the utility gain after each answer; also evaluate running time.

  27. Timeline (Mar 2006 through Dec 2007): best-first search, syntax for rules, learning of rules, learning of costs, unified system, experiments, and writing. Intermediate milestones: July 2006, Nov 2006, Mar 2007, and Aug 2007.

  28. Addendum
