
Information Elicitation in Scheduling Problems

Explore the significance of information elicitation in scheduling problems to improve optimization results by reducing uncertainty, with a focus on resource allocation and preference identification. Learn how asking the right questions can lead to more efficient schedules.




Presentation Transcript


  1. Information Elicitation in Scheduling Problems. Ulaş Bardak, Ph.D. Thesis Defense. Committee: Jaime Carbonell (chair), Eugene Fink, Stephen Smith, and Sven Koenig (University of Southern California).

  2. Outline • Introduction • Related work • Domain • Optimization • Elicitation • Evaluation • Conclusions

  3. What is information elicitation? For example…

  4. Why elicitation? • Scheduling problems include information about resources, constraints, and preferences • Uncertain information can lower the quality of schedules • We need to select and ask questions that help to reduce uncertainty

  5. Example problem • We are organizing a small conference, using three available rooms • We have incomplete information about speaker needs

  6. Initial schedule. [Diagram: three available rooms, with the talk and the poster session placed.] Requests (by importance): • Invited talk, 9–10am: needs a large room • Poster session, 9–11am: needs a room. Missing info: • Invited talk: projector need • Poster session: room size, projector need. Assumptions: • Invited talk: needs a projector • Poster session: a smaller room is OK, needs no projector.

  7. Choice of questions. [Diagram: the initial schedule with the posters and the talk assigned to rooms.] Requests: • Invited talk, 9–10am: needs a large room • Poster session, 9–11am: needs a room. Candidate questions: • Invited talk: needs a projector? (useless info: there are no large rooms without a projector) • Poster session: how big a room? needs a projector? (potentially useful info)

  8. Improved schedule. Requests: • Invited talk, 9–10am: needs a large room • Poster session, 9–11am: needs a room. Info elicitation: System: Does the poster session need a projector? How big a room does it need? User: A projector may be useful. A small room is OK. [Diagram: the new schedule reassigns the talk and the posters based on the answers.]

  9. Motivation: Improve optimization results by reducing uncertainty of the available knowledge.

  10. Related work • Example critiquing (Burke): have users tweak the result set • Collaborative filtering (Resnick and Hill): have the user rank related items • Similarity-based heuristics (Burke): look at similar users' past ratings • Focusing on targeted use (Stolze)

  11. Related work • Clustering utility functions (Chajewska) • Decision trees (Stolze and Ströbel) • Min-max regret (Boutilier): choose the question that reduces the maximum regret • Auctions (Smith, Boutilier, and Sandholm)

  12. What is different? • No bootstrapping • Both continuous and discrete variables • Large number of uncertain variables • Tight integration with the optimizer • Synergy of multiple approaches

  13. Explored domains • An academic conference: assigning rooms to sessions • Placing vendor orders: assigning orders to sessions • Social networking: matching users to other users

  14. Selected domain • Scheduling a conference • Rooms are our resources • We need to assign rooms to sessions

  15. Collaborative scheduling. Manual operations: • Edit resources and constraints • Modify the schedule • Provide advice to the system. [Diagram: the user invokes the auto scheduling; the system returns control to the user.]

  16. Collaborative scheduling. Automatic operations: • Process new data and advice • Optimize the schedule • Generate and send questions to the user. Manual operations: • Edit resources and constraints • Modify the schedule • Provide advice to the system. [Diagram: the user invokes the auto scheduling; the system returns control to the user.]

  17. Architecture. [Diagram: a graphical user interface connects the administrator to the top-level control and learning module, which works with the representation, the optimizer (optimize the schedule), and the info elicitor (choose questions, process new info).]

  18. Rooms • Rooms have a set of properties: size, seating capacity, ..., microphones, projectors, ... • We also know distances between rooms. Example: Room 1 is 2000 square feet and has one projector. Room 1 is 400 feet away from Room 3.

  19. Sessions. A session description includes • Importance • Hard constraints, such as the minimal acceptable room size • Soft preferences, such as the desired room size. Example: The invited talk is more important than the poster session. The assigned room has to be at least 500 square feet, and preferably 1000 square feet.

  20. Sessions. We represent preferences by piecewise-linear functions. [Plot: quality (0 to 1.0) as a piecewise-linear function of room size; rooms below 500 square feet are unacceptable, and quality rises to 1.0 at 1000 square feet.] The invited talk is more important than the poster session. The assigned room has to be at least 500 square feet, and preferably 1000 square feet.
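
To make this concrete, here is a minimal sketch of how such a piecewise-linear preference could be evaluated, assuming the preference is given as (value, quality) breakpoints with linear interpolation between them and values below the first breakpoint treated as unacceptable. The breakpoint qualities are illustrative assumptions, not values taken from the thesis.

```python
from bisect import bisect_right

def piecewise_linear(points):
    """Build a preference function from (value, quality) breakpoints.

    Values below the first breakpoint are unacceptable (quality 0.0);
    values above the last breakpoint keep the last quality.
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]

    def quality(x):
        if x < xs[0]:
            return 0.0                              # hard constraint violated
        if x >= xs[-1]:
            return ys[-1]
        i = bisect_right(xs, x) - 1
        t = (x - xs[i]) / (xs[i + 1] - xs[i])       # linear interpolation
        return ys[i] + t * (ys[i + 1] - ys[i])

    return quality

# Invited talk: at least 500 sq ft, preferably 1000 sq ft (breakpoint qualities are illustrative).
room_size_pref = piecewise_linear([(500, 0.5), (1000, 1.0)])
print(room_size_pref(400), room_size_pref(750), room_size_pref(2000))  # 0.0 0.75 1.0
```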

  21. Uncertainty. We usually have incomplete knowledge of room properties, session importances, and constraints and preferences.

  22. Uncertain properties. We represent an uncertain value as either • a completely unknown value, or • a probability density function, approximated by a set of uniform distributions.

  23. Uncertain properties. Example: An auditorium has about 600 seats. 0.2 chance: [450..549] seats, 0.6 chance: [550..650] seats, 0.2 chance: [651..750] seats. [Plot: the corresponding step-shaped probability density over capacity.]
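
A minimal sketch of this representation, assuming an uncertain value is stored as a list of (probability, low, high) uniform pieces; the class name and methods are hypothetical, used only to illustrate computing an expected value and drawing a sample.

```python
import random

class UncertainValue:
    """An uncertain numeric property: a set of uniform pieces with probabilities."""

    def __init__(self, pieces):
        # pieces: list of (probability, low, high); probabilities should sum to 1
        self.pieces = pieces

    def expected(self):
        return sum(p * (lo + hi) / 2.0 for p, lo, hi in self.pieces)

    def sample(self):
        r = random.random()
        for p, lo, hi in self.pieces:
            if r < p:
                return random.uniform(lo, hi)
            r -= p
        _, lo, hi = self.pieces[-1]        # guard against floating-point round-off
        return random.uniform(lo, hi)

# "An auditorium has about 600 seats."
capacity = UncertainValue([(0.2, 450, 549), (0.6, 550, 650), (0.2, 651, 750)])
print(capacity.expected())   # 600.0
```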

  24. Uncertain preferences. We represent an uncertain preference as • a completely unknown function, • a piecewise-linear function with uncertain y-coordinates of endpoints, or • a set of possible piecewise-linear functions with associated probabilities.

  25. Uncertain preferences. Example: The description of a demo session does not include a room-size preference. Demo sessions usually require at least 250 square feet, and preferably 750 square feet; however, there is a 5% chance that a big sponsor shows up unexpectedly and asks for an additional 250 square feet. [Plot: two possible quality curves over room size, with probabilities 0.95 and 0.05.]
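
A small sketch of the third representation (a set of possible preference functions with probabilities), using simple linear ramps in place of general piecewise-linear functions; the 5% sponsor scenario from the slide shifts the ramp up by 250 square feet. The names and the expected-quality combination rule are illustrative assumptions.

```python
def ramp(lo, hi):
    """Quality 0.0 below lo (unacceptable), rising linearly to 1.0 at hi."""
    def quality(x):
        if x < lo:
            return 0.0
        return min(1.0, (x - lo) / (hi - lo))
    return quality

# Demo session: usually needs 250..750 sq ft, but with a 5% chance a sponsor
# shows up and the whole preference shifts up by 250 sq ft.
uncertain_pref = [(0.95, ramp(250, 750)), (0.05, ramp(500, 1000))]

def expected_quality(pref, room_size):
    """Expected preference satisfaction over the possible preference functions."""
    return sum(p * q(room_size) for p, q in pref)

print(expected_quality(uncertain_pref, 600))  # 0.95*0.70 + 0.05*0.20 = 0.675
```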

  26. Optimization. The optimizer assigns rooms to sessions. • Input: rooms and sessions • Output: room and time for each session

  27. Session quality • The quality value of a session is based on how much each preference is satisfied • Uncertainty is taken into account when calculating quality

  28. Schedule quality • The overall schedule quality is a weighted sum of session quality values • If any session violates hard constraints, the whole schedule is unacceptable
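
A sketch of that quality computation, assuming each session carries an importance weight and a quality function that returns None when a hard constraint is violated; the data layout is hypothetical.

```python
def schedule_quality(sessions, assignment):
    """Weighted sum of session qualities, or None if any hard constraint is violated.

    sessions: dict name -> (importance, quality_fn); quality_fn maps the assigned
    room to a quality in [0, 1], or to None when the room violates a hard constraint.
    assignment: dict name -> room.
    """
    total_weight = sum(imp for imp, _ in sessions.values())
    total = 0.0
    for name, (imp, quality_fn) in sessions.items():
        q = quality_fn(assignment.get(name))
        if q is None:
            return None   # one violated hard constraint makes the whole schedule unacceptable
        total += imp * q
    return total / total_weight
```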

  29. Optimizer • The simple version is based on hill-climbing • The advanced version uses randomized hill-climbing, similar to simulated annealing
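
A generic sketch of randomized hill-climbing with an annealing-style acceptance rule; the move generator, acceptance probability, and fixed temperature are assumptions for illustration, not the thesis's exact optimizer.

```python
import math
import random

def randomized_hill_climb(initial, neighbors, quality, steps=10_000, temperature=0.1):
    """Mostly move uphill, but occasionally accept a worse neighbor with a
    probability that shrinks as the quality drop grows."""
    current, current_q = initial, quality(initial)
    best, best_q = current, current_q
    for _ in range(steps):
        # neighbors(state) is assumed to return a non-empty list of candidate schedules
        candidate = random.choice(neighbors(current))
        q = quality(candidate)
        if q >= current_q or random.random() < math.exp((q - current_q) / temperature):
            current, current_q = candidate, q
        if current_q > best_q:
            best, best_q = current, current_q
    return best, best_q
```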

  30. Elicitation • We use elicitation to reduce uncertainty • The user can selectively answer questions

  31. Elicitation. [Diagram: the heuristic, rule-based, and search elicitors form the synergetic elicitor: all potential questions are scored, merged into a single list, and re-ranked.]
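
A sketch of how the merge and re-rank stages could fit together, assuming the heuristic and rule-based elicitors produce cheap scores for every candidate question while the search elicitor re-ranks only the top of the merged list; the combination rule (summing scores, re-ranking the top k) is an assumption.

```python
def synergetic_rank(candidates, heuristic_score, rule_score, search_rerank, top_k=10):
    """Score all candidate questions with the fast elicitors, merge into one list,
    then let the more expensive search elicitor re-rank the best few."""
    merged = sorted(candidates,
                    key=lambda q: heuristic_score(q) + rule_score(q),
                    reverse=True)
    return search_rerank(merged[:top_k]) + merged[top_k:]
```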

  32. Heuristic elicitor • Selection of questions based on the standard deviation of schedule quality • Fast calculation, once per variable • Domain-independent

  33. Heuristic elicitor: (1) each uncertain variable is a potential question, giving the list of questions; (2) for each question, plug the possible answers into the quality function to determine their impact on schedule quality; (3) calculate a score for each question; (4) return the top questions.
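
A minimal sketch of that scoring loop, assuming each question comes with a discrete set of (probability, value) answers and a callback that recomputes schedule quality with one answer plugged in. Scoring by the standard deviation of the resulting qualities follows the slide; the exact formula in the thesis may include further factors.

```python
def question_score(possible_answers, quality_given_value):
    """Score a question by the spread (standard deviation) of schedule quality
    over its possible answers.

    possible_answers: list of (probability, value) pairs for the uncertain variable.
    quality_given_value: value -> schedule quality with that value plugged in,
    the rest of the schedule held fixed.
    """
    qualities = [(p, quality_given_value(v)) for p, v in possible_answers]
    mean = sum(p * q for p, q in qualities)
    variance = sum(p * (q - mean) ** 2 for p, q in qualities)
    return variance ** 0.5

def top_questions(questions, k=5):
    """questions: list of (question_id, possible_answers, quality_given_value)."""
    ranked = sorted(questions, key=lambda q: question_score(q[1], q[2]), reverse=True)
    return [qid for qid, _, _ in ranked[:k]]
```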

  34. Rule-based elicitor. Selection of additional questions, based on domain-specific heuristics, such as “Room capacity is more important than ceiling height.”
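
A tiny sketch of domain-specific rules expressed as score adjustments; the rule contents and the question fields (property, session_importance, base_score) are made up for illustration.

```python
# Domain rules as (predicate, bonus) pairs, e.g. prefer asking about room
# capacity over ceiling height. Rule contents here are illustrative only.
RULES = [
    (lambda q: q["property"] == "capacity", 2.0),
    (lambda q: q["property"] == "ceiling_height", -1.0),
    (lambda q: q["session_importance"] >= 0.8, 1.0),
]

def rule_based_score(question):
    """Add domain-specific bonuses and penalties to a question's base score."""
    score = question.get("base_score", 0.0)
    for applies, bonus in RULES:
        if applies(question):
            score += bonus
    return score
```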

  35. Search elicitor • Ranks selected questions using B* search • Relies on the optimizer for evaluating nodes in the search space • Domain-independent and optimizer-independent

  36. Example: uncertain room size versus uncertain projector number. [Search tree: each node refines the uncertain room-size range (e.g., 100–150 vs. 151–200) or the projector count (0–1 vs. 2–3) and carries lower and upper bounds on schedule quality.] The minimal possible utility of asking about the room size is greater than the maximal possible utility of asking about the number of projectors.
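
The comparison the search relies on can be sketched as a simple interval dominance test; the bound values below are taken from the slide's example tree, and the function name is illustrative.

```python
def dominates(bounds_a, bounds_b):
    """Question A certainly beats question B if the minimal possible utility of
    asking A exceeds the maximal possible utility of asking B."""
    min_a, _ = bounds_a
    _, max_b = bounds_b
    return min_a > max_b

# Room-size question bounds vs. projector-count question bounds (from the example tree):
print(dominates((0.28, 0.33), (0.10, 0.25)))  # True: no further search is needed
```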

  37. Evaluation. The synergetic elicitor is far more effective than each of its individual components, simple heuristics, and random selection of questions.

  38. Evaluation. Four scenarios with 88 sessions, from less to more complex: • 10 rooms, 100 uncertain values • 20 rooms, 500 uncertain values • 50 rooms, 1000 uncertain values • 84 rooms, 3300 uncertain values

  39. Evaluation. For each setting, we use five different elicitation systems: • Synergetic elicitor • Heuristic & rule-based • Search & rule-based • Rule-based • Random

  40. Evaluation. We plot: • Change in the schedule quality • Change in the quality loss due to uncertainty (100% → 0%)

  41. Evaluation: 100 variables. [Chart: percentage of questions needed to reach 85% of full quality across the five systems: 12.5%, 15%, 33%, 70%, and 80%.]

  42. Evaluation: 3300 variables. [Chart: percentage of questions needed to reach 85% of full quality across the five systems: 17.5%, 26%, 33%, 44%, and 45%.]

  43. Evaluation. [Chart: percentage of questions needed to reach 95% of full quality by problem size: 98% at 100 questions, 73% at 500, 42% at 1000, and 18% at 3400.]

  44. Summary • We have applied the elicitor to conference scheduling • The synergetic elicitor outperforms its components and simple heuristics • The improvement is more prominent for larger problems

  45. Contributions. We have investigated a novel approach to information elicitation, which has led to three main contributions: • Fast heuristic computation of the expected utility of potential questions • Use of B* search for determining more accurate question utilities • Synergy of domain-independent and domain-specific elicitation techniques

  46. Future work • Learning question costs • Learning elicitation strategies

  48. Additional Slides

  49. Vendor elicitation: domain • Sessions can require services that external vendors provide, e.g. mobile equipment, food deliveries • Each item can satisfy multiple services, e.g. Laptop → Computer, Portable computer • There is a penalty for spending money • A vendor optimizer finds a near-optimal placement of vendor orders • Uncertainty can exist in prices and availability of items

  50. Vendor elicitation: elicitation algorithm • Enumerate all of the services • Order them by their effect on the overall cost penalty
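
A rough sketch of that ordering, assuming each uncertain service can be bracketed by the cost penalty under its best and worst resolution; the bracketing functions are hypothetical.

```python
def rank_vendor_questions(services, best_penalty, worst_penalty):
    """Enumerate the uncertain services and order them by how much the overall
    cost penalty could swing once their price and availability are known."""
    impact = {s: worst_penalty(s) - best_penalty(s) for s in services}
    return sorted(services, key=impact.get, reverse=True)
```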
