
Learning the Structure of Task-Oriented Conversations from the Corpus of In-Domain Dialogs






Presentation Transcript


  1. Learning the Structure of Task-Oriented Conversations from the Corpus of In-Domain Dialogs Ph.D. Thesis Defense Ananlada Chotimongkol Carnegie Mellon University, 18th December 2007 Thesis Committee: Alexander Rudnicky (Chair) William Cohen Carolyn Penstein Rosé Gokhan Tur (SRI International)

  2. Outline • Introduction • Structure of task-oriented conversations • Machine learning approaches • Conclusion

  3. problem| dialog structure | learning approaches | conclusion A spoken dialog system “When would you like to leave?” “I would like to fly to Seattle tomorrow.” Components: Speech Recognizer, Natural Language Understanding, Dialog Manager, Natural Language Generator, Speech Synthesizer, and Domain Knowledge (tasks, steps, domain keywords)

  4. problem| dialog structure | learning approaches | conclusion Problems in acquiring domain knowledge Domain Knowledge (tasks, steps, domain keywords) from example dialogs • Problems: • Require domain expertise • Subjective • May miss some cases (Yankelovich, 1997) • Time consuming (Bangalore et al., 2006)

  5. step1: reserve a flight step2: reserve a car step3: reserve a hotel problem| dialog structure | learning approaches | conclusion • Observable structure • Reflect domain information • Observable -> learnable? Task-oriented dialog Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  6. problem| dialog structure | learning approaches | conclusion Proposed solution (My Thesis): example dialogs → Domain Knowledge (tasks, steps, domain keywords), which a human revises → dialog system

  7. air travel dialogs Domain Knowledge task = create a travel itinerary steps = reserve a flight, reserve a hotel, reserve a car keywords = airline, city name, date problem| dialog structure | learning approaches | conclusion Learning system output

  8. problem| dialog structure | learning approaches | conclusion Thesis statement Investigate how to infer domain-specific information required to build a task-oriented dialog system from a corpus of in-domain conversations through an unsupervised learning approach

  9. problem| dialog structure | learning approaches | conclusion Thesis scope (1) • What to learn: domain-specific information in a task-oriented dialog • A list of tasks and their decompositions (travel reservation: flight, car, hotel) • Domain keywords (airline, city name, date) Investigate how to infer domain-specific information required to build a task-oriented dialog system from a corpus of in-domain conversations through an unsupervised learning approach

  10. problem| dialog structure | learning approaches | conclusion Thesis scope (2) • Resources: a corpus of in-domain conversations • Recorded human-human conversations are already available Investigate how to infer domain-specific information required to build a task-oriented dialog system from a corpus of in-domain conversations through an unsupervised learning approach

  11. problem| dialog structure | learning approaches | conclusion Thesis scope (3) • Learning approach: unsupervised learning • No training data available for a new domain • Annotating data is time consuming Investigate how to infer domain-specific information required to build a task-oriented dialog system from a corpus of in-domain conversations through an unsupervised learning approach

  12. problem| dialog structure | learning approaches | conclusion Proposed approach Investigate how to infer domain-specific information required to build a task-oriented dialog system from a corpus of in-domain conversations through an unsupervised learning approach • 2 research problems • Specify a suitable domain-specific information representation • Develop a learning approach that infers domain information captured by this representation from human-human dialogs

  13. Outline • Introduction • Structure of task-oriented conversations • Properties of a suitable dialog structure • Form-based dialog structure representation • Evaluation • Machine learning approaches • Conclusion

  14. problem| dialog structure : properties| learning approaches | conclusion Properties of a desired dialog structure • Sufficiency • Capture all domain-specific information required to build a task-oriented dialog system • Generality (domain-independent) • Able to describe task-oriented dialogs in dissimilar domains and types • Learnability • Can be identified by an unsupervised machine learning algorithm

  15. problem| dialog structure : properties | learning approaches | conclusion Domain-specific information in task-oriented dialogs • A list of tasks and their decompositions • Ex: travel reservation = flight + car + hotel • A compositional structure of a dialog based on the characteristics of a task • Domain keywords • Ex: airline, city name, date • The actual content of a dialog

  16. problem| dialog structure : properties | learning approaches | conclusion Existing discourse structures

  17. problem| dialog structure : form-based | learning approaches | conclusion Form-based dialog structure representation • Based on a notion of form (Ferrieux and Sadek, 1994) • A data representation used in the form-based dialog system architecture • Focus only on concrete information • Can be observed directly from in-domain conversations

  18. problem| dialog structure : form-based | learning approaches | conclusion Form-based representation components • Consists of 3 components • Task • Sub-task • Concept

  19. make a travel reservation Form-based representation components • Task • A subset of a dialog that has a specific goal Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  20. reserve a flight reserve a car reserve a hotel Form-based representation components • Sub-task • A step in a task that contributes toward the goal • Contains sufficient information to execute a domain action Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  21. Form-based representation components • Concept (domain keywords) • A piece of information required to perform an action Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  22. problem| dialog structure : form-based | learning approaches | conclusion Data representation • Represented by a form • A repository of related pieces of information necessary for performing an action

  23. reserve a flight • Form: flight query Data representation • Form = a repository of related pieces of information • Sub-task: contains sufficient information to execute a domain action → a form Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  24. Form: flight query • Form: car query • Form: hotel query Data representation • Form = a repository of related pieces of information • Task: a subset of a dialog that has a specific goal → a set of forms Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  25. Form: flight query • DepartCity: Pittsburgh • ArriveCity: Houston • ArriveState: Texas • DepartDate: February twentieth Data representation • Form = a repository of related pieces of information • Concept: a piece of information required to perform an action → a slot Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...
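The form/slot data representation above can be sketched in code. This is an illustrative sketch only; the class and field names are my own, not taken from the thesis:

```python
from dataclasses import dataclass, field

@dataclass
class Form:
    """A form: a repository of related slots (concepts) needed for one domain action."""
    name: str
    slots: dict = field(default_factory=dict)  # concept name -> value heard in the dialog

# Slots filled from the example dialog above (sub-task "reserve a flight")
flight_query = Form("flight query", {
    "DepartCity": "Pittsburgh",
    "ArriveCity": "Houston",
    "ArriveState": "Texas",
    "DepartDate": "February twentieth",
})

# A task corresponds to a set of forms, one per sub-task
travel_reservation = [flight_query, Form("car query"), Form("hotel query")]
```

A concept maps to a slot, a sub-task to a form, and a task to the whole set of forms, mirroring the three representation components.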

  26. problem| dialog structure : form-based | learning approaches | conclusion Form-based representation properties • Sufficiency • The form is already used in a form-based dialog system • Philips train timetable system (Aust et al., 1995) • CMU Communicator system (Rudnicky et al., 1999) • Generality (domain-independent) • A broader interpretation of the form is provided • The analysis of six dissimilar domains • Learnability • Components are observable directly from a dialog • (by human) annotation scheme reliability • (by machine) the accuracy of the domain information learned by the proposed approaches

  27. Outline • Introduction • Structure of task-oriented conversations • Properties of a suitable dialog structure • Form-based dialog structure representation • Evaluation • Dialog structure analysis (generality) • Annotation experiment (human learnability) • Machine learning approaches • Conclusion

  28. problem| dialog structure : analysis | learning approaches | conclusion Dialog structure analysis • Goal: • To verify that form-based representation can be applied to dissimilar domains • Approach: • Analyze 6 task-oriented domains • Air travel planning (information-accessing task) • Bus schedule inquiry (information-accessing task) • Map reading (problem-solving task) • UAV flight simulation (command-and-control task) • Meeting (personnel resource management) • Tutoring (physics essay revising)

  29. Map reading domain: route giver and route follower

  30. problem| dialog structure : analysis | learning approaches | conclusion Map reading domain (problem-solving task) • Task: draw a route on a map • Sub-task: draw a segment of a route • Concepts: StartLocation = {White_Mountain, Machete, …} Direction = {down, left, …} Distance = {a couple of centimeters, an inch, …} • Sub-task: ground a landmark • Concepts: LandmarkName = {White_Mountain, Machete, …} Location = {below the start, …}

  31. Dialog structure analysis (map reading domain) • Form: grounding • LandmarkName: missionary camp • Location: below the start • Form: segment description • StartLocation: start • Direction: left • Distance: an inch • Path: • EndLocation: GIVER1: okay ... ehm ... right, you have the start? FOLLOWER2: yeah. (action: (implicit) define_a_landmark) GIVER3: right, below the start do you have ... er like a missionary camp? FOLLOWER4: yeah. (action: define_a_landmark) GIVER5: okay, well ... if you take it from the start just run ... horizontally. FOLLOWER6: uh-huh. GIVER7: eh to the left for about an inch. FOLLOWER8: right. (action: draw_a_segment) GIVER9: and then go down along the side of the missionary camp. FOLLOWER10: uh-huh. GIVER11: 'til you're about an inch ... above the bottom of the map. FOLLOWER12: right. GIVER13: then you need to go straight along for about 'til about ...

  32. problem| dialog structure : analysis | learning approaches | conclusion UAV flight simulation domain (command-and-control task) • Task: take photos of the targets • Sub-task: take a photo of each target • Sub-subtask: control a plane • Concepts: Altitude = {2700, 3300, …} Speed = {50 knots, 200 knots, …} Destination = {H-area, SSTE, …} • Sub-subtask: ground a landmark • Concepts: • LandmarkName = {H-area, SSTE, …} LandmarkType = {target, waypoint}

  33. problem| dialog structure : analysis | learning approaches | conclusion Meeting domain • Task: manage resources for a new employee • Sub-task: get a computer • Concepts: Type = {desktop, laptop, …} Brand = {IBM, Dell, …} • Sub-task: get office space • Sub-task: create an action item • Concepts: Description = {have a space, …} Person = {Hardware Expert, Building Expert, …} StartDate = {today, …} EndDate = {the fourteenth of december, …}

  34. problem| dialog structure : analysis | learning approaches | conclusion Characteristics of form-based representation • Focus only on concrete information • That is observable directly from in-domain conversations • Describe a dialog with a simple model • Pros: • Can be learned by an unsupervised learning approach • Cons: • Can’t capture information that is not clearly expressed in a dialog • Omitted concept values • Nevertheless, 93% of dialog content can be accounted for • Can’t model a complex dialog that has a dynamic structure • A tutoring domain • But it is good enough for many real-world applications

  35. problem| dialog structure : analysis | learning approaches | conclusion Form-based representation properties (revisited) • Sufficiency • The form is already used in a form-based dialog system • Can account for 93% of dialog content • Generality (domain-independent) • A broader interpretation of the form representation is provided • Can represent 5 out of 6 disparate domains • Learnability • Components are observable directly from a dialog • (by human) annotation scheme reliability • (by machine) the accuracy of the domain information learned by the proposed approaches

  36. problem| dialog structure : annotation experiment | learning approaches | conclusion Annotation experiment • Goal • To verify that the form-based representation can be understood and applied by other annotators • Approach • Conduct an annotation experiment with non-expert annotators • Evaluation • Similarity between annotations • Accuracy of annotations

  37. problem| dialog structure : annotation experiment | learning approaches | conclusion Challenges in annotation comparison • Different tagsets may be used since annotators have to design their own tagsets • Some differences are acceptable if they conform to the guideline • Different dialog structure designs can generate dialog systems with the same functionalities

  38. Cross-annotator correction • Each annotator creates his/her own tagset and then annotates the dialogs • Each annotator critiques and corrects the other annotator’s work • Compare the original annotation with the corrected one (Diagram: Annotator 1 annotates dialog A with tagset 1 and Annotator 2 corrects it; Annotator 2 annotates dialog A with tagset 2 and Annotator 1 corrects it; the original and corrected annotations of each dialog are compared directly and across annotators)

  39. problem| dialog structure : annotation experiment | learning approaches | conclusion Annotation experiment • 2 domains • Air travel planning domain (information-accessing task) • Map reading domain (problem-solving task) • 4 subjects in each domain • People who are likely to use the form-based representation in the future • Each subject has to • Design a tagset and annotate the structure of dialogs • Critique other subjects’ annotation on the same set of dialogs

  40. Evaluation metrics • Annotation similarity • Acceptability is the degree to which an original annotation is acceptable to a corrector • Annotation accuracy • Accuracy is the degree to which a subject’s annotation is acceptable to an expert
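One way to operationalize the acceptability metric above is the fraction of originally annotated units the corrector leaves unchanged. This is an illustrative definition of my own; the thesis may compute the metric differently:

```python
def acceptability(original, corrected):
    """Fraction of the original annotation units the corrector left unchanged."""
    if not original:
        return 0.0
    kept = sum(1 for o, c in zip(original, corrected) if o == c)
    return kept / len(original)

# Example: the corrector changed one of five concept tags
score = acceptability(["City", "Date", "City", "Hotel", "Airline"],
                      ["City", "Date", "City", "Area", "Airline"])
print(score)  # 0.8
```

Accuracy can be computed the same way, with an expert's corrected annotation as the reference.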

  41. problem| dialog structure : annotation experiment | learning approaches | conclusion Annotation results • High acceptability and accuracy • Except task/sub-task accuracy in map reading domain • Concepts can be annotated more reliably than tasks and sub-tasks • Smaller units • Have to be communicated clearly

  42. problem| dialog structure : annotation experiment | learning approaches | conclusion Form-based representation properties (revisited) • Sufficiency • The form is already used in a form-based dialog system • Can account for 93% of dialog content • Generality (domain-independent) • A broader interpretation of the form representation is provided • Can represent 5 out of 6 disparate domains • Learnability • Components are observable directly from a dialog • Can be applied reliably by other annotators in most of the cases • (by machine) the accuracy of the domain information learned by the proposed approaches

  43. Outline • Introduction • Structure of task-oriented conversations • Machine learning approaches • Conclusion

  44. problem| dialog structure| learning approaches | conclusion Overview of learning approaches • Divide into 2 sub-problems • Concept identification • What are the concepts? • What are their members? • Form identification • What are the forms? • What are the slots (concepts) in each form? • Use unsupervised learning approaches • Acquisition (not recognition) problem

  45. Form: flight query • DepartCity: Pittsburgh • ArriveCity: Houston • ArriveState: Texas • ArriveAirport: Intercontinental • Form: car query • PickupLocation: Houston • PickupTime: • ReturnTime: • Form: hotel query • City: Houston • Area: Downtown • HotelName: problem| dialog structure| learning approaches | conclusion Learning example Client: I'D LIKE TO FLY TO HOUSTON TEXAS Agent : AND DEPARTING PITTSBURGH ON WHAT DATE ? Client: DEPARTING ON FEBRUARY TWENTIETH ... Agent : DO YOU NEED A CAR ? Client : YEAH Agent : THE LEAST EXPENSIVE RATE I HAVE WOULD BE WITH THRIFTY RENTAL CAR FOR TWENTY THREE NINETY A DAY Client : OKAY Agent : WOULD YOU LIKE ME TO BOOK THAT CAR FOR YOU ? Client : YES ... Agent : OKAY AND WOULD YOU NEED A HOTEL WHILE YOU'RE IN HOUSTON ? Client : YES Agent : AND WHERE AT IN HOUSTON ? Client : /UM/ DOWNTOWN Agent : OKAY Agent : DID YOU HAVE A HOTEL PREFERENCE ? ...

  46. Outline • Introduction • Structure of task-oriented conversations • Machine learning approaches • Concept identification • Form identification • Conclusion

  47. problem| dialog structure| learning approaches : concept identification | conclusion Concept identification • Goal: Identify domain concepts and their members • City={Pittsburgh, Boston, Austin, …} • Month={January, February, March, …} • Approach: word clustering algorithm • Identify concept words and group the similar ones into the same cluster

  48. problem| dialog structure| learning approaches : concept identification | conclusion Word clustering algorithms • Use word co-occurrence statistics • Mutual information (MI-based) • Kullback-Leibler distance (KL-based) • Iterative algorithms need a stopping criterion • Use information that is available during the clustering process • Mutual information (MI-based) • Distance between clusters (KL-based) • Number of clusters
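As an illustration of the co-occurrence statistics these algorithms rely on, the toy sketch below (my own code, not the thesis implementation) compares words by a symmetrised Kullback-Leibler distance between their immediate-context distributions; words of the same concept, such as city names, come out closer to each other than to unrelated words:

```python
import math
from collections import Counter, defaultdict

def context_counts(utterances):
    """Count the immediate left and right neighbours of every word."""
    ctx = defaultdict(Counter)
    for utt in utterances:
        toks = utt.split()
        for i, w in enumerate(toks):
            if i > 0:
                ctx[w][toks[i - 1]] += 1
            if i < len(toks) - 1:
                ctx[w][toks[i + 1]] += 1
    return ctx

def kl_distance(p_counts, q_counts, eps=1e-6):
    """Symmetrised, smoothed KL divergence between two context distributions."""
    vocab = set(p_counts) | set(q_counts)
    pn, qn = sum(p_counts.values()), sum(q_counts.values())
    d = 0.0
    for w in vocab:
        p = (p_counts[w] + eps) / (pn + eps * len(vocab))
        q = (q_counts[w] + eps) / (qn + eps * len(vocab))
        d += p * math.log(p / q) + q * math.log(q / p)
    return d

utts = ["fly to PITTSBURGH on monday", "fly to BOSTON on friday",
        "leave BOSTON on monday", "fly to HOUSTON on monday"]
ctx = context_counts(utts)
# City names share contexts ("fly to _ on"), so they are mutually closer
# than they are to a verb like "fly"; a clusterer would merge them first.
print(kl_distance(ctx["PITTSBURGH"], ctx["BOSTON"]))
print(kl_distance(ctx["PITTSBURGH"], ctx["fly"]))
```

An agglomerative clusterer would repeatedly merge the closest pair of clusters under this distance, which is where the stopping criterion discussed above comes in.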

  49. problem| dialog structure| learning approaches : concept identification | conclusion Clustering evaluation • Allow more than one cluster to represent a concept • To discover as many concept words as possible • However, the clustering result that doesn’t contain split concepts is preferred • Quality score (QS) = harmonic mean of • Precision (purity) • Recall (completeness) • Singularity Score (SS) SS of concept j =
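The quality score combines the three component scores by a harmonic mean. A minimal sketch of that combination (the singularity-score formula itself is not reproduced here; its value is simply taken as an input):

```python
def quality_score(precision, recall, singularity):
    """Harmonic mean of precision, recall, and the singularity score."""
    vals = [precision, recall, singularity]
    if min(vals) <= 0:
        return 0.0  # harmonic mean collapses to zero if any component is zero
    return len(vals) / sum(1.0 / v for v in vals)

print(quality_score(1.0, 1.0, 1.0))            # 1.0
print(round(quality_score(0.9, 0.6, 1.0), 3))  # 0.794
```

As with F-measure, the harmonic mean penalizes a clustering that scores well on two components but poorly on the third.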

  50. problem| dialog structure| learning approaches : concept identification | conclusion Concept clustering results • Domain concepts can be identified with acceptable accuracy • Example clusters • {GATWICK, CINCINNATI, PHILADELPHIA, L.A., ATLANTA} • {HERTZ, BUDGET, THRIFTY} • Low recall for infrequent concepts • An automatic stopping criterion yields close to optimal results
