The Role of Semantic Roles in Disambiguating Verb Senses. Hoa Trang Dang and Martha Palmer 2005. Proceedings of the 43rd Annual Meeting of the ACL, pages 42–49.
Verbs are syntactically complex, and their syntax is thought to be determined by their underlying semantics (Grimshaw, 1990; Levin, 1993). • Disambiguation of verb senses can be further improved with better extraction of semantic roles.
Basic Automatic System • WordNet: word senses. • PropBank: semantic role labels. • Mallet: for learning maximum entropy models with Gaussian priors. • Senseval-2: the system was tested on the test instances of the 29 verbs from the English lexical sample task of Senseval-2.
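The paper trains its classifiers with Mallet; as a rough sketch of the same setup, scikit-learn's `LogisticRegression` can stand in, since multinomial logistic regression is equivalent to a maximum entropy model and L2 regularization corresponds to a Gaussian prior on the weights. The feature names, sense labels, and training instances below are invented toy data for the verb "call", not examples from the paper.

```python
# Sketch of a maxent word-sense classifier with a Gaussian (L2) prior.
# scikit-learn's LogisticRegression stands in for Mallet here; the toy
# instances and sense labels are invented for illustration.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Each instance: a set of binary features paired with a sense label.
train = [
    ({"w0=called", "obj=meeting"}, "call.v.03"),
    ({"w0=called", "obj=name"}, "call.v.01"),
    ({"w0=calls", "obj=strike"}, "call.v.03"),
    ({"w0=called", "obj=him"}, "call.v.01"),
]
vec = DictVectorizer()
X = vec.fit_transform([{f: 1 for f in feats} for feats, _ in train])
y = [label for _, label in train]

# C is the inverse variance of the Gaussian prior on the weights.
clf = LogisticRegression(C=1.0).fit(X, y)
test = vec.transform([{"w0=called": 1, "obj=vote": 1}])
predicted = clf.predict(test)[0]
```

The Gaussian prior smooths the weights, which matters here because most sense-tagging features are rare.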
Basic Automatic System • Topical features • Local features • Collocation features • Syntactic features • Semantic features
Topical Features • Topical features for a verb in a sentence look for the presence of keywords occurring anywhere in the sentence and any surrounding sentences provided as context. • The set of keywords is specific to each verb lemma to be disambiguated.
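The topical features reduce to binary indicators for verb-specific keywords anywhere in the context. A minimal sketch, with an invented keyword list for "call":

```python
# Sketch of the topical features: binary indicators for verb-specific
# keywords occurring anywhere in the sentence or surrounding context.
# The keyword set for "call" is invented for illustration.
KEYWORDS = {"call": {"phone", "meeting", "name", "strike", "vote"}}

def topical_features(lemma, context_tokens):
    """Return one feature per keyword of `lemma` present in the context."""
    present = KEYWORDS.get(lemma, set()) & {t.lower() for t in context_tokens}
    return {f"kw={k}" for k in present}

feats = topical_features("call", "They called a vote on the strike".split())
```

In the real system the keyword set is derived per verb lemma from the training data rather than hand-listed.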
Local Features • Collocational features: • unigrams: words w-2, w-1, w0, w+1, w+2 • parts of speech p-2, p-1, p0, p+1, p+2 • bigrams: w-2w-1, w-1w+1, w+1w+2; p-2p-1, p-1p+1, p+1p+2 • trigrams: w-3w-2w-1, w-2w-1w+1, w-1w+1w+2, w+1w+2w+3; p-3p-2p-1, p-2p-1p+1, p-1p+1p+2, p+1p+2p+3
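The window offsets above translate directly into feature templates. A minimal sketch of such an extractor, where the feature-name scheme, padding token, and example tokens/tags are my own choices:

```python
# Sketch of the collocational feature templates: word and POS unigrams,
# bigrams, and trigrams in a window around the target verb (offset 0).
# Feature names and padding are illustrative, not the paper's scheme.
def colloc_features(words, tags, i):
    """Collocational features for the target at index i."""
    feats = {}
    def w(off):
        j = i + off
        return words[j] if 0 <= j < len(words) else "<pad>"
    def p(off):
        j = i + off
        return tags[j] if 0 <= j < len(tags) else "<pad>"
    # unigrams: w-2 .. w+2 and p-2 .. p+2
    for off in (-2, -1, 0, 1, 2):
        feats[f"w{off:+d}={w(off)}"] = 1
        feats[f"p{off:+d}={p(off)}"] = 1
    # bigrams and trigrams skip over the target position itself
    for offs in [(-2, -1), (-1, 1), (1, 2),
                 (-3, -2, -1), (-2, -1, 1), (-1, 1, 2), (1, 2, 3)]:
        key = "_".join(f"{o:+d}" for o in offs)
        feats["w" + key + "=" + "_".join(w(o) for o in offs)] = 1
        feats["p" + key + "=" + "_".join(p(o) for o in offs)] = 1
    return feats

words = ["Mr.", "Bush", "has", "called", "for", "an", "agreement"]
tags  = ["NNP", "NNP", "VBZ", "VBN", "IN", "DT", "NN"]
feats = colloc_features(words, tags, 3)
```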
Local Features • Syntactic features • Is the sentence passive? • Is there a subject, direct object, indirect object, or clausal complement? • What is the word (if any) that is the particle or head of the subject, direct object, or indirect object? • If there is a PP complement, what is the preposition, and what is the object of the preposition?
Local Features • Semantic features: • What is the Named Entity tag (PERSON, ORGANIZATION, LOCATION, UNKNOWN) for each proper noun in the syntactic positions above? • What are the possible WordNet synsets and hypernyms for each noun in the syntactic positions above?
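The WordNet features generalize the head nouns by emitting every hypernym on the path to the root. A real system would query WordNet itself (e.g. via `nltk.corpus.wordnet`); the tiny hand-made hypernym table and role name below are placeholders for illustration:

```python
# Sketch of the WordNet semantic-class features: for a noun in a core
# syntactic position, emit its possible synsets and all hypernyms.
# The hypernym table and synset names below are a hand-made fragment,
# not real WordNet output.
HYPERNYMS = {
    "agreement.n.01": ["statement.n.01"],
    "statement.n.01": ["message.n.02"],
    "message.n.02": ["communication.n.02"],
    "communication.n.02": [],
}
SYNSETS = {"agreement": ["agreement.n.01"]}

def semantic_features(role, noun):
    """One feature per synset/hypernym of `noun`, tagged with its role."""
    feats = set()
    frontier = list(SYNSETS.get(noun, []))
    while frontier:
        syn = frontier.pop()
        feats.add(f"{role}-class={syn}")
        frontier.extend(HYPERNYMS.get(syn, []))
    return feats

feats = semantic_features("pp-obj", "agreement")
```

Because every hypernym fires as a feature, two verbs with different but semantically related objects can share evidence.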
Evaluation • Accuracy of the system on Senseval-2 verbs, using topical features and different subsets of local features (co=collocational, syn=syntactic, sem=semantic). • This system: 62.5% accuracy. • Lee and Ng (2002): 61.1% accuracy.
Evaluation • Accuracy of the system on Senseval-2 verbs, using topical features and different subsets of semantic class features (ne=named entity tags, wn=WordNet classes).
PropBank • PropBank is a corpus in which verbs are annotated with semantic tags, including coarse-grained sense distinctions and predicate-argument structures. • Example: [ARG0 Mr. Bush] has [rel called] [ARG1-for for an agreement by next September at the latest]
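An annotation like the example above can be turned into role features for the classifier: one feature per argument label, plus the head word of each argument. The bracket parsing and the last-word head heuristic below are simplifications of my own, not the paper's extraction procedure:

```python
# Sketch of deriving semantic-role features from a PropBank-style
# bracketed annotation. The regex parsing and "last word as head"
# heuristic are illustrative simplifications.
import re

def propbank_features(annotated):
    """One feature per argument label, plus each argument's head word."""
    feats = set()
    for label, text in re.findall(r"\[(\S+) ([^\]]+)\]", annotated):
        if label == "rel":          # skip the predicate itself
            continue
        feats.add(f"arg={label}")
        head = text.split()[-1]     # crude head: last word of the span
        feats.add(f"{label}-head={head}")
    return feats

sent = ("[ARG0 Mr. Bush] has [rel called] "
        "[ARG1-for for an agreement by next September at the latest]")
feats = propbank_features(sent)
```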
Frameset Tagging Accuracy of system on frameset-tagging task for verbs with more than one frameset, using different types of local features. (pb=PropBank role features.) *The most frequent frameset gives a baseline accuracy of 76.0%.
WordNet Sense-tagging Accuracy of system on WordNet sense-tagging for instances in both Senseval-2 and PropBank, using different types of local features. *PropBank ARGM features are included.
Frameset tags for WordNet sense-tagging Accuracy of system on WordNet sense-tagging of 20 Senseval-2 verbs with more than one frameset. orig=original local features.
Conclusion • Disambiguation for verbs can be improved through more accurate extraction of features representing information such as that contained in the framesets and predicate argument structures annotated in PropBank.