Computers and Humor by Don L. F. Nilsen and Alleen Pace Nilsen
IT Support
Bill Gates
Computer Generated Humor: Apple’s Joke Teller • Given the command, “Computer, tell me a joke,” this is one response: • COMPUTER: Knock, knock. • YOU: Who’s there? • COMPUTER: Thistle. • YOU: Thistle who? • COMPUTER: Thistle be my last knock-knock joke. (Hempelmann 333)
Natural Language Processing: Suspension of Disbelief • General Principle: “If your system can’t do natural language, force the user to use your version of an artificial language and make it feel like natural language as much as necessary” (Hempelmann 335).
Computers with a Sense of Humor • Kim Binsted says that humor can help “make clarification queries less repetitive, statements of ignorance more acceptable, and error messages less patronizing” (Hempelmann 336). • John Morkes et al. demonstrate that computer systems that employ humor are viewed as “more likable and competent” (Morkes 215).
FACS: Facial Action Coding System • “Based on an anatomical analysis of facial action, FACS describes facial expressions and movements and in a second step relates them to emotions.” • FACS distinguishes between different types of smiles and laughs by using such parameters as frequency, intensity, duration, and symmetry. • Paul Ekman and Wallace Friesen are using the FACS to build gestural facial and bodily expressions into computer programs. • FACS has also been used by the movie industry in such films as Shrek and Toy Story. (Hempelmann 337)
JAPE: Joke Analysis and Production Engine • Kim Binsted and Graeme Ritchie are using the JAPE system to generate humor. • However, “JAPE’s joke analysis and production engine is merely a punning riddle generator. It is not ‘generative’ in Noam Chomsky’s sense of the word.” (Hempelmann 337)
A JAPE Joke • JAPE would use information like the following to produce this joke: • (i) “cereal” IS-A “breakfast food” • (ii) “murderer” IS-A “killer” • (iii) “cereal” SOUNDS-LIKE “serial” • (iv) “serial killer” is a meaningful phrase • Q: What do you get when you cross a breakfast food with a murderer? • A: A cereal killer. (Hempelmann 338)
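A minimal sketch (not the actual JAPE system, which draws on much larger lexical resources) of how the four facts above can be combined mechanically into the riddle; the tiny IS-A and SOUNDS-LIKE tables are hand-made for illustration.

```python
# A minimal sketch, not the actual JAPE system: it combines the hand-made
# IS-A and SOUNDS-LIKE facts listed above into a punning riddle.

IS_A = {
    "cereal": "breakfast food",
    "murderer": "killer",
}

SOUNDS_LIKE = {
    "cereal": "serial",
}

MEANINGFUL_PHRASES = {"serial killer"}

def make_riddle(word_a, word_b):
    """Build a 'What do you get when you cross X with Y?' riddle, provided
    word_a's near-homophone forms a known phrase with word_b's category."""
    homophone = SOUNDS_LIKE.get(word_a)
    if homophone is None or f"{homophone} {IS_A[word_b]}" not in MEANINGFUL_PHRASES:
        return None
    question = f"What do you get when you cross a {IS_A[word_a]} with a {word_b}?"
    answer = f"A {word_a} {IS_A[word_b]}."
    return question, answer

print(make_riddle("cereal", "murderer"))
# ('What do you get when you cross a breakfast food with a murderer?',
#  'A cereal killer.')
```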
STANDUP: Interactive Riddle Builder • STANDUP has a larger resource size than JAPE. • STANDUP is designed to help children with language problems stay on task. • Children use the STANDUP program to produce riddles, and the humor in the program keeps the children interested and active. • But STANDUP has basically the same level of computer sophistication as does JAPE. (Hempelmann 340)
How to Make a Computer Laugh: Computer Recognition of One-Liners • Rada Mihalcea, Stephen Pulman, and Carlo Strapparava are looking for correspondences between the surface structure and the text meanings to see which ones correlate with humorous and non-humorous texts. (Hempelmann 340)
Humorous Signals: Human-Centeredness & Polarity Orientation • The expressions that correlate with humor can be categorized as: • Human-Centric Vocabulary (pronouns…) • Negative Evaluations (“wrong,” “error”…) • Professional Communities (“lawyers,” “programmers”…) • Negative Traits (“ignorance,” “lying”…) (Hempelmann 340)
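A minimal sketch of how such surface cues might be counted as classifier features; the word lists below are illustrative stand-ins, not the actual resources used in that work.

```python
# A minimal sketch, not Mihalcea et al.'s actual classifier: it only counts
# how many surface cues from the four categories above appear in a text.

HUMOR_CUES = {
    "human-centric": {"i", "you", "my", "me", "he", "she"},
    "negative evaluation": {"wrong", "error", "bad", "fail"},
    "professional community": {"lawyer", "lawyers", "programmer", "programmers"},
    "negative trait": {"ignorance", "lying", "stupidity"},
}

def humor_cue_counts(text: str) -> dict:
    """Count candidate humor signals per category in a lower-cased token list."""
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return {category: sum(tok in words for tok in tokens)
            for category, words in HUMOR_CUES.items()}

print(humor_cue_counts("I told my lawyer it was an error, but he kept lying."))
```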
Fuzzy Logic • Hans Wim Tinholt and Anton Nijholt are working with “fuzzy logic” and “anaphoric ambiguity” to investigate sentences like, “The cops arrested the demonstrators because they were violent.” • Identifying the ambiguity is relatively easy, but deciding which ambiguity is humorous is much more difficult. (Hempelmann 341)
Eigentaste and JESTER • Eigentaste is a “constant time collaborative filtering algorithm.” • Dhruv Gupta, Mark Digiovanni, Hiro Narita, and Ken Goldberg are adapting Eigentaste into JESTER, a system that can actually evaluate the jokes in a large database. (Hempelmann 341)
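Eigentaste itself clusters users by principal-component analysis of ratings on a fixed “gauge set” of jokes; the sketch below shows only the general collaborative-filtering idea behind joke recommendation, predicting one user’s rating from like-minded users, over an invented toy rating matrix.

```python
# A sketch of collaborative joke filtering over an invented toy rating matrix.
# It is NOT the actual Eigentaste algorithm, which uses PCA-based clustering
# of ratings on a fixed "gauge set" of jokes.

import math

# Ratings on a -10..10 scale; None means the user has not rated that joke.
ratings = {
    "ann":  [8, -2, 5, None],
    "bob":  [7, -1, 4, 9],
    "carl": [-5, 6, -4, -8],
}

def cosine(u, v):
    """Cosine similarity over the jokes both users have rated."""
    pairs = [(a, b) for a, b in zip(u, v) if a is not None and b is not None]
    if not pairs:
        return 0.0
    dot = sum(a * b for a, b in pairs)
    norm_u = math.sqrt(sum(a * a for a, _ in pairs))
    norm_v = math.sqrt(sum(b * b for _, b in pairs))
    return dot / (norm_u * norm_v)

def predict(user, joke):
    """Similarity-weighted average of the other users' ratings for that joke."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or r[joke] is None:
            continue
        sim = cosine(ratings[user], ratings[other])
        num += sim * r[joke]
        den += abs(sim)
    return num / den if den else 0.0

# High prediction: ann's taste tracks bob's and is opposite to carl's.
print(round(predict("ann", 3), 1))
```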
GTVH: General Theory of Verbal Humor, and LIBJOG: Lightbulb-Joke Generator • Victor Raskin and Salvatore Attardo are using a modification of GTVH called LIBJOG to produce light-bulb jokes. The authors are aware that their humor generator has “zero intelligence.” • “In fact, the main thrust of LIBJOG was to expose the inadequacy of such systems (as JAPE) and to emphasize the need to integrate fully formalized large-scale knowledge resources in a scalable model of computational humor.” (Hempelmann 338)
SSTH: Semantic Script Theory of Humor, and the HAHAcronym Generator • The HAHAcronym Generator is loosely based on Raskin and Attardo’s SSTH. • “Using WordNet Domains, like Medicine or Linguistics, antonymy relations between the domains, like Religion vs. Technology, as well as several other supporting resources, they create funny interpretations for acronyms.” • MIT becomes “Mythical Institute of Theology.” (Hempelmann 339)
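A toy sketch of the substitution idea described above: each content word is swapped for an opposed-domain word that keeps the same initial letter, so the acronym still spells out. The domain word list is invented and is not the system’s WordNet-based resource.

```python
# A toy sketch of the acronym-reinterpretation idea, not the real HAHAcronym
# system: swap content words for opposed-domain words that keep the same
# initial letter, so the acronym (here M-I-T) still spells out.

OPPOSED_DOMAIN = {
    # technology-flavoured word -> religion/myth-flavoured replacement (invented)
    "massachusetts": "Mythical",
    "technology": "Theology",
}

def reinterpret(expansion: str) -> str:
    """Replace words with opposed-domain counterparts sharing the same initial."""
    out = []
    for word in expansion.split():
        candidate = OPPOSED_DOMAIN.get(word.lower(), word)
        out.append(candidate if candidate[0].upper() == word[0].upper() else word)
    return " ".join(out)

print(reinterpret("Massachusetts Institute of Technology"))
# Mythical Institute of Theology
```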
SSTH: Semantic Script Theory of Humor • SSTH shows script overlap and script oppositeness. • “But when the theory is quoted, exclusive attention is usually paid to script opposition, while overlap is, at the most, quietly understood to be involved.” (Hempelmann 342)
SSTH and Ontological Semantics • For the Semantic Script Theory of Humor to be really effective, it must include ontological semantics. • But ontological semantics needs to systematically deal with the information found in dictionaries, encyclopedias, thesauruses, and many other types of reference books. (Hempelmann 347)
Using Ontological Semantics to Generate a Joke • In his “Computational Humor: Beyond the Pun?” Christian Hempelmann gives seven pages of rigorous and systematic details to generate the following joke: • Q: What did the egg say in the monastery? • A: Out of the frying pan, into the friar.
Joke vs. Wordplay • For people who fail to see the overlap in a joke, it isn’t a joke at all. It is merely word play. • “Given that humans are desperately good disambiguators with vast semantic networks available to them, as well as excellent pragmatic interpreters, we seek any kind of semantic overlap to be able to handle the phonological (quasi-)ambiguity as humor, even if mere wordplay was intended.” (Hempelmann 346)
Klangspiel: Play with Sounds vs. Sinnspiel: Play with Meanings • “What adds to the confusion is that non-humorous wordplay, like rhyming, can be enjoyed aesthetically, and this enjoyment can be confused with the enjoyment derived from humor.” • “The belief on the part of a joker that he or she can get away with pure ‘Klangspiel’ is what earns bad puns (i.e. groaners) a pariah status in the family of jokes.” (Hempelmann 346)
Ynperfect Pun Selector • In an article entitled “Ynperfect Pun Selector for Computational Humor,” Christian Hempelmann gives the following joke: • A. Knock knock. B. Who’s there? • A. Cantaloupe. B. Cantaloupe who? • A. Can’t elope tonight—Dad’s got the car. • Hempelmann also considers bilingual punning, as in “Those who jump off a Paris bridge are in Seine” (Hempelmann 342-343).
Willing Suspension of Disbelief in a Joke • Samuel Taylor Coleridge said that the two key elements of poetry are “a human interest and a semblance of truth sufficient to procure for these shadows of imagination that willing suspension of disbelief for the moment, which constitutes poetic faith.” • Hempelmann considers a joke, as an aesthetic text, to be a specific type of poetry. But the joke also requires opposition and incongruity. • Willing suspension of disbelief is required “to reconcile this incongruity and at least playfully, make it spuriously appropriate.” • Note that this same willing suspension of disbelief is required in religion and in magic (Hempelmann 344-345).
BOTTOM-UP AND TOP-DOWN PROCESSING • Bottom-up processing relates to decoding. You start with the actual sounds, letters, morphemes, etc. and figure out the words, phrases, clauses, sentences, paragraphs, etc. • Top-down processing is based on reasoning. You make a generalization and see how well the sounds, letters, morphemes, etc. support your generalization. (Fromkin Rodman Hyams 369)
Top-down reasoning is powerful, but it can be dangerous if it is not accompanied by bottom-up reasoning. • For example, Otto Jespersen assumed that men were better thinkers than women. • He conducted an experiment in which men and women read a story and were given a quiz.
The women responded more quickly and more accurately than the men, which was not what Jespersen had expected. • So he concluded that women’s minds have “vacant chambers” that men’s minds don’t have. • This allowed Jespersen to account for his evidence while at the same time not disproving his original hypothesis that men were better thinkers than women.
Boolean Algebra • Christie Davies says Boolean algebra “enables users to hide problems and assumptions behind algebraic symbols. • You cannot easily turn words into numbers. • Those who try to do so usually do not understand either.” (Davies 2008: 178)
COMPUTER WORDS AND METAPHORS • COMPUTER WORDS: bits, bytes, code police, cyberspace, future shock, hackers, hard copy, menu, third wave, user-friendly • COMPUTER METAPHORS: 42, bug, cookies, GIGO, great runes, heavy wizardry, Lotus Software, Melvyl (California Library System), a sagan, snail mail, Sorcerer’s Apprentice mode, Trojan Horse, USLT, Vulcan nerve pinch, web, YABA compatible
CONCORDANCE • A concordance checks for word frequency and word associations. • Word associations are determined by providing a window of three words (more or less) on each side of the targeted word. (Fromkin Rodman Hyams 379)
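A minimal concordance sketch along those lines: it reports the frequency of a target word and the words that co-occur within a three-word window on either side; the sample sentence is invented for illustration.

```python
# A minimal concordance sketch: word frequencies plus co-occurrence within a
# window of three words on either side of a target word.

from collections import Counter

def concordance(text: str, target: str, window: int = 3):
    """Return (frequency of target, Counter of words seen near the target)."""
    tokens = text.lower().split()
    freq = Counter(tokens)
    neighbours = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            left = tokens[max(0, i - window):i]
            right = tokens[i + 1:i + 1 + window]
            neighbours.update(left + right)
    return freq[target], neighbours

count, assoc = concordance(
    "the customer orders food and the server brings food to the customer",
    "food")
print(count)                      # 2 occurrences of "food"
print(assoc.most_common(3))       # its most frequent window-mates
```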
ELIZA • ELIZA is a program designed to pass the Turing test. If a person communicating with ELIZA cannot tell whether or not ELIZA is a human, then ELIZA passes the Turing test. • ELIZA claims to be a psychiatrist talking to a patient. ELIZA, like a psychiatrist, is attempting to get rather than give information:
PATIENT: Men are all alike. • ELIZA: In what way? • PATIENT: They’re always bugging me about something or other. • ELIZA: Can you think of a specific example? • PATIENT: Well, my boyfriend made me come here. • ELIZA: Your boyfriend made you come here? • PATIENT: He says I’m depressed much of the time. • ELIZA: I am sorry to hear that you are depressed. (Fromkin Rodman Hyams 389)
Victor Raskin fooled ELIZA by saying, “I don’t like my mother and I don’t want to talk about her.” • ELIZA responded, “Tell me more about your mother.” (Nilsen Nilsen & Combs 199)
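ELIZA works by keyword-and-pattern matching with canned response templates. The sketch below uses a tiny invented subset of rules (not Weizenbaum’s actual script) and reproduces, in its last exchange, the keyword blindness that Raskin exploited.

```python
# A minimal ELIZA-style sketch: keyword patterns plus canned templates.
# The rules below are a tiny invented subset, not Weizenbaum's actual script.

import re

RULES = [
    (r"\bi'?m (\w+)", "I am sorry to hear that you are {0}."),
    (r"\bmy (\w+)", "Tell me more about your {0}."),
    (r"\balways\b", "Can you think of a specific example?"),
    (r"\ball alike\b", "In what way?"),
]

def eliza(utterance: str) -> str:
    """Return the response of the first matching rule, else a stock prompt."""
    text = utterance.lower()
    for pattern, template in RULES:
        match = re.search(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Please go on."

print(eliza("Men are all alike."))                       # In what way?
print(eliza("They're always bugging me about something or other."))
print(eliza("He says I'm depressed much of the time."))  # ... you are depressed.
print(eliza("I don't like my mother and I don't want to talk about her."))
# "Tell me more about your mother." -- the keyword blindness Raskin exploited
```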
CIRCUIT FIX-IT SHOP • Twenty-five years after ELIZA, a program named “Circuit Fix-It Shop” was developed: • COMPUTER: This is the circuit fix-it shop. How may I help you? • TECHNICIAN: I need to fix the RS111 circuit. • COMPUTER: I am familiar with that circuit. The LED is supposed to be displaying alternately blinking one and seven. • TECHNICIAN: Where is the LED?
COMPUTER: In the middle right of the blue region is the LED. • TECHNICIAN: No light. • COMPUTER: What is the switch at when the LED is off? • TECHNICIAN: The switch is down. • COMPUTER: You put up the switch. (Fromkin Rodman Hyams 390)
MORE SOPHISTICATED PROGRAMS • Today more sophisticated programs are needed. One such program is the animated paperclip assistant (“Clippy”) that answers questions in Microsoft Word. • Another sophisticated program is the “Script Model Grammar” designed by Roger Schank and Robert Abelson and modified by linguist Victor Raskin and others at Purdue University and elsewhere.
SAM: SCRIPT APPLIER MECHANISM • Of course sentences need to be parsed in Artificial Intelligence. But constituents larger than a sentence must be parsed as well. • One of the devices for doing this discourse parsing is the “Script Applier Mechanism.”
Note that a play or a movie has a script for the actors to follow. • A script in Artificial Intelligence is similar, but much simpler: it is a “mundane script.” • The “Restaurant Script,” for example, involves a customer, a server, a cashier, etc.
Props in the “Restaurant Script” include the restaurant, the table, the menu, the food, the check, the payment, the tip, etc. The sequence of actions is as follows:
1. Customer goes to restaurant.
2. Customer goes to table.
3. Server brings menu.
4. Customer orders food.
5. Server brings food.
6. Customer eats food.
7. Server brings check.
8. Customer leaves tip for server.
9. Customer gives payment to cashier.
10. Customer leaves restaurant.
(Hendrix and Sacerdoti 654) (Nilsen Nilsen & Combs 199)
There are two exciting things about the Script Applier Mechanism. First, it is able to spot anything that is missing, added, or out of place in the sequence of events and ask, “What’s up?” • Second, it is able to handle two scripts at the same time, so that it is capable of dealing with jokes, language play, satire, irony, sarcasm, parody, paradox, and double entendre in general.
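A minimal sketch, assumed rather than taken from the actual SAM implementation, of the first of these abilities: comparing a reported sequence of events against the stored Restaurant Script and flagging whatever is missing, unexpected, or out of order.

```python
# A minimal sketch (assumed, not the actual SAM code) of applying the
# Restaurant Script to a reported story and flagging deviations.

RESTAURANT_SCRIPT = [
    "customer goes to restaurant",
    "customer goes to table",
    "server brings menu",
    "customer orders food",
    "server brings food",
    "customer eats food",
    "server brings check",
    "customer leaves tip for server",
    "customer gives payment to cashier",
    "customer leaves restaurant",
]

def apply_script(script, reported_events):
    """Return events that are missing or unexpected, and whether order holds."""
    missing = [e for e in script if e not in reported_events]
    unexpected = [e for e in reported_events if e not in script]
    known = [e for e in reported_events if e in script]
    in_order = known == sorted(known, key=script.index)
    return {"missing": missing, "unexpected": unexpected, "in order": in_order}

story = [
    "customer goes to restaurant",
    "customer orders food",
    "server brings menu",          # out of the expected order
    "customer eats food",
    "customer leaves restaurant",  # never paid: the mechanism would ask "What's up?"
]
print(apply_script(RESTAURANT_SCRIPT, story))
```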
PARSING PROBLEMS • GARDEN PATH: • The horse raced past the barn fell. • After the child visited the doctor prescribed a course of injections. • The doctor said the patient will die yesterday. • EMBEDDING: “Never imagine yourself not to be otherwise than what it might appear to others…to be otherwise.” • (Lewis Carroll’s Alice’s Adventures in Wonderland) (Fromkin Rodman Hyams 365, 373)
RIGHT-BRANCHING VS. EMBEDDING • RIGHT BRANCHING: This is the dog that worried the cat that killed the rat that ate the malt that lay in the house that Jack built. • EMBEDDING: Jack built the house that the malt that the rat that the cat that the dog worried killed ate lay in. • NOTE: Multiple embedding is OK for a computer, but not OK for the human brain. (Fromkin Rodman Hyams 373-374)
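A short sketch of why both orders are equally mechanical for a program: the same subject-verb-object relations from the example above are strung together either right-branching or centre-embedded; only the second form defeats human parsing.

```python
# Build the right-branching and centre-embedded versions of the same content.
# Both are trivial for a program; only the centre-embedded one overloads
# human working memory.

relations = [("the dog", "worried", "the cat"),
             ("the cat", "killed", "the rat"),
             ("the rat", "ate", "the malt"),
             ("the malt", "lay in", "the house")]

def right_branching(relations):
    """the dog that worried the cat that killed the rat ..."""
    sentence = relations[0][0]
    for _, verb, obj in relations:
        sentence += f" that {verb} {obj}"
    return sentence

def centre_embedded(relations):
    """the house that the malt that the rat ... worried killed ate lay in"""
    nouns = [obj for _, _, obj in reversed(relations)] + [relations[0][0]]
    verbs = [verb for _, verb, _ in relations]   # innermost clause's verb first
    return " that ".join(nouns) + " " + " ".join(verbs)

print(right_branching(relations))
print(centre_embedded(relations))
```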
ANOMALOUS WORDS: A sniggle blick is procking a slar. • METANALYSIS (incorrect phrase breaking): • grade A vs. grey day • night rate vs. nitrate (Fromkin Rodman Hyams 368, 370) • NOTE: English “adder” and “apron” also arose through metanalysis: the earlier forms “a naddre” and “a napron” (the latter from French “naperon”) were rebracketed as “an adder” and “an apron.”
AMBIGUOUS SYNTAX IN NEWSPAPER HEADLINES: • Teacher Strikes Idle Kids • Enraged Cow Injures Farmer with Ax • Killer Sentenced to Die for Second Time in 10 Years • Stolen Painting Found by Tree (Fromkin Rodman Hyams 372)
REAL-WORLD KNOWLEDGE • Explain why the following sentences are ambiguous to a computer but not to a human: • A cheesecake was on the table. It was delicious and was soon eaten. • SIGN IN A CHURCH: For those of you who have children and don’t know it, we have a nursery downstairs. • NEWSPAPER AD: Our bikinis are exciting; they are simply the tops. (Fromkin Rodman Hyams 403)
ANTISMOKING CAMPAIGN SLOGAN: It’s time we make smoking history. • Do you know the time? • Concerned with spreading violence, the president called a press conference. • The ladies of the church have cast off clothing of every kind and they may be seen in the church basement Friday. (Fromkin Rodman Hyams 403)
AMBIGUOUS NEWSPAPER HEADLINES • Red Tape Holds Up New Bridge • Kids Make Nutritious Snacks • Sex Education Delayed, Teachers Request Training (Fromkin Rodman Hyams 403)
SEMANTIC PRIMING • In the human brain, the word “doctor” is more easily and more completely processed if it is preceded by “nurse” than if it is preceded by “flower.” • This is because “doctor” and “nurse” “are located in the same part of the mental lexicon.” (Fromkin Rodman Hyams 371) • This same feature could easily be built into Artificial Intelligence.
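A minimal sketch of how such priming could be built in, using a toy semantic network in which shared neighbours between prime and target speed up recognition; the lexicon and the timing numbers are invented for illustration.

```python
# A minimal priming sketch over a toy semantic network: words sharing a
# neighbourhood get an activation boost, so "doctor" is recognised faster
# after "nurse" than after "flower". Lexicon and numbers are invented.

NEIGHBOURS = {
    "doctor": {"nurse", "hospital", "patient"},
    "nurse": {"doctor", "hospital", "patient"},
    "flower": {"garden", "petal", "rose"},
}

BASE_TIME = 100  # arbitrary recognition-time units

def recognition_time(target: str, prime: str) -> int:
    """Shared neighbours between prime and target speed up recognition."""
    shared = NEIGHBOURS.get(target, set()) & (NEIGHBOURS.get(prime, set()) | {prime})
    return BASE_TIME - 20 * len(shared)

print(recognition_time("doctor", "nurse"))   # faster: related prime
print(recognition_time("doctor", "flower"))  # slower: unrelated prime
```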