

  1. artificial intelligence fdm 20c introduction to digital media lecture 26.01.2005 warren sack / film & digital media department / university of california, santa cruz

  2. last time • social networks as science • social networks as technology • social networks as popular culture • social networks as art • mini-project 2: add yourself to friendster

  3. outline for today (1 of 2) • artificial intelligence: the founding document • who was turing? what is he famous for? • a reading of turing’s article “computing machinery and intelligence” in which the following is highlighted: • gender: the role of the woman in the “imitation game” • the aesthetics of the game: the aesthetics of the uncanny • turing’s prescient insights on gender and the body, which turn out -- now -- to be most useful for trying to understand online role-playing games and also some of the central weaknesses of decades of ai research (especially oversights about the role of the body in models of thinking)

  4. outline (2 of 2) • a short history of artificial intelligence in software • planning as a technical problem • GPS as a “solution”: The General Problem Solver by Herbert Simon, Allen Newell, and Cliff Shaw • demo of GPS • story generation as a planning problem • TALESPIN as a “solution” • demo of micro-talespin • story understanding as a plan recognition problem • FRUMP as a “solution” • question answering as a problem • ELIZA as a “solution” • demo of ELIZA

  5. alan turing • Founder of computer science and of artificial intelligence; mathematician, philosopher, codebreaker, and a gay man • see http://www.turing.org.uk/turing/

  6. alan turing (1912-1936) • 1912 (23 June): Birth, Paddington, London • 1926-31: Sherborne School • 1930: Death of friend Christopher Morcom • 1931-34: Undergraduate at King's College, Cambridge University • 1932-35: Quantum mechanics, probability, logic • 1935: Elected fellow of King's College, Cambridge • 1936: The Turing machine, computability, universal machine

  7. alan turing (1936-1946) • 1936-38: Princeton University. Ph.D. Logic, algebra, number theory • 1938-39: Return to Cambridge. Introduced to German Enigma cipher machine • 1939-40: The Bombe, machine for Enigma decryption • 1939-42: Breaking of U-boat Enigma, saving battle of the Atlantic • 1943-45: Chief Anglo-American crypto consultant. Electronic work. • 1945: National Physical Laboratory, London • 1946: Computer and software design leading the world.

  8. alan turing (1947-1954) • 1947-48: Programming, neural nets, and artificial intelligence • 1948: Manchester University • 1949: First serious mathematical use of a computer • 1950: The Turing Test for machine intelligence • 1951: Elected FRS. Non-linear theory of biological growth • 1952: Arrested as a homosexual, loss of security clearance • 1953-54: Unfinished work in biology and physics • 1954 (7 June): Death (suicide) by cyanide poisoning, Wilmslow, Cheshire.

  9. turing’s “imitation game” (1 of 3) • “The new form of the problem can be described in terms of a game which we call the ‘imitation game.’ It is played with three people, a man, a woman, and an interrogator who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman.”

  10. turing’s “imitation game” (2 of 3) • “It is [the man's] object in the game to try and cause [the interrogator] to make the wrong identification.” • “The object of the game for [the woman] is to help the interrogator.”

  11. turing’s “imitation game” (3 of 3) • “We now ask the question, ‘What will happen when a machine takes the part of [the man] in this game?’ Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original [question], ‘Can machines think?’” (Turing, 1950, pp. 433-434)
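To make the setup concrete, here is a minimal sketch of the game's structure in Python. It is not from Turing's paper: the objects interrogator, player_a, and player_b and their methods ask, answer, and identify are invented for illustration.

```python
# Illustrative sketch of the imitation game as a text-only question/answer loop.
# Player A tries to mislead the interrogator; player B tries to help.

def imitation_game(interrogator, player_a, player_b, rounds=5):
    """Return the interrogator's final guess about which hidden player is which."""
    transcript = []
    for _ in range(rounds):
        question = interrogator.ask(transcript)
        transcript.append(("Q", question))
        # Answers are exchanged as text alone, so voice and appearance give nothing away.
        transcript.append(("A", player_a.answer(question)))
        transcript.append(("B", player_b.answer(question)))
    return interrogator.identify(transcript)

# Turing's substitution: put a machine in player A's place and ask whether the
# interrogator now guesses wrong more or less often than before.
```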

  12. walker/sack/walker “online caroline” • walker: “My hair is still wet from the shower when I connect my computer to the network, sipping my morning coffee. I check my email and find it there in between other messages: an email from Caroline.” • sack (citing turing): “[The interrogator asks]: Will [you] please tell me the length of [your] hair?” • walker: “The first lines in my essay on Online Caroline really are striking in their insistence on a feminine imagery, ...”

  13. walker/sack/walker “online caroline” • walker: “The first lines in my essay on Online Caroline really are striking in their insistence on a feminine imagery, ... and especially since the images I used (of wet hair and a shower) are so typical of the male objectifying gaze Sack refers to: imagine shampoo ads with half-naked women or the shower scene in Psycho. Why on earth did I choose such a way to ground my reading of Online Caroline?”

  14. walker/sack/walker “online caroline” • what is this virtual body evoked by turing and walker and “online caroline”? • do you have a gender when you are online?

  15. artificial intelligence: a definition “... artificial intelligence [AI] is the science of making machines do things that would require intelligence if done by [humans]” Marvin Minsky, 1963

  16. artificial intelligence: research areas • Knowledge Representation • Programming Languages • Natural Language (e.g., Story) Understanding • Speech Understanding • Vision • Robotics • Machine Learning • Expert Systems • Qualitative Simulation • Planning

  17. planning as a technical problem • GPS is what is known in AI as a “planner.” • Newell, Allen, Shaw, J. C., and Simon, Herbert A. “GPS, A Program That Simulates Human Thought.” In Computers and Thought, ed. Edward A. Feigenbaum and Julian Feldman, pp. 279-293. New York, 1963 • To work, GPS required that a full and accurate model of the “state of the world” (i.e., insofar as one can even talk of a “world” of logic or cryptarithmetic, two of the domains in which GPS solved problems) be encoded and then updated after any action was taken (e.g., after a step was added to the proof of a theorem). • demo: implementation from Peter Norvig’s Paradigms of Artificial Intelligence Programming (see www.norvig.com)
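The means-ends analysis idea behind GPS can be sketched briefly. The following is a toy Python rendering of that idea (Norvig's demo implementation is in Common Lisp, and is far more complete); the operators and facts such as "drive to shop" are invented for illustration.

```python
# Toy means-ends analysis in the style of GPS: the "world" is a set of facts,
# and each operator lists the facts it needs, adds, and deletes.

class Op:
    def __init__(self, action, preconds, adds, deletes):
        self.action, self.preconds = action, set(preconds)
        self.adds, self.deletes = set(adds), set(deletes)

def achieve(state, goal, ops, trace):
    """Return a new state in which goal holds, recording actions in trace."""
    if goal in state:
        return state
    for op in ops:
        if goal in op.adds:
            # Recursively satisfy the operator's preconditions first.
            new_state = state
            for p in op.preconds:
                new_state = achieve(new_state, p, ops, trace)
                if new_state is None:
                    break
            else:
                trace.append(op.action)
                return (new_state - op.deletes) | op.adds
    return None

ops = [
    Op("drive to shop", {"car works"}, {"at shop"}, {"at home"}),
    Op("buy groceries", {"at shop", "have money"}, {"have groceries"}, {"have money"}),
]
plan = []
achieve({"at home", "car works", "have money"}, "have groceries", ops, plan)
print(plan)   # ['drive to shop', 'buy groceries']
```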

  18. a problem with ai planning • the “frame problem”: This assumption – that perception was always accurate and that all of the significant details of the world could be modeled and followed – was incorporated into most AI programs for decades and resulted in what became known to the AI community as the “frame problem”; i.e., the problem of deciding what parts of the internal model to update when a change is made to the model or the external world. • Cf. Martins, J. “Belief Revision.” In Encyclopedia of Artificial Intelligence, Second Edition. Stuart C. Shapiro (editor-in-chief), pp. 110-116. New York, 1992
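A toy illustration of why this is a problem (the facts below are invented): a GPS-style planner updates only the facts an operator explicitly mentions and assumes everything else persists, so any change in the external world that no operator accounts for silently falls outside the model.

```python
# Illustrative only: applying an operator changes exactly the facts on its add
# and delete lists; every other fact is assumed to stay the same.

model = {"robot at A", "door open", "battery charged"}

def apply_op(state, adds, deletes):
    # Nothing outside adds/deletes is touched -- the "frame assumption".
    return (state - set(deletes)) | set(adds)

model = apply_op(model, adds={"robot at B"}, deletes={"robot at A"})

# Suppose a draft blows the door shut in the external world. No operator was
# applied, so the internal model still says the door is open:
print("door open" in model)   # True -- model and world have diverged
```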

  19. story generation as planning • James Meehan, "The Metanovel: Writing Stories by Computer", Ph.D. diss., Yale University, 1976. • demo: micro-talespin • see http://web.media.mit.edu/~wsack/micro-talespin.txt
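The following toy sketch (in Python, not Meehan's program, and far simpler than micro-talespin) shows the basic move: give a character a goal, expand the goal into plan steps, and narrate each step as a story event. The characters echo Meehan's well-known examples, but the plan library here is invented.

```python
# A toy rendering of the TALESPIN idea: stories come from narrating the plans
# that simulated characters form to satisfy their goals.

plans = {
    "quench his thirst": ["walk to the river", "drink from the river"],
    "eat some honey":    ["ask Irving Bird where the honey is",
                          "walk to the oak tree", "eat the honey"],
}

def tell_story(character, goal):
    story = [f"One day {character} wanted to {goal}."]
    for step in plans[goal]:
        story.append(f"So {character} decided to {step}, and did.")
    story.append(f"{character} was satisfied. The end.")
    return " ".join(story)

print(tell_story("Joe Bear", "eat some honey"))
```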

  20. problems with story generation: missing common sense • Examples of Talespin’s missing common sense (from Meehan, 1976) • Answers to questions can take more than one form. • Don’t always take answers literally. • You can notice things without being told about them. • Gravity is not a living creature. • Stories aren’t really stories if they don’t have a central problem. • Sometimes enough is enough. • Schizophrenia can be dysfunctional.

  21. story understanding as a plan recognition problem G. DeJong (1979) FRUMP: Fast Reading Understanding and Memory Program $demonstration script • The demonstrators arrive at the demonstration location. • The demonstrators march. • Police arrive on the scene. • The demonstrators communicate with the target of the demonstration. • The demonstrators attack the target of the demonstration. • The demonstrators attack the police. (From DeJong, 1979; pp. 19-20)

  22. story understanding as plan recognition • demo: micro-sam • Richard Cullingford, “Script application: computer understanding of newspaper stories,” Ph.D. diss., Yale University, 1977.
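A toy sketch of the script idea shared by FRUMP and SAM (this is neither program): a script is an ordered list of expected events, and understanding a story amounts to matching its sentences against those events and recording which ones were instantiated. The event names and keywords below are invented for the sketch.

```python
# Illustrative script application: which expected events of a demonstration
# does this news story mention?

demonstration_script = [
    ("arrive",      {"arrived", "gathered"}),
    ("march",       {"marched", "paraded"}),
    ("police",      {"police"}),
    ("communicate", {"chanted", "demanded", "shouted"}),
    ("attack",      {"attacked", "threw", "clashed"}),
]

def apply_script(script, story_sentences):
    """Return the script events instantiated by the story, in script order."""
    understood = []
    for event, keywords in script:
        if any(any(k in s.lower() for k in keywords) for s in story_sentences):
            understood.append(event)
    return understood

story = ["Demonstrators gathered outside the embassy.",
         "They chanted slogans and threw rocks at police."]
print(apply_script(demonstration_script, story))
# ['arrive', 'police', 'communicate', 'attack']
```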

  23. question answering as a problem • ELIZA as a “solution” • J. Weizenbaum, “ELIZA -- A Computer Program for the Study of Natural Language Communication between Man and Machine,” Communications of the Association for Computing Machinery, vol. 9, no. 1 (January 1966), pp. 36-45. • demo: see www.norvig.com for source code
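ELIZA's technique is keyword pattern matching with canned response frames. A minimal sketch in Python follows (Weizenbaum's program had a much richer rule set; the rules and reflections below are invented for illustration).

```python
import re

# Each rule pairs a keyword pattern with a response frame that echoes part of
# the user's input back to them.
rules = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)",   "How long have you been {0}?"),
    (r"my (.*)",     "Tell me more about your {0}."),
    (r"(.*)",        "Please go on."),
]

reflections = {"my": "your", "i": "you", "me": "you", "am": "are", "your": "my"}

def reflect(text):
    # Swap first- and second-person words so the echoed fragment reads naturally.
    return " ".join(reflections.get(w, w) for w in text.split())

def eliza_respond(sentence):
    for pattern, template in rules:
        match = re.match(pattern, sentence.lower())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))

print(eliza_respond("I am worried about my exams"))
# -> How long have you been worried about your exams?
```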

  24. next time • human-computer interaction
