
Notes for CS3310 Artificial Intelligence Part 2: Representation of facts Prof. Neil C. Rowe Naval Postgraduate School



    1. Notes for CS3310 Artificial Intelligence Part 2: Representation of facts Prof. Neil C. Rowe Naval Postgraduate School Version of January 2006

    2. Things we need to represent in the computer to achieve intelligent behavior: facts (truths about the world), or an "ontology" (XML is driving much development of ontologies today); knowledge that is only partially certain; definitions of concepts; rules for concluding things; advice ("heuristics"); procedures for accomplishing things; and learning methods.

    3. Prolog fact representation vehicle(car). says "A car is a vehicle." component(engine,car). says "An engine is a component of a car." 1. Facts are represented as predicate expressions. 2. The word in front is the predicate name. 3. Words in parentheses are arguments. 4. Multiple arguments are separated by commas and enclosed in parentheses. 5. The expression ends with a period. 6. This looks like first-order predicate calculus.
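A minimal sketch of the two slide facts as a loadable Prolog knowledge base, with a sample top-level query (the `?-` lines are what you would type at the interpreter prompt, not part of the file):

```prolog
% The two facts from the slide, as a loadable knowledge base.
vehicle(car).
component(engine, car).

% Sample queries at the Prolog top level:
% ?- vehicle(car).
% true.
% ?- component(engine, car).
% true.
```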

    4. Type predicates in Prolog notation ship(enterprise) is a type-predicate expression. It says the Enterprise is a ship. These have one argument only. Other examples: ship(carrier). vehicle(ship). professor(n_c_rowe). car_manufacturer(toyota). nps_class(cs3310). declaration_of_independence_signed(july_4_1776). Note the Prolog notation conventions: (1) lower case in facts; (2) facts end with period; (3) multi-word concepts use the underscore character. Type-predicate facts make a hierarchy, the type hierarchy.

    5. Property predicates The expression color(enterprise,gray). says the Enterprise has a color property whose value is gray. (Prefer this to: gray(enterprise).) These “property predicates” have two arguments of different types. Other examples: location(enterprise,norfolk). date(inauguration,jan_17). job(tom,professor).
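A sketch of the slide's property facts as a knowledge base; because a property predicate's second argument is the value, a query with a variable there retrieves the value:

```prolog
% Property facts from the slide: object first, property value second.
color(enterprise, gray).
location(enterprise, norfolk).
date(inauguration, jan_17).

% ?- location(enterprise, Where).
% Where = norfolk.
```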

    6. Relationship predicates west_of(enterprise,kennedy). says the Enterprise is west of the Kennedy. Both arguments have similar types. To determine the meaning, insert the predicate name between the two arguments. Other examples: part_of(engine,car). bosses(superintendent,provost). front_of(radiator,engine). earns_more(plumber,professor). before(loading,takeoff). a_kind_of(enterprise,ship). (or ako(enterprise,ship).)
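Chains of a_kind_of facts form a type hierarchy, and a recursive rule can walk it. This is a sketch; the rule name is_a/2 is our own choice, not from the slides:

```prolog
% A small type hierarchy built from a_kind_of/2 facts.
a_kind_of(enterprise, aircraft_carrier).
a_kind_of(aircraft_carrier, ship).
a_kind_of(ship, vehicle).

% is_a(X, Y): X is a kind of Y, directly or through intermediate types.
is_a(X, Y) :- a_kind_of(X, Y).
is_a(X, Y) :- a_kind_of(X, Z), is_a(Z, Y).

% ?- is_a(enterprise, vehicle).
% true.
```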

    7. Zero-argument predicate expressions (flags) Zero-argument predicate expressions indicate boolean conditions. No parentheses should be used (unlike Java). Examples: male, engine_warmed_up, abnormal_loop_condition.

    8. Function predicates sum(2,3,5). gives the result of a computation. Last argument is always the result. These predicates can have any number of arguments except 0. Other examples: square(7,49). number_of_days(february,1992,29). longest_word(aa,aaaa,aa,a,aaa,aaaa). Usually function predicates are defined by procedures, not by facts.
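As the slide notes, function predicates are usually defined by procedures rather than enumerated as facts. A minimal sketch of such procedural definitions, using Prolog's arithmetic evaluation (`is/2`):

```prolog
% Function predicates defined by rules: the last argument is the result.
sum(X, Y, Z) :- Z is X + Y.
square(X, Y) :- Y is X * X.

% ?- sum(2, 3, Z).
% Z = 5.
% ?- square(7, Y).
% Y = 49.
```

Note that calling these with the result already filled in, as in sum(2,3,5), simply checks the computation and succeeds or fails.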

    9. Relational-database predicates ship(enterprise,us,14n,35e,01feb85). says that the Enterprise is a ship of nationality US, and it was located at 14N 35E on February 1, 1985. A "schema" is needed in general to specify what each argument means. These predicates can have any number of arguments. Other examples: ship(kennedy,us,10n,20e,01feb85). aircraft(5592348,f-18,us,chinalake,01feb85).
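A sketch of the ship facts as a small relational table. The schema comment is our assumption from the slide's reading of the arguments; the positions and dates must be quoted because Prolog atoms cannot start with a digit:

```prolog
% Schema (assumed): ship(Name, Nationality, Latitude, Longitude, Date).
ship(enterprise, us, '14n', '35e', '01feb85').
ship(kennedy,    us, '10n', '20e', '01feb85').

% Query: which US ships were reported on 1 February 1985?
% ?- ship(Name, us, _, _, '01feb85').
% Name = enterprise ;
% Name = kennedy.
```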

    10. Time and uncertainty in facts To represent facts true only for a period of time: Use two more arguments, the starting time and the ending time (which can be equal) of the period in which the fact is true. Example: location(enterprise,norfolk,12sep85,08oct85). To represent facts not completely certain: Use one more argument, the probability that the fact is true. 1.0 means completely certain to be true, 0.0 means completely certain to be false. Usually the probability is the last argument. Example: disease(joe_smith,appendicitis,0.7).
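A sketch of time-period facts, assuming numeric day-of-year values in place of the slide's symbolic dates so that the period can be tested arithmetically; the rule name at/3 is our own:

```prolog
% Fact true only for a period: extra start and end arguments
% (day numbers 255-281, our stand-in for 12sep85-08oct85).
location(enterprise, norfolk, 255, 281).

% at(Ship, Place, Day): the ship was at the place on that day.
at(Ship, Place, Day) :-
    location(Ship, Place, Start, End),
    Day >= Start,
    Day =< End.

% Uncertain fact: the last argument is the probability of truth.
disease(joe_smith, appendicitis, 0.7).

% ?- at(enterprise, norfolk, 260).
% true.
```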

    11. From English to Prolog predicate expressions Locate expressions that represent facts. Nouns and verbs often map to a-kind-of relationships. Adjectives and adverbs often map to properties. Prepositional, participial, and other kinds of phrases usually map to relationships but sometimes to properties. So "A Sidewinder is a heat-seeking air-to-air missile developed at NWC" is represented as: a_kind_of(sidewinder, missile). guidance(sidewinder, heat_seeking). deployment(sidewinder, air_to_air). developer(sidewinder, nwc).

    12. Different words can mean the same thing All of these should be represented as a_kind_of(ship,vehicle): "A ship is a vehicle." "The ship is one kind of vehicle." "Ships are vehicles." All of these should be represented as color(ship,gray): "A ship is gray." "The ship is always gray." "Ships are gray." "They color ships gray."

    13. Some subtleties of facts 1. Words can have different senses in English, so number them to be precise: a_kind_of(sidewinder-1,missile-1). a_kind_of(sidewinder-2,rattlesnake-1). The Wordnet system lists sense numbers for words.

    14. 2. For synonyms, use only the most common word in facts and map from the others. "Building" sense 1 means the same as "edifice" sense 1, so prefer the former. 3. Unnamed things are "anonymous constants": a_kind_of(v1,sidewinder-1). on(v1,v2). a_kind_of(v2,pedestal-1). That says "a sidewinder is on a stand", but "all sidewinders are on all stands" is: on(sidewinder-1,pedestal-1).

    15. Representing actions Actions (verbs like "launches" and nouns like "explosion") need: 1. The action's related concepts and how they relate: (a) the type of action (use the a_kind_of predicate); (b) who does the action (use agent); (c) what the action acts upon (use object); (d) tools of the action (use instrument_of, etc.) For instance, "The F-18 raked the target with fire": a_kind_of(v1,f_18). a_kind_of(v2,raking_action). agent(v2,v1). object(v2,v3). a_kind_of(v3,target). instrument_of(v2,v4). a_kind_of(v4,fire).
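The "F-18 raked the target" facts can be queried once loaded. This sketch adds a rule of our own (actor_of/2, not from the slides) that recovers which type of agent performed which type of action:

```prolog
% Case-role facts for "The F-18 raked the target with fire".
a_kind_of(v1, f_18).
a_kind_of(v2, raking_action).
agent(v2, v1).
object(v2, v3).
a_kind_of(v3, target).
instrument_of(v2, v4).
a_kind_of(v4, fire).

% actor_of(ActionType, AgentType): some action of the given type
% was performed by an agent of the given type.
actor_of(ActionType, AgentType) :-
    a_kind_of(Act, ActionType),
    agent(Act, Agent),
    a_kind_of(Agent, AgentType).

% ?- actor_of(raking_action, Who).
% Who = f_18.
```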

    16. 2. The action's preconditions and effects. This is important for hypothetical reasoning, discussed later. It's especially important for verbs like "paint" which focus on results. For instance, "John handed Tom the blueprints" includes: a_kind_of(v1,giving_action). a_kind_of(v2,blueprints). precondition(v1,has(john,v2)). precondition(v1,holding(john,v2)). postcondition(v1,has(tom,v2)). postcondition(v1,holding(tom,v2)).

    17. Fact-representation exercise #1: "Posttest view of zeppo rocket separation test.”

    18. Exercise #2: "Crewmen fighting fire with flight deck and aircraft damage.”

    19. Semantic networks A visual representation of a set of two-argument facts as a directed graph. Draw an arrow from the node for the first argument to the node for the second argument, and label it with the predicate name. Example: owns(usn,enterprise). color(enterprise,gray). a_kind_of(enterprise,aircraft_carrier). part_of(flight_deck,aircraft_carrier).
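The four slide facts can be enumerated uniformly as labeled arcs. This is a sketch: the edge/3 rule and its use of call/3 are our own device for listing every arc of the network, not part of the slides:

```prolog
:- use_module(library(lists)).

% The two-argument facts from the slide; each is one labeled arc.
owns(usn, enterprise).
color(enterprise, gray).
a_kind_of(enterprise, aircraft_carrier).
part_of(flight_deck, aircraft_carrier).

% edge(Label, From, To): enumerate the arcs of the semantic network.
edge(P, X, Y) :-
    member(P, [owns, color, a_kind_of, part_of]),
    call(P, X, Y).

% ?- edge(Label, From, To).
% enumerates all four arcs on backtracking.
```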

    20. Example semantic network

    21. Reductionism: A key issue = how much you simplify the world to model it in the computer. If you simplify too much, your model can't be trusted. Example: the ISAAC system modeling land combat: Positions are cells on a rectangular grid; agents represent soldiers. Agents have a set of objectives, e.g. obey orders, avoid getting hurt, occupy high ground. Each objective has an associated direction to move. Weights, which are functions of the circumstances, are assigned to the objectives. For an agent to decide what to do, it adds up the weighted directions and moves in the resulting direction. But does a weighted average really model how soldiers move in combat?
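A toy sketch (entirely our own construction, not ISAAC's actual code) of the weighted-direction idea: each objective contributes a direction vector and a weight, and the agent moves along the weighted sum.

```prolog
% objective(Name, Weight, (DX, DY)): hypothetical objectives with
% fixed weights; in ISAAC the weights depend on circumstances.
objective(obey_orders, 1.0, ( 1, 0)).
objective(avoid_harm,  0.5, (-1, 0)).
objective(high_ground, 0.8, ( 0, 1)).

% move_direction(DX, DY): the weighted sum of all objective directions.
move_direction(DX, DY) :-
    findall(W-(X, Y), objective(_, W, (X, Y)), Pairs),
    sum_weighted(Pairs, DX, DY).

sum_weighted([], 0, 0).
sum_weighted([W-(X, Y)|Rest], DX, DY) :-
    sum_weighted(Rest, DX0, DY0),
    DX is DX0 + W * X,
    DY is DY0 + W * Y.

% ?- move_direction(DX, DY).
% DX = 0.5, DY = 0.8.
```

The collapse of three qualitatively different goals into two numbers is exactly the reductionism the slide questions.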
