
Spring 2011 Artificial Intelligence COSC 40503

Presentation Transcript


  1. Antonio Sanchez, Texas Christian University. Spring 2011 Artificial Intelligence COSC 40503

  2. The IF THEN contingency model In the early 1970s, Ed Feigenbaum at Stanford began working with a simple model to codify knowledge, a model so simple that people were initially skeptical about the results. The results were so successful, however, that the first true AI applications in industry took place, and Expert Systems made AI known outside the academic field. Basically, the idea behind the model is to represent the relation between two pieces of data as an implication: IF Antecedent THEN Consequent, A => C (here the concept of implication is the key).

  3. The IF THEN equivalents IF Antecedent THEN Consequent, A => C. The LHS (left-hand side) is the Antecedent; the RHS (right-hand side) is the Consequent.
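
A minimal Python sketch (illustrative names, not part of the original slides) of this representation: a rule pairs a set of antecedents (LHS) with a consequent (RHS) and applies when every antecedent is a known fact.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    antecedents: frozenset  # LHS: conditions that must all hold (A)
    consequent: str         # RHS: fact asserted when the LHS holds (C)

    def fires(self, facts: set) -> bool:
        # A => C: the rule applies when every antecedent is a known fact
        return self.antecedents <= facts

# IF wet AND cold THEN risk_of_ice
rule = Rule(frozenset({"wet", "cold"}), "risk_of_ice")
print(rule.fires({"wet", "cold", "dark"}))  # True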

  4. Expert System = Knowledge Base + Inference Engine A computer program that can advise, analyze, categorize, communicate, consult, design, diagnose, explain, explore, forecast, form concepts, identify, interpret, justify, learn, manage, monitor, plan, present, retrieve, schedule, test, and tutor. They address problems normally thought to require human specialists for their solution. (Michaelson, Michie, & Boulanger 1985)

  5. Expert Systems Components • User interface • Knowledge acquisition module • Knowledge base • Inference engine • Control strategy • Explanation facility • Diverse interfaces

  6. Expert System Architecture • Interface: a good I/O interface with the user; syntactic and semantic processing of queries • DB (Data Base): general data (facts) and the initial conditions for a consultation are stored here • KB (Knowledge Base): the IF THEN rules are stored here • Knowledge Acquisition: adds new IF/THEN rules to the KB • Inference: searches the KB and DB to solve a problem, with backward and forward tracking • Results: diagnosis or synthesis, plus answers to Why? and How? questions • DBA / KB Administration: user login, journaling, and metadata

  7. Operation There are basically four different processes when running an Expert System • Data Base Updating • Knowledge Acquisition • Queries • Why and How Questioning

  8. Data Base Updating • The DB is updated with the data, conditions, and facts for a given situation

  9. Knowledge Acquisition From Buchanan et al. (1983): “the transfer and transformation of potential problem solving expertise from some knowledge source to a program”

  10. Knowledge Acquisition • The KB is enhanced with new IF/THEN rules based on the expertise of the various experts in the field • Rules are organized in a certain fashion

  11. Backward chaining (DIAGNOSIS) • Begins with a proposed conclusion • Tries to match it with the “then” clauses of rules • Then looks at the corresponding “if” clauses • Tries to match those with assertions, or with the “then” clauses of other rules Source: www.csc.uvic.ca/~csc212/lec05/chapter14.ppt
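
A minimal backward-chaining sketch in Python (the (antecedents, consequent) rule format and the example facts are illustrative assumptions; cycle detection is omitted for brevity).

def backward_chain(goal, rules, facts):
    """Return True if `goal` can be proven from `facts` using `rules`."""
    if goal in facts:                          # the goal matches a known assertion
        return True
    for antecedents, consequent in rules:      # match the goal against "then" clauses
        if consequent == goal and all(
            backward_chain(a, rules, facts) for a in antecedents
        ):                                     # then try to prove each "if" clause
            return True
    return False

rules = [({"fever", "cough"}, "flu"), ({"flu"}, "rest_at_home")]
print(backward_chain("rest_at_home", rules, {"fever", "cough"}))  # True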

  12. Forward chaining (SYNTHESIS) • Begins with assertions and tries to match those assertions to “if” clauses of rules, thereby generating new assertions
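
The forward-chaining counterpart, again as an illustrative Python sketch using the same assumed rule format: rules whose “if” clauses are all satisfied fire and add their “then” clause as a new assertion, until nothing new can be derived.

def forward_chain(rules, facts):
    """Fire applicable rules repeatedly, returning the closure of the facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if set(antecedents) <= facts and consequent not in facts:
                facts.add(consequent)          # a new assertion is generated
                changed = True
    return facts

rules = [({"fever", "cough"}, "flu"), ({"flu"}, "rest_at_home")]
print(sorted(forward_chain(rules, {"fever", "cough"})))
# ['cough', 'fever', 'flu', 'rest_at_home']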

  13. Querying • From a query, the inference engine generates a hypothesis • It then tries to prove it, either backwards (diagnosis) or forwards (synthesis), using the Knowledge Base and the Data Base

  14. Why and How questions • To answer WHY: the system presents the proposed tree traversal (forward) • To answer HOW: it presents the tree traversal performed so far (backward)

  15. Rule exploring: node levels • Diagnosis: a possible explanation • Synthesis: a possible construction • LHS = Antecedents = Premises • RHS = Consequents = Conclusions (the original slide diagrams rules explored across node levels 1 through 7)

  16. The ATM Example: Information & Knowledge • On Synthesis: How many ATMs should there be available? It all depends… On what? The location, the income, the type of neighborhood, the season, the day, the time • On Diagnosis: Why are there 2 ATMs available on Friday in the morning? Well, because… Because of what? The location, the income, the type of neighborhood, the season, the day, the time • Here we have information & knowledge; however, there is more knowledge to consider, for example: How much money? How much security? In what order are the rules sorted? What is the rationale in rule 67, 81 or 95? How about adding more rules like 69, 83, 97, …?

  17. Search strategies in the KB • By knowledge area • By node level • By synonym • By complexity • By number of antecedents • By use

  18. Trend Setters • DENDRAL • MYCIN with EMYCIN & TEIRESIAS • INTERNIST • PROSPECTOR • R1/XCON

  19. Trend Setters • DENDRAL (Feigenbaum, Buchanan, Lederberg) • First expert system: analyzed mass spectrogram data to determine the geometric arrangement of atoms in a molecule. • It is in routine use by chemists, and has contributed to refereed journal publications.

  20. Trend Setters • MYCIN (Ted Shortliffe) • Knew about blood infections • In one study, its recommendations were judged preferable or equal to those of five experts. • INTERNIST (Harry Pople) • Broader medical expertise • In one study, it got 25 out of 43 diagnoses correct, compared to 28 for clinical physicians and 35 for experts.

  21. Trend Setters • MYCIN (Ted Shortliffe) • Knew about blood infections • In one study, its recommendations were judged preferable or equal to those of five experts. • Sample Rule: • IF: • (1) the stain of the organism is gram-positive, • AND (2) the morphology of the organism is coccus, • AND (3) the growth conformation of the organism is clumps, • THEN: • there is suggestive evidence (0.7) that the identity of the organism is staphylococcus.

  22. Trend Setters • PROSPECTOR (Hart and Duda) • It analyzed information from geological explorations. • It accurately identified the location and extent of ore-grade mineralization for a previously unknown molybdenum deposit. • R1 (XCON) (John McDermott) • It was routinely used to configure every VAX sold by DEC. • Over 99% of the configurations were reported to be accurate, and most of the rest usable with minor corrections. • Most errors were reportedly due to lack of product information on recent products. • Famous because it was an early large system, somewhere around 6,000 to 10,000 rules.

  23. From 1990 On • There were over 5,000 expert systems in existence • $2 billion a year business

  24. Classification and Applications (Source: Peter R. Gillett, Rutgers University)
  Task classification:
  • Diagnosis: hypothesizing a cause of a problem given some data points
  • Debugging and Repair: analyzing malfunctions and making recommendations
  • Interpretation: forming high-level conclusions from collections of raw data
  • Monitoring: comparing a system's observed behavior to its expected behavior
  • Control: governing the behavior of a complex environment
  • Design: finding configurations that meet some performance criteria or constraints
  • Planning
  • Instruction: detecting and correcting deficiencies in students' understanding of a subject
  Application areas:
  • Management: advice on management by objectives; selection and use of forecasting techniques; analysis of failing companies; scheduling of business trips and meetings
  • Human resources: matching personnel to jobs; arranging compensation packages
  • Finance and banking: stock portfolio management; asset-liability management; loan approvals and auditing
  • Production: fault diagnosis in networks and equipment; complex bidding in the construction industry
  • Accounting and auditing: estate planning and tax advice; executing and analyzing internal auditing; charging back costs in computer time-sharing; auditing advanced EDP systems; audit program development; internal control evaluation; risk analysis; tax accrual and deferral; disclosure compliance
  • Computers and Information Systems: data center evaluation; selection and maintenance of HS; software development; software selection; information transfer

  25. Expert System Details • Knowledge Acquisition • Knowledge Representation • Inference Resolution • Uncertainty • Limitations • Ontologies

  26. Knowledge Engineering • Knowledge acquisition • Knowledge elicitation • Knowledge representation • Production rules • Semantic networks • Frames

  27. Production Rules • Dominant paradigm for expert system applications • Especially where textbook knowledge or heuristics are applied, they can be a very natural representation • Can pose problems when the number of rules grows excessively large • A method for resolving rule conflicts is needed, as sketched below
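
One common conflict-resolution strategy, shown only as an illustrative Python sketch (the slides do not prescribe a particular method), is to prefer the most specific applicable rule, i.e. the one with the most antecedents satisfied.

def select_rule(rules, facts):
    """Among the rules whose antecedents all hold, pick the most specific one."""
    applicable = [r for r in rules if r[0] <= facts]   # r = (antecedents, consequent)
    return max(applicable, key=lambda r: len(r[0])) if applicable else None

rules = [({"fever"}, "infection"), ({"fever", "rash"}, "measles")]
print(select_rule(rules, {"fever", "rash"})[1])        # measles (the more specific rule)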

  28. Example of Rules
  PINEL [Palacios, Sanchez] (1985), Psychological Disorders [DIAGNOSTICS]
  Rule 1, Category: Schizophrenia
  IF There is incoherence (1/3, 1)
  AND Absence of systematic delirium (1/3, 1)
  AND Inappropriate affect (1/3, 1)
  THEN Disorganized Schizophrenia
  Rule 6, Category: Schizophrenia
  IF Lack of hygienic appearance (1/3, 1)
  AND Difficulties at job (1/3, 1)
  AND Difficulties in establishing a relationship (1/3, 1)
  THEN Schizophrenia Criteria B
  SEA [Vazquez, Flammand, Sanchez] (1984), Agricultural Expert System [SYNTHESIS]
  Rule 14
  IF % of clay Xr AND % of limus Xl AND % of sand < Xs
  THEN texture = Xt
  Rule 30
  IF PHKCl >= 4 AND PHKCl <= 5
  THEN quantity = 306.86 - 59.4 * PHKCl * %clay [in Kg/Ha of CaO]

  29. Example of Rules
  ZYANYA [Rodriguez, Sanchez] (1992), EDP Auditing [SYNTHESIS]
  R000
  IF The computer is located in a physically safe place
  AND The entrance has a secure access control
  AND Only authorized personnel with access codes are allowed to use the system
  THEN The physical security of the computer is probably adequate
  R033
  IF [ (Feasibility analysis was performed) AND (There is a quantifiable product) AND (NOT {there is a control element of the C,M,S type}) ]
  C058 THEN (It is observed that the execution of the study has contradicted the nature of the C,M,S element)
  R058
  IF (At least the programming phase was executed)
  C058 THEN (Proceed to evaluate the coherence of the methodology applied in the organization)
  N001 ACTIVATE NODE (Coherence of DLC methodology)
  SEBT [Gracian] (1985), Conveyor Design [SYNTHESIS]
  Rule 3, Category: Layers
  IF 14 <= Width < 20 AND Angle = 20
  THEN Consider 4 layers
  Rule 50, Category: Diameters
  IF 14 <= Width < 20 AND FPM <= 300
  THEN Diameter = 4

  30. Example of Rules
  WWTPS [Cabezut, Sanchez] (1995), Waste Water Treatment Plants [DIAGNOSTICS]
  IF (disposer and channel between Lagoon 4 & Zahuapán River is dirty)
  THEN inadequate flow between Lagoon 4 and Zahuapán
  IF (Lagoon 4 is calcium filled AND disposer of Lagoon 4 is clean)
  THEN shoulder strap water <= 2.60 m AND adequate load for Lagoon 4
  IF (L4EA AND L4AA AND TAA AND PHA AND EPL AND L3RADBO)
  THEN Adequate elimination of the biochemical oxygen demand in Lagoon 4
  AND L4RADBO = (1 - (DBO3 - DBO4)/DBO0)*100 >= 10 %

  31. When to use ES { // First Rule
  IF ( for SYNTHESIS: there IS NOT a function {y = f(x)} or analytic model that represents the problem
  OR for DIAGNOSIS: there IS NOT an inverse function {x = f(y)} or analytic model that represents the problem )
  AND the task is clearly definable
  AND a methodology has been delineated, and knowledge may be elicited and can be structured
  AND there is an interdisciplinary group of experts in the area
  AND the task is not trivial }
  THEN you use Expert Systems

  32. When to use uncertainty { // Second Rule
  IF the context universe of the application is UNKNOWN
  OR there is little knowledge
  OR there is a lack of data AND it is not easily obtainable
  OR the data available is statistical in nature
  OR the context universe of the application appears UNKNOWN due to the lack of IF/THEN rules }
  THEN you use Uncertainty

  33. Limitations • Difficulties in identifying suitable human experts for development • Difficulties in eliciting expertise from humans, who may have problems in articulating their expertise • Disagreements among experts • Consultations time-consuming relative to perceived value

  34. Inference Engines • Control Strategies • Forward chaining • Backward chaining • Search strategies • Depth first • Breadth first • Conflict resolution • RETE algorithm

  35. Inference under Uncertainty • Unreliable sources of information • Abundance of irrelevant data • Imprecision of language and perception • Lack of understanding • Hidden or unknown variables • Data difficult or expensive to obtain

  36. Reasoning and Uncertainty • Probability Theory • Bayesian Networks • Dempster-Shafer Theory • Certainty Factors • Approximate Reasoning • Fuzzy Logic • Sources of Uncertainty and Inexactness in Reasoning • Incorrect and Incomplete Knowledge • Ambiguities • Belief and Disbelief

  37. Bayesian Approaches • Use a Bayesian network, deriving the probability of a cause given a symptom • Especially useful in diagnostic systems • medicine, computer help systems • inverse, or a posteriori, probability • the conditional probability of an earlier event (the cause) given that a later one (the symptom) occurred
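
As a worked illustration (all numbers are made up), the a posteriori probability of a cause given a symptom follows from Bayes' rule, P(cause | symptom) = P(symptom | cause) P(cause) / P(symptom):

def posterior(p_symptom_given_cause, p_cause, p_symptom_given_not_cause):
    """P(cause | symptom) via Bayes' rule, with total probability in the denominator."""
    p_symptom = (p_symptom_given_cause * p_cause
                 + p_symptom_given_not_cause * (1 - p_cause))
    return p_symptom_given_cause * p_cause / p_symptom

# A cause with 1% prior probability, whose symptom appears in 90% of cases
# and in 10% of non-cases:
print(round(posterior(0.9, 0.01, 0.1), 3))   # 0.083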

  38. Advantages and Problems of Bayesian Reasoning • Advantages • sound theoretical foundation • well-defined semantics for decision making • Problems • requires large amounts of probability data • sufficient sample sizes • subjective evidence may not be reliable • independence-of-evidence assumption often not valid • relationship between hypothesis and evidence is reduced to a number • explanations for the user are difficult • high computational overhead

  39. Dempster-Shafer Theory • Mathematical theory of evidence • notations • frame of discernment FD • power set of the set of possible conclusions • mass probability function m • assigns a value from [0,1] to every item in the frame of discernment • mass probability m(A) • portion of the total mass probability that is assigned to an element A of FD

  40. Belief and Certainty • Belief Bel(A) in a subset A • sum of the mass probabilities of all the subsets of A (including A itself) • likelihood that one of its members is the conclusion • Plausibility Pl(A) • the maximum possible belief in A • Certainty Cer(A) • interval [Bel(A), Pl(A)] • expresses the range of belief Source: Dr. Franz J. Kurfess www.csc.calpoly.edu/~fkurfess/Courses/480/F03/Slides/
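
A small Python sketch of these quantities, using the standard definitions Bel(A) = sum of m(B) over all subsets B of A and Pl(A) = sum of m(B) over all B that intersect A (the frame of discernment and the mass values are illustrative):

def bel(a, masses):
    """Belief: total mass committed to subsets of A."""
    return sum(m for b, m in masses.items() if b <= a)

def pl(a, masses):
    """Plausibility: total mass not in conflict with A (sets intersecting A)."""
    return sum(m for b, m in masses.items() if b & a)

masses = {                                        # illustrative mass assignment
    frozenset({"flu"}): 0.5,
    frozenset({"flu", "cold"}): 0.25,
    frozenset({"flu", "cold", "allergy"}): 0.25,  # mass left on the whole frame
}
a = frozenset({"flu", "cold"})
print(bel(a, masses), pl(a, masses))   # 0.75 1.0 -> certainty interval [0.75, 1.0]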

  41. Advantages and Problems of Dempster-Shafer • Advantages • clear, rigorous foundation • ability to express confidence through intervals • certainty about certainty • Problems • non-intuitive determination of mass probability • very high computational overhead • may produce counterintuitive results due to normalization • usability somewhat unclear Source: Dr. Franz J. Kurfess www.csc.calpoly.edu/~fkurfess/Courses/480/F03/Slides/

  42. Certainty Factors • Shares some foundations with Dempster-Shafer theory, but is more practical • Denotes the belief in a hypothesis H given that some pieces of evidence are observed • Makes no statement about the belief if no evidence is present • in contrast to Bayes’ method Source: Dr. Franz J. Kurfess www.csc.calpoly.edu/~fkurfess/Courses/480/F03/Slides/

  43. Belief and Disbelief • measure of belief MB(H,E) • degree to which hypothesis H is supported by evidence E • MB(H,E) = 1 if P(H) = 1; otherwise (P(H|E) - P(H)) / (1 - P(H)) • measure of disbelief MD(H,E) • degree to which doubt in hypothesis H is supported by evidence E • MD(H,E) = 1 if P(H) = 0; otherwise (P(H) - P(H|E)) / P(H)
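
An illustrative Python sketch of these two measures (the probabilities are made up, and the clamp to zero is an added assumption for the case where the evidence points against the measure in question):

def mb(p_h, p_h_given_e):
    """Measure of belief in H given E; 0 when the evidence lowers P(H)."""
    return 1.0 if p_h == 1 else max(0.0, (p_h_given_e - p_h) / (1 - p_h))

def md(p_h, p_h_given_e):
    """Measure of disbelief in H given E; 0 when the evidence raises P(H)."""
    return 1.0 if p_h == 0 else max(0.0, (p_h - p_h_given_e) / p_h)

# Evidence raises P(H) from 0.3 to 0.8, so it supports H: MB > 0 and MD = 0.
print(round(mb(0.3, 0.8), 2), md(0.3, 0.8))   # 0.71 0.0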

  44. Certainty Factor • Certainty factor CF • ranges between -1 (denial of the hypothesis H) and 1 (confirmation of H) • CF = (MB - MD) / (1 - min(MB, MD)) Example in PINEL IF There is incoherence (MD = 1/3, MB = 1) AND Absence of systematic delirium (MD = 1/3, MB = 1) AND Inappropriate affect (MD = 1/3, MB = 1) THEN (IF Min(MB) - Max(MD) > Threshold (0.2) ==> Disorganized Schizophrenia)
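
Combining the measures above into a certainty factor, as an illustrative sketch (the MB and MD values echo the PINEL rule; checking the resulting CF against the 0.2 threshold is a simplification of the Min(MB) - Max(MD) test shown on the slide):

def certainty_factor(mb_value, md_value):
    """CF = (MB - MD) / (1 - min(MB, MD)), ranging from -1 to 1."""
    return (mb_value - md_value) / (1 - min(mb_value, md_value))

cf = certainty_factor(1.0, 1/3)          # MB = 1, MD = 1/3 as in the PINEL clauses
print(round(cf, 2), cf > 0.2)            # 1.0 True -> conclusion accepted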

  45. Certainty Factor (Source: http://www.expertise2go.com/webesie/tutorials/ESGloss.htm)

  46. Expert System Tools • Support prototyping • Shells • High-level programming languages • Multiple-paradigm programming environments (e.g., KEE, ART, CLIPS, JESS)
