
COMP 4200: Expert Systems


Presentation Transcript


  1. COMP 4200: Expert Systems Dr. Christel Kemke Department of Computer Science University of Manitoba

  2. Reasoning in Expert Systems • knowledge representation in Expert Systems • shallow and deep reasoning • forward and backward reasoning • alternative inference methods • metaknowledge

  3. Experts and Expert Systems • human experts achieve high performance because of extensive knowledge concerning their field • this knowledge is generally developed over many years • expert performance depends on expert knowledge!

  4. Types of Knowledge Knowledge Representation in XPS can include: • conceptual knowledge – terminology, domain-specific terms • derivative knowledge – conclusions drawn from facts • causal connections – a causal model of the domain • procedural knowledge – guidelines for actions

  5. Knowledge Modeling in XPS Knowledge modeling technique in XPS: • mostly rule-based systems (RBS) • a rule system models elements of knowledge, each formulated independently as a rule • the rule set is easy to expand • often only limited ‘deep’ knowledge, i.e. no explicit coherent causal or functional model of the domain

  6. Shallow and Deep Reasoning • shallow reasoning • also called “experiential reasoning” • aims at describing aspects of the world heuristically • short inference chains • complex rules • deep reasoning • also called causal reasoning • aims at building a model that behaves like the “real thing” • long inference chains • simple rules that describe cause and effect relationships

  7. Dilbert on Reasoning 1

  8. Dilbert on Reasoning 2

  9. Dilbert on Reasoning 3

  10. General Technology of XPS Knowledge + Inference • the core of an XPS • most often Rule-Based Systems (RBS) • other forms: Neural Networks, Case-Based Reasoning

  11. Rule-Based Expert Systems Work with • a set of facts describing the current world state • a set of rules describing the expert knowledge • inference mechanisms for combining facts and rules in reasoning
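
One minimal way to picture these three ingredients is the illustrative Python sketch below (not CLIPS or any specific shell): facts as a set of symbols, and rules as (name, conditions, conclusion) triples; the inference mechanism itself is sketched after slide 14.

```python
# Illustrative representation only (not CLIPS syntax):
# Working Memory is a set of symbols, and each rule pairs
# its list of conditions with a single conclusion.
facts = {"awful_weather"}

rules = [
    ("Rule 6", ["awful_weather"], "stay_home"),   # taken from the 'Grades' example below
]
```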

  12. Architecture of a Rule-Based XPS (diagram) – components: Inference Engine, Knowledge Base (rules), Working Memory (facts), Agenda, Explanation Facility, Knowledge Acquisition Facility, User Interface

  13. Architecture of Rule-Based XPS 1 Knowledge-Base / Rule-Base • stores expert knowledge as “condition-action-rules” (or: if-then- or premise-consequence-rules) • objects or frame structures are often used to represent concepts in the domain of expertise, e.g. “club” in the golf domain. Working Memory • stores initial facts and facts generated by the inference engine • additional parameters like the “degree of trust” in the truth of a fact or a rule (certainty factors) or probabilistic measurements can be added

  14. Architecture of Rule-Based XPS 2 Inference Engine • matches the condition-part of rules against facts stored in Working Memory (pattern matching) • rules with satisfied conditions are active rules and are placed on the agenda • among the active rules on the agenda, one is selected (see conflict resolution, priorities of rules) as the next rule for execution (“firing”) • the consequence of the fired rule can add new facts to Working Memory, modify facts, retract facts, and more
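
This match / agenda / fire cycle can be sketched in a few lines of Python, assuming the toy fact/rule representation shown after slide 11; the name forward_chain and the “take the first active rule” selection are illustrative stand-ins, not the conflict-resolution strategies of a real shell.

```python
def forward_chain(facts, rules):
    """Repeat match -> select -> fire until no rule can add a new fact."""
    facts = set(facts)
    while True:
        # match: rules whose conditions all hold and whose conclusion is
        # not yet in Working Memory form the agenda of active rules
        agenda = [(name, conds, concl) for (name, conds, concl) in rules
                  if all(c in facts for c in conds) and concl not in facts]
        if not agenda:
            return facts                      # nothing left to fire
        # select: here simply the first active rule (a placeholder for
        # conflict resolution / rule priorities, see the next slide)
        name, conds, concl = agenda[0]
        # fire: the rule's consequence adds a new fact to Working Memory
        facts.add(concl)
```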

  15. Architecture of Rule-Based XPS 3 Inference Engine + additional components might be necessary for other functions, like • calculation of certainty values • determination of priorities of rules • conflict resolution mechanisms • a truth maintenance system (TMS) if reasoning with defaults and beliefs is required

  16. Rule-Based Systems – Example ‘Grades’ Rules to determine the ‘grade’: • 1. study → good_grade • 2. not_study → bad_grade • 3. sun_shines → go_out • 4. go_out → not_study • 5. stay_home → study • 6. awful_weather → stay_home

  17. Example ‘Grades’ Rule-Base to determine the ‘grade’: • 1. study → good_grade • 2. not_study → bad_grade • 3. sun_shines → go_out • 4. go_out → not_study • 5. stay_home → study • 6. awful_weather → stay_home Q1: If the weather is awful, do you get a good or bad grade? Q2: When do you get a good grade?
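
Written in the toy representation of the earlier sketches, the six rules above become the following list (grades_rules is an illustrative name; not_study is treated as an ordinary symbol, exactly as on the slide, not as real negation):

```python
grades_rules = [
    ("Rule 1", ["study"], "good_grade"),
    ("Rule 2", ["not_study"], "bad_grade"),
    ("Rule 3", ["sun_shines"], "go_out"),
    ("Rule 4", ["go_out"], "not_study"),
    ("Rule 5", ["stay_home"], "study"),
    ("Rule 6", ["awful_weather"], "stay_home"),
]
```

Q1 is then a forward-reasoning question starting from the fact awful_weather (slides 18–22), while Q2 is a backward-reasoning question starting from the goal good_grade.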

  18. Forward and Backward Reasoning forward reasoning • Facts are given. What is the conclusion? A set of known facts is given (in WM); apply rules to derive new facts as conclusions (forward chaining of rules) until you come up with a requested final goal fact. backward reasoning • Hypothesis (goal) is given. Is it supported by facts? A hypothesis (goal fact) is given; try to derive it based on a set of given initial facts using sub-goals (backward chaining of rules) until goal is grounded in initial facts.
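
A backward-chaining counterpart to the forward_chain sketch, again purely illustrative: a goal counts as proved if it is an initial fact, or if some rule concludes it and all of that rule’s conditions can be proved as sub-goals (the seen set guards against cyclic sub-goals).

```python
def backward_chain(goal, facts, rules, seen=frozenset()):
    """Can `goal` be derived from the initial `facts` via sub-goals?"""
    if goal in facts:                     # goal is grounded in an initial fact
        return True
    if goal in seen:                      # avoid looping on cyclic sub-goals
        return False
    for _name, conditions, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(c, facts, rules, seen | {goal})
                for c in conditions):
            return True                   # every sub-goal of this rule holds
    return False
```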

  19. Example ‘Grades’ • 1. study → good_grade • 2. not_study → bad_grade • 3. sun_shines → go_out • 4. go_out → not_study • 5. stay_home → study • 6. awful_weather → stay_home forward reasoning – rule chain for the given fact awful_weather: 6, 5, 1 backward reasoning – rule chain for the hypothesis/goal good_grade: 1, 5, 6

  20. Example ‘Grades’ – Reasoning Tree (diagram): good grade ← study ← stay home ← awful weather; bad grade ← not study ← go out ← sun shines

  21. Example – Grades (Working Memory / Agenda trace) Step 1: WM: awful weather; Agenda: Rule 6; select and apply Rule 6. Step 2: WM: awful weather, stay home; Agenda: Rule 5; select and apply Rule 5.

  22. Example – Grades (Working Memory / Agenda trace, continued) Step 3: WM: awful weather, stay home, study; Agenda: Rule 1; select and apply Rule 1. Step 4: WM: awful weather, stay home, study, good grade; Agenda: empty; DONE!
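
Using the illustrative sketches from slides 11, 14, 17 and 18, this trace can be reproduced; forward_chain happens to fire Rule 6, Rule 5 and Rule 1 in exactly this order because no other rule ever becomes active.

```python
final_facts = forward_chain({"awful_weather"}, grades_rules)
print("good_grade" in final_facts)    # True -> Q1: awful weather leads to a good grade
print(backward_chain("good_grade", {"awful_weather"}, grades_rules))   # True
```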

  23. Example ‘Police’ – Reasoning Tree forward reasoning: Shield AND Pistol → Police; backward reasoning: Police ← Badge AND Gun (tree diagram: Police ← Badge AND Gun; Badge ← Shield; Gun ← Pistol OR Revolver; a ‘Bad Boy’ node is also shown) Q: What if only ‘Gun’ is known?
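
In the rule notation used for the ‘Grades’ example, the AND node of this tree corresponds to one rule with two conditions, and the OR node to two rules with the same conclusion. The sketch below is one possible reading of the diagram (rule names are invented; the ‘Bad Boy’ branch is omitted because its rule is not spelled out in the transcript):

```python
police_rules = [
    ("P1", ["badge", "gun"], "police"),   # AND node: Badge AND Gun -> Police
    ("P2", ["shield"], "badge"),          # Shield -> Badge
    ("P3", ["pistol"], "gun"),            # OR node: either Pistol ...
    ("P4", ["revolver"], "gun"),          #          ... or Revolver yields Gun
]
```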

  24. Example ‘Police’ – Reasoning Tree (same diagram as slide 23) Q: What if only ‘Pistol’ is known as a ground fact?

  25. Example ‘Police’ – Reasoning Tree (same diagram as slide 23) Task: Write down the Rule-Base for this example!

  26. Forward vs. Backward Chaining

  27. Alternative Reasoning Methods • Theorem Proving • emphasis on mathematical proofs and correctness, not so much on performance and ease of use • Probabilistic Reasoning • integrates probabilities into the reasoning process • Certainty Factors • express a subjective assessment of the truth of a fact or rule • Fuzzy Reasoning • allows the use of vaguely defined predicates and rules
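
As a concrete illustration of the Certainty Factors bullet, the standard MYCIN-style combination formulas can be sketched as below; these are textbook formulas (shown here only for positive certainty factors), not something defined in these slides.

```python
def combine_parallel(cf1, cf2):
    """Two rules support the same conclusion with positive CFs:
    the combined belief increases but never exceeds 1."""
    return cf1 + cf2 * (1 - cf1)

def conclusion_cf(rule_cf, premise_cfs):
    """CF of a rule's conclusion: the rule's own CF scaled by the
    weakest conjunct of its premise."""
    return rule_cf * min(premise_cfs)

print(combine_parallel(0.6, 0.5))        # approximately 0.8
print(conclusion_cf(0.9, [0.7, 0.4]))    # approximately 0.36
```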

  28. Metaknowledge • deals with “knowledge about knowledge” • e.g. reasoning about properties of knowledge representation schemes, or inference mechanisms • usually relies on higher order logic • in (first order) predicate logic, quantifiers are applied to variables • second-order predicate logic allows the use of quantifiers for function and predicate symbols • may result in substantial performance problems • CLIPS uses meta-knowledge to define itself, i.e. CLIPS constructs, classes, etc. - in a bootstrapping form
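
A standard textbook example of the second-order quantification mentioned above (not taken from the slides) is Leibniz’s definition of equality, which quantifies over every predicate P and therefore cannot be stated in first-order logic:

```latex
% x and y are equal iff they satisfy exactly the same predicates
x = y \;\Leftrightarrow\; \forall P \,\bigl( P(x) \leftrightarrow P(y) \bigr)
```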
