
Concurrent Reasoning with Inference Graphs

A presentation on concurrent natural deduction for multi-core computers, with an application to natural language understanding for terrorist plot detection. It covers propositional graphs, inference graphs, and rule node inference.


Presentation Transcript


1. Concurrent Reasoning with Inference Graphs Daniel R. Schlegel and Stuart C. Shapiro Department of Computer Science and Engineering, University at Buffalo, The State University of New York, Buffalo, New York, USA <drschleg,shapiro>@buffalo.edu

2. Problem Statement: A Motivation • Rise of multi-core computers • Lack of concurrent natural deduction systems • Application to natural language understanding for terrorist plot detection.

3. What are Inference Graphs? • Graphs for natural deduction • Four types of inference: • Forward • Backward • Bi-directional • Focused • Retain derived formulas for later re-use. • Propagate disbelief. • Built upon Propositional Graphs. • Take advantage of multi-core computers • Concurrency and scheduling • Near-linear speedup.

4. Propositional Graphs • Directed acyclic graph • Every well-formed expression is a node • Individual constants • Functional terms • Atomic formulas • Non-atomic formulas (“rules”) • Each node has an identifier, either • Symbol, or • wfti[!] • No two nodes with same identifier.

5. Propositional Graphs [Graph: wft1! has and-ant arcs to a and b, and a cq arc to c.] If a and b are true, then c is true.
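
The node and arc structure on slides 4 and 5 can be made concrete with a small data-structure sketch. This is a minimal Python illustration, not the authors' implementation; the class and method names are invented for the example. It reflects the two properties from slide 4: every well-formed expression is a node, and no two nodes share an identifier.

```python
# Minimal sketch of a propositional graph as a DAG of interned nodes.
# Illustrative only: class/method names are assumptions, not the authors' API.
class PropositionalGraph:
    def __init__(self):
        self.nodes = {}          # identifier -> node; guarantees unique identifiers
        self._wft_counter = 0

    def term(self, symbol):
        """Return the node for a constant or atomic symbol, creating it once."""
        return self.nodes.setdefault(symbol, {"id": symbol, "arcs": {}})

    def wft(self, arcs, asserted=False):
        """Create a non-atomic ("rule") node named wftN, with '!' if asserted.
        `arcs` maps arc labels (e.g. 'and-ant', 'cq') to lists of nodes."""
        self._wft_counter += 1
        ident = "wft%d%s" % (self._wft_counter, "!" if asserted else "")
        node = {"id": ident, "arcs": arcs}
        self.nodes[ident] = node
        return node

# The graph from slide 5: wft1! with and-ant arcs to a and b and a cq arc to c.
g = PropositionalGraph()
a, b, c = g.term("a"), g.term("b"), g.term("c")
wft1 = g.wft({"and-ant": [a, b], "cq": [c]}, asserted=True)
```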

6. Inference Graphs • Extend Propositional Graphs • Add channels for information flow (messages): • i-channels report truth of an antecedent to a rule node. • u-channels report truth of a consequent from a rule node. • Channels contain valves. • Hold messages back, or allow them through. [Graph: i-channels run from a and b to wft1!, and a u-channel runs from wft1! to c, alongside the and-ant and cq arcs.]
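
As a rough illustration of how a channel with a valve might behave, here is a minimal Python sketch. The class, its fields, and the delivery callback are assumptions for the example, not the authors' code.

```python
# Minimal sketch: a channel holds messages behind a closed valve and releases
# them when the valve opens (e.g. in response to backward inference).
from collections import deque

class Channel:
    def __init__(self, originator, destination, kind):
        self.originator = originator    # e.g. antecedent node "a" for an i-channel
        self.destination = destination  # e.g. rule node "wft1!"
        self.kind = kind                # "i-channel" or "u-channel"
        self.valve_open = False
        self.held = deque()             # messages waiting behind a closed valve

    def send(self, message, deliver):
        """Pass the message on if the valve is open; otherwise hold it back."""
        if self.valve_open:
            deliver(self.destination, message)
        else:
            self.held.append(message)

    def open_valve(self, deliver):
        """Open the valve and release any held messages to the destination."""
        self.valve_open = True
        while self.held:
            deliver(self.destination, self.held.popleft())

# Usage: a message from a to wft1! waits until the valve is opened.
inbox = []
ch = Channel("a", "wft1!", "i-channel")
ch.send(("I-INFER", "a", True), deliver=lambda dest, msg: inbox.append((dest, msg)))
ch.open_valve(deliver=lambda dest, msg: inbox.append((dest, msg)))
assert inbox == [("wft1!", ("I-INFER", "a", True))]
```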

7. Messages • Five kinds • I-INFER – “I’ve been inferred” • U-INFER – “You’ve been inferred” • BACKWARD-INFER – “Open valves so I might be inferred” • CANCEL-INFER – “Stop trying to infer me (close valves)” • UNASSERT – “I’m no longer believed”

8. Priorities • Messages have priorities. • UNASSERT is top priority • CANCEL-INFER is next • I-INFER/U-INFER are higher priority closer to a result • BACKWARD-INFER is lowest
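
A small Python sketch of the five message kinds and the priority ordering described on slides 7 and 8. The numeric base values and the depth-based bonus are assumptions chosen only to illustrate the ordering, not the authors' actual scheme.

```python
# Illustrative message representation: larger tuple = higher priority.
# The depth bonus models "I-INFER/U-INFER are higher priority closer to a result".
from dataclasses import dataclass

BASE_PRIORITY = {
    "UNASSERT": 4,         # top priority: retract disbelieved results first
    "CANCEL-INFER": 3,     # next: stop inference that is no longer needed
    "I-INFER": 2,
    "U-INFER": 2,
    "BACKWARD-INFER": 1,   # lowest: requests to open valves
}

@dataclass
class Message:
    kind: str                      # one of the five kinds above
    about: str                     # the node the message concerns, e.g. "a"
    value: bool = True             # truth value carried by I-INFER/U-INFER
    depth_toward_result: int = 0   # how close this step is to the final result

    @property
    def priority(self):
        bonus = self.depth_toward_result if self.kind in ("I-INFER", "U-INFER") else 0
        return (BASE_PRIORITY[self.kind], bonus)

# Example: an UNASSERT always outranks an I-INFER, regardless of depth.
assert Message("UNASSERT", "c").priority > Message("I-INFER", "a", depth_toward_result=5).priority
```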

9. Rule Node Inference 1. Message arrives at node. Assume we have a KB with a ^ b -> c, and b. Then a is asserted with forward inference, and an i-infer message (a : true) is sent from a to wft1.

10. Rule Node Inference 2. Message is translated to Rule Use Information (RUI): 1 positive antecedent (a), 0 negative antecedents, 2 total antecedents. Rule Use Information is stored in rule nodes to be combined later with others that arrive.

11. Rule Node Inference 3. Combine RUIs with any existing ones. The RUI for a (1 positive antecedent: a; 0 negative; 2 total) is combined with the one which already exists in wft1 for b (1 positive antecedent: b; 0 negative; 2 total), giving 2 positive antecedents (a, b), 0 negative antecedents, 2 total antecedents.

12. Rule Node Inference 4. Determine if the rule can fire. We have two positive antecedents (a, b) and we need two, so the rule can fire.

13. Rule Node Inference 5. Send out new messages. A u-infer message (c : true) is sent to c, which will receive the message and assert itself.
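
The five steps above can be summarized in a short Python sketch of Rule Use Information for a conjunctive (and-entailment) rule. The class and its methods are illustrative assumptions, not the authors' implementation.

```python
# Illustrative RUI handling for wft1 (a ^ b -> c): build an RUI from a message,
# combine it with stored RUIs, test whether the rule can fire, and send u-infer.
from dataclasses import dataclass

@dataclass(frozen=True)
class RUI:
    positive: frozenset   # antecedents known to be true
    negative: frozenset   # antecedents known to be false
    total: int            # total number of antecedents of the rule

    def combine(self, other):
        """Combining RUIs unions what each knows about the antecedents (step 3)."""
        return RUI(self.positive | other.positive,
                   self.negative | other.negative,
                   self.total)

    def can_fire(self):
        """An and-entailment rule fires once all antecedents are positive (step 4)."""
        return len(self.positive) == self.total

rui_a = RUI(frozenset({"a"}), frozenset(), 2)   # step 2: from the i-infer about a
rui_b = RUI(frozenset({"b"}), frozenset(), 2)   # already stored in wft1 for b
combined = rui_a.combine(rui_b)
if combined.can_fire():                         # 2 positive of 2 total
    print("u-infer c : true")                   # step 5: c will assert itself
```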

14. Cycles • Graphs may contain cycles. • No rule node will infer on the same message more than once. • RUIs with no new information are ignored. • Already open valves can’t be opened again. [Graph: a cyclic example in which wft1! and wft2! are connected through ant and cq arcs via a and b.]
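
One way to picture the "never infer on the same message more than once" rule is a per-node memory of processed messages; this is a minimal sketch under that assumption, not the authors' mechanism.

```python
# Illustrative guard against looping in cyclic graphs: a node ignores any
# message it has already processed, so messages cannot circulate forever.
class NodeMessageMemory:
    def __init__(self):
        self.seen = set()

    def should_process(self, message_key):
        """Return True only the first time a given message is seen at this node."""
        if message_key in self.seen:
            return False
        self.seen.add(message_key)
        return True

memory = NodeMessageMemory()
assert memory.should_process(("I-INFER", "a", True))        # first arrival: infer
assert not memory.should_process(("I-INFER", "a", True))    # came back around a cycle: ignore
```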

15. Concurrency and Scheduling • Inference Segment: the area between two valves. • When messages reach a valve: • A task is created with the same priority as the message. • Task: application of the segment’s function to the message. • Task is added to a prioritized queue. • Tasks have minimal shared state, easing concurrency.

16. Concurrency and Scheduling • A task only operates within a single segment. • Tasks for relaying newly derived information using segments “later” in the derivation are executed before “earlier” ones. • Once a node is known to be true or false, all tasks attempting to derive it are canceled, as long as their results are not needed elsewhere.
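
A minimal sketch of the prioritized task queue described on slides 15 and 16, assuming a simple heap-based scheduler; the class and its cancellation bookkeeping are illustrative, not the authors' scheduler.

```python
# Illustrative prioritized task queue: tasks inherit the priority of the message
# that created them, higher priorities run first, and tasks trying to derive a
# node whose truth value is already known can be cancelled.
import heapq
import itertools

class TaskQueue:
    def __init__(self):
        self._heap = []
        self._order = itertools.count()   # tie-breaker for equal priorities
        self._cancelled = set()           # nodes whose derivation is no longer needed

    def add(self, priority, target_node, task_fn):
        # heapq is a min-heap, so negate the priority to run larger values first
        heapq.heappush(self._heap, (-priority, next(self._order), target_node, task_fn))

    def cancel_tasks_for(self, node):
        """Called once `node` is known true/false and its tasks are not needed elsewhere."""
        self._cancelled.add(node)

    def run_next(self):
        """Run the highest-priority task that has not been cancelled."""
        while self._heap:
            _, _, node, task_fn = heapq.heappop(self._heap)
            if node not in self._cancelled:
                return task_fn()
        return None
```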

17. Example Backchain on cq. Assume every node requires a single one of its incoming nodes to be true for it to be true (simplified for easy viewing). Two processors will be used.

18.–35. Example (animation frames) These slides step through the backchaining example graphically: backward inference opens valves, and nodes are marked Inferred, Inferring, or Cancelled as the two processors work through the derivation of cq.

36. Evaluation • Concurrency: • Near-linear performance improvement with the number of processors • Performance robust to graph depth and branching factor changes. • Scheduling Heuristics: • Backward inference with or-entailment shows a 10x improvement over LIFO queues, and 20-40x over FIFO queues.

37. Acknowledgements This work has been supported by a Multidisciplinary University Research Initiative (MURI) grant (Number W911NF-09-1-0392) for Unified Research on Network-based Hard/Soft Information Fusion, issued by the US Army Research Office (ARO) under the program management of Dr. John Lavery.
