
I1.2 A Quality-of-Information Theory for Sensor Data Collection and Fusion



Presentation Transcript


  1. I1.2 A Quality-of-Information Theory for Sensor Data Collection and Fusion Abdelzaher (UIUC)

  2. This Talk: Towards a QoI Theory for Data Fusion from Sensors + Information Network Links
  • Fusion of hard sources (methods: Bayesian analysis, maximum-likelihood estimation, etc.)
  • Fusion of text and images (methods: transfer knowledge, CCM, etc.)
  • Fusion of soft sources (methods: ranking, clustering, etc.)
  • Fusion from human sources (methods: fact-finding, influence analysis, etc.)
  [Figure: these problems span signal data fusion, information network analysis, machine learning, and trust/social networks, over sensors, reports, and human sources]

  3. Sensor Fusion Example: Target Classification
  • Different sensors (of known reliability, false-alarm rates, etc.) are used to classify targets
  • A well-developed theory exists to combine possibly conflicting sensor measurements to accurately estimate target attributes: Bayesian analysis, maximum likelihood, Kalman filters, etc.
  [Figure: a target observed by vibration sensors, an infrared motion sensor, and acoustic sensors]
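The combination rule this slide appeals to can be sketched concretely. Below is a minimal naive-Bayes fusion of independent binary detectors, assuming known per-sensor detection and false-alarm rates; the function name and the numeric rates are illustrative, not taken from the talk.

```python
import math

def fuse_detections(readings, p_detect, p_false_alarm, prior=0.5):
    """Naive-Bayes fusion of independent binary sensor readings.

    readings[i] is True if sensor i reported a target.
    p_detect[i]      = P(report | target present) for sensor i
    p_false_alarm[i] = P(report | target absent)  for sensor i
    Returns P(target present | all readings).
    """
    log_odds = math.log(prior / (1.0 - prior))
    for r, pd, pf in zip(readings, p_detect, p_false_alarm):
        if r:
            log_odds += math.log(pd / pf)            # sensor reported a target
        else:
            log_odds += math.log((1.0 - pd) / (1.0 - pf))  # sensor stayed silent
    return 1.0 / (1.0 + math.exp(-log_odds))

# Two detections and one miss from three moderately reliable sensors:
p = fuse_detections([True, True, False],
                    p_detect=[0.9, 0.8, 0.7],
                    p_false_alarm=[0.1, 0.2, 0.3])
```

Working in log-odds keeps the per-sensor evidence additive, which is the property the later duality argument relies on.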

  4. Information Network Mining Example: Fact-finding
  • Example 1: consider a graph of who published where (but no prior knowledge of these individuals and conferences); rank conferences and authors by importance in their field. [Figure: authors Han, Roth, and Abdelzaher linked to venues KDD, Fusion, WWW, and SenSys]
  • Example 2: consider a graph of who said what (sources and assertions, but no prior knowledge of their credibility); rank sources and assertions by credibility. [Figure: sources John, Mike, and Sally linked to Claim1–Claim4]
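Example 2's credibility ranking can be illustrated with the simplest iterative fact-finder, a Sums / hubs-and-authorities style scheme; this is a generic sketch of that family, not necessarily the authors' exact algorithm.

```python
def simple_factfinder(links, n_iters=20):
    """Simplest iterative fact-finder (Sums / hubs-and-authorities style).

    links maps each source to the set of claims it asserts. Claim belief
    is the sum of its sources' credibility; source credibility is the sum
    of its claims' belief; each round is normalized to keep scores bounded.
    No prior knowledge of sources or claims is assumed.
    """
    sources = list(links)
    claims = sorted({c for cs in links.values() for c in cs})
    cred = {s: 1.0 for s in sources}
    belief = {c: 0.0 for c in claims}
    for _ in range(n_iters):
        for c in claims:
            belief[c] = sum(cred[s] for s in sources if c in links[s])
        top = max(belief.values())
        belief = {c: b / top for c, b in belief.items()}
        for s in sources:
            cred[s] = sum(belief[c] for c in links[s])
        top = max(cred.values())
        cred = {s: t / top for s, t in cred.items()}
    return cred, belief

# The who-said-what graph from Example 2:
cred, belief = simple_factfinder({
    "John":  {"Claim1", "Claim2"},
    "Mike":  {"Claim2", "Claim3"},
    "Sally": {"Claim4"},
})
```

Claims corroborated by more (and more credible) sources rise to the top, with no prior knowledge of any individual source.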

  5. The Challenge
  • How to combine information from sensors and information network links to offer a rigorous quantification of QoI (e.g., correctness probability) with minimal prior knowledge?
  [Figure: the target-classification sensors (vibration, infrared motion, acoustic) plus the source-claim network (John, Mike, and Sally asserting Claim1–Claim4), combined to answer P(armed convoy) = ?]

  6. Applications
  • Understand civil unrest: remote situation assessment using Twitter feeds, news, cameras, …
  • Expedite disaster recovery: damage assessment and first response using sensor feeds, eyewitness reports, …
  • Reduce traffic congestion: mapping traffic congestion in a city using crowd-sourcing (of cell-phone GPS measurements), speed sensor readings, eyewitness reports, …

  7. Approach: Back to the Basics • Interpret the simplest fact-finder as a classical (Bayesian) sensor fusion problem • Identify the duality between information link analysis and Bayesian sensor fusion (links = sensor readings) • Use that duality to quantify probability of correctness of fusion (i.e., information link analysis) results • Incrementally extend analysis to more complex information network models and mining algorithms

  8. An Interdisciplinary Team
  • Abdelzaher (QoI, sensor fusion)
  • Roth (fact-finders, machine learning)
  • Aggarwal, Han (data mining, veracity analysis)
  [Figure: QoI Task I1.2 bridging Fusion Task I1.1 and QoI Mining Task I3.1]

  9. The Bayesian Interpretation
  • The simplest fact-finder: [equation not captured; figure shows sources John, Mike, and Sally linked to Claim1–Claim4]
  • The simplest Bayesian classifier (naïve Bayes): [equation not captured]
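Slide 9's equations were embedded as images and did not survive transcription. For reference, the standard forms of the two objects it names (a reconstruction, not necessarily the authors' exact notation) are:

```latex
% Simplest iterative fact-finder: belief in claim c and credibility of
% source s reinforce each other (S_c = sources asserting c, C_s = claims of s):
B_i(c) \;=\; \sum_{s \in S_c} T_{i-1}(s),
\qquad
T_i(s) \;=\; \sum_{c \in C_s} B_i(c)

% Simplest Bayesian classifier (naive Bayes) over observations x_1, ..., x_n:
P(C \mid x_1, \dots, x_n) \;\propto\; P(C) \prod_{k=1}^{n} P(x_k \mid C)
```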

  10. The Equivalence Condition
  • We know that for a sufficiently small x_k: [equation not captured]
  • Consider individually unreliable sensors: [equation not captured]

  11. A Bayesian Fact-finder
  • By duality, if: [equation not captured]
  • Then, Bayes' theorem eventually leads to: [equation not captured]
  • and: [equation not captured]
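The equations on slides 10 and 11 are likewise missing. The duality argument they outline can be sketched under standard naive-Bayes assumptions (a reconstruction, not the authors' exact derivation): the naive-Bayes log-odds decompose into a sum of per-sensor log-likelihood ratios, and for individually unreliable sensors each ratio is close to one, so writing it as 1 + x_k with |x_k| small and using ln(1 + x_k) ≈ x_k gives

```latex
\log\frac{P(C_1 \mid x_1,\dots,x_n)}{P(C_0 \mid x_1,\dots,x_n)}
  \;=\; \log\frac{P(C_1)}{P(C_0)}
  \;+\; \sum_{k=1}^{n} \log\frac{P(x_k \mid C_1)}{P(x_k \mid C_0)}
  \;\approx\; \log\frac{P(C_1)}{P(C_0)} \;+\; \sum_{k=1}^{n} x_k
```

The approximate log-odds are a linear sum over per-link terms, which is exactly the form of the simplest fact-finder's credibility sums; equating the two is what lets Bayes' theorem attach a correctness probability to the fact-finder's output.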

  12. Fusion of Sensors and Information Networks
  • Putting fusion of sensors and information network link analysis on a common analytic foundation:
  • Can quantify probability of correctness of results
  • Can leverage existing theory to derive accuracy bounds
  [Figure: sensors (Sensor1–Sensor3) and information network sources (Source1–Source3) linked to Claim1–Claim4, feeding a common fusion result]

  13. Fusion of Sensors and Information Networks
  • (Build of slide 12: the same diagram with the sensors' measurements explicitly labeled as inputs to the fusion result)

  14. Simulation-based Evaluation • Generate thousands of “assertions” (some true, some false – unknown to the fact-finder) • Generate tens of sources (each source has a different probability of being correct – unknown to the fact-finder) • Sources make true/false assertions consistently with their probability of correctness • A link is created between each source and each assertion it makes • Analyze the resulting network to determine: • The set of true and false assertions • The probability that a source is correct • No prior knowledge of individual sources and assertions is assumed
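A minimal version of this setup can be scripted directly. The sketch below uses illustrative parameters (30 sources, 1000 claims, reliabilities drawn from [0.6, 0.95]) and a crude corroboration rule as a stand-in for the paper's Bayesian fact-finder; it is meant only to make the experimental design concrete.

```python
import random

def simulate(n_sources=30, n_claims=1000, claims_per_source=100, seed=1):
    """Generate a synthetic source-claim network.

    truth[c] and reliability[s] are hidden from the fact-finder,
    which only sees the source -> asserted-claim links.
    """
    rng = random.Random(seed)
    truth = [rng.random() < 0.5 for _ in range(n_claims)]
    reliability = [rng.uniform(0.6, 0.95) for _ in range(n_sources)]
    links = []
    for s in range(n_sources):
        asserted = set()
        for c in rng.sample(range(n_claims), claims_per_source):
            correct = rng.random() < reliability[s]
            if truth[c] == correct:  # the source believes claim c is true
                asserted.add(c)      # ...and therefore asserts it
        links.append(asserted)
    return truth, reliability, links

def corroborate(links, n_claims, min_votes=2):
    """Crude baseline: a claim corroborated by at least min_votes sources is
    judged true; a source's estimated correctness is the judged-true
    fraction of its assertions."""
    votes = [0] * n_claims
    for asserted in links:
        for c in asserted:
            votes[c] += 1
    believed = [v >= min_votes for v in votes]
    cred = [sum(believed[c] for c in a) / max(len(a), 1) for a in links]
    return believed, cred

truth, reliability, links = simulate()
believed, cred = corroborate(links, n_claims=1000)
accuracy = sum(b == t for b, t in zip(believed, truth)) / len(truth)
```

Even this crude rule recovers most truth labels; the point of the Bayesian fact-finder is to do so with quantified correctness probabilities rather than ad-hoc vote thresholds.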

  15. Evaluation Results: Comparison to 4 Fact-finders from the Literature
  • Significantly improved prediction accuracy of source correctness probability (from 20% error to 4% error)

  16. Evaluation Results: Comparison to 4 Fact-finders from the Literature
  • (Almost) no false positives for larger networks (> 30 sources)

  17. Evaluation Results: Comparison to 4 Fact-finders from the Literature
  • Below 1% false negatives for larger networks (> 30 sources)

  18. Coming Up: The Apollo FactFinder (Abdelzaher, Adali, Han, Huang, Roth, Szymanski)
  • Apollo improves fusion QoI from noisy human and sensor data
  • Demo in IPSN 2011 (in April)
  • Collects data from cell-phones
  • Interfaced to Twitter
  • Can use sensors and human text
  • Analysis on several data sets: what really happened?
  [Figure: Apollo architecture]
  Reference: H. Khac Le, J. Pasternack, H. Ahmadi, M. Gupta, Y. Sun, T. Abdelzaher, J. Han, D. Roth, B. Szymanski, and S. Adali, "Apollo: Towards Factfinding in Participatory Sensing," demo session at IPSN 2011, the 10th International Conference on Information Processing in Sensor Networks, April 2011, Chicago, IL, USA.

  19. Apollo Datasets
  • Track data from cell-phones in a controlled experiment
  • Tweets on the Japan earthquake, tsunami, and nuclear emergency
  • 2 million tweets from the Egypt unrest

  20. Immediate Extensions • Non-independent sources • Sources that have a common bias, sources where one influences another, etc. • Collaboration opportunities with SCNARC and Trust • Non-independent claims • Claims that cannot be simultaneously true • Claims that increase or decrease each other’s probability • Mixture of reliable and unreliable sources • More reliable sources can help calibrate correctness of less reliable sources

  21. Road Ahead Develop a unifying QoI-assurance theory for fact-finding/fusion from hard and soft sources.
  • Sources: use different media (signals, text, images, …) and feature different authors (physical sensors, humans)
  • Capabilities:
  • Computes accurate best estimates of probabilities of correctness
  • Computes accurate confidence bounds on results
  • Enhances QoI/cost trade-offs in data fusion systems
  • Integrates sensor and information network link analysis into a unified analytic framework for QoI assessment
  • Accounts for data dependencies, constraints, context, and prior knowledge
  • Accounts for the effect of social factors such as trust, influence, and homophily on opinion formation, propagation, and perception (in human sensing)
  • Impact: enhanced warfighter ability to assess information

  22. Collaborations (QoI Task I1.2 at the center)
  • QoI/cost analysis (unified theory for estimation/prediction and information network link analysis)
  • Fusion Task I1.1 (w/ Jiawei Han): consider new link analysis algorithms
  • QoI Mining Task I3.1 (w/ Dan Roth): account for prior knowledge and constraints
  • Community Modeling S2.2 (w/ Aylin Yener): increase OICC
  • Decisions under Stress S3.1, OICC Task C1.2 (w/ Boleslaw Szymanski and Sibel Adali): model humans in the loop
  • Sister QoI Task C1.1 (w/ Ramesh Govindan): improve communication resource efficiency

  23. Collaborations
  Collaborative, multi-institution:
  • Q2 (UIUC+IBM): Tarek Abdelzaher, Dong Wang, Hossein Ahmadi, Jeff Pasternack, Dan Roth, Omid Fetemieh, Hieu Le, and Charu Aggarwal, "On Bayesian Interpretation of Fact-finding in Information Networks," submitted to Fusion 2011
  Collaborative, inter-center:
  • Q2 (I+SC): H. Khac Le, J. Pasternack, H. Ahmadi, M. Gupta, Y. Sun, T. Abdelzaher, J. Han, D. Roth, B. Szymanski, S. Adali, "Apollo: Towards Factfinding in Participatory Sensing," IPSN demo, April 2011
  • Q2 (I+SC): Mani Srivastava, Tarek Abdelzaher, Boleslaw Szymanski, "Human-centric Sensing," Philosophical Transactions of the Royal Society, special issue on wireless sensor networks, expected in 2011 (invited)
  Invited session on QoI at Fusion 2011 (co-chaired with Ramesh Govindan, CNARC)

  24. Military Relevance • Enhanced warfighter decision-making ability based on better quality assessment of fusion outputs • A unified QoI assurance theory for fusion systems that utilize both sensors and information networks • Offers a quantitative understanding of the benefits of exploiting information network links in data fusion • Enhances result accuracy and provides confidence bounds in result correctness
