


Presentation Transcript


  1. Distributed Inference in the Presence of Byzantines. Pramod K. Varshney, Distinguished Professor, EECS; Director of CASE: Center for Advanced Systems and Engineering, Syracuse University. E-mail: varshney@syr.edu. In collaboration with K. Agrawal, P. Anand, A. Rawat, B. Kailkhura, V. S. S. Nadendla, A. Vempaty, S. K. Brahma, H. Chen, Y. S. Han, O. Ozdemir.

  2. Signal Processing, Communications & Control @ EECS Department, Syracuse University: Pramod K. Varshney, Mustafa Cenk Gursoy, Yingbin Liang, Biao Chen, Jian Tang, Senem Velipasalar, Makan Fardad, Wenliang Du

  3. Sensor Fusion Lab

  4. Current Topics of Interest • Distributed Inference • Detection, Estimation, Classification, Tracking • Fusion for Heterogeneous Sensor Networks • Modeling (Dependent sensors using copula theory) • Sensor Management (Traditional/Game-theoretic designs) • Compressed Inference • Stochastic Resonance Applications • Cognitive Radio Networks • Security for Spectrum Sensing • Spectrum Auctions • Reliable Crowdsourcing • Ecological monitoring • Acoustic monitoring of wildlife in national forest reserves • Medical Image Processing

  5. Outline • Distributed Inference and Data Fusion • Byzantine Attacks • Distributed Inference with Byzantines • Ongoing Research and Future Work

  6. Distributed Inference in Practice

  7. Different Sensors, Diverse Information VIDEO THZ IMAGING ACOUSTIC SEISMIC THROUGH-THE-WALL

  8. Multi-sensor Inference: Information Fusion • Typical decision-making processes involve combining information from various sources. • Designing an automatic system to do this is a challenging task. • There are many benefits from such a system.

  9. Six Blind Men and an Elephant It was six men of Indostan / To learning much inclined, / Who went to see the Elephant / (Though all of them were blind), / That each by observation / Might satisfy his mind. / The First approached the Elephant, / And happening to fall / Against his broad and sturdy side, / At once began to bawl: / "God bless me! but the Elephant / Is very like a wall!" / … / And so these men of Indostan / Disputed loud and long, / Each in his own opinion / Exceeding stiff and strong, / Though each was partly in the right, / And all were in the wrong! - John Godfrey Saxe

  10. Inference Network [Figure: a phenomenon observed by sensors S-1 … S-N, which produce observations y1 … yN and send messages u1 … uN to a Fusion Center that outputs u0.] • Sensors collect raw observations and transmit processed observations to the fusion center. • The fusion center makes global inferences based on the sensor messages. • Inferences: detection, estimation, classification.

  11. Typical Inference Problems and Applications • Detection. Example: spectrum sensing in cognitive radio networks, where secondary users (SUs) sense a primary user (PU) and report to a fusion center. • Estimation. Example: state estimation in smart grids.

  12. Centralized vs. Distributed Inference [Figure: two copies of the inference network; sensors S-1 … S-N observe y1 … yN and send decisions u1 … uN to the fusion center, which forms the global decision u0.] • Centralized inference: all the sensor signals are assumed to be available in one place for processing, where the decision is based on a likelihood ratio test (LRT). • Distributed inference: processing is distributed; each detector acts on its own observations and sends a local decision, and the decision rules, both at the local sensors and at the fusion center, are based on system-wide joint optimization of the local decisions and the global decision.

  13. Distributed Detection System Design [Figure: local sensor i maps observation yi to a one-bit decision ui; the data fusion center combines u1, …, uN into u0.] • Consider binary quantizers at the local sensors. • Requires the joint design of the local detectors and the fusion rule according to some optimization criterion. • NP-hard, in general.

  14. Design of Decision Rules The crux of the distributed hypothesis testing problem is to derive decision rules of the form ui = 0 if detector i decides H0, ui = 1 if detector i decides H1; and at the fusion center, u0 = 0 if H0 is decided, u0 = 1 otherwise. A local decision rule can be defined by the conditional probability distribution P(ui = 1 | yi). The fusion rule at the FC is a logical function with N binary inputs and one binary output. Number of possible fusion rules: 2^(2^N).

  15. Possible Fusion Rules for Two Binary Inputs

      Input   Output u0
      u1 u2 | f1 f2 f3 f4 f5 f6 f7 f8 f9 f10 f11 f12 f13 f14 f15 f16
      ------+---------------------------------------------------------
      0  0  |  0  0  0  0  0  0  0  0  1   1   1   1   1   1   1   1
      0  1  |  0  0  0  0  1  1  1  1  0   0   0   0   1   1   1   1
      1  0  |  0  0  1  1  0  0  1  1  0   0   1   1   0   0   1   1
      1  1  |  0  1  0  1  0  1  0  1  0   1   0   1   0   1   0   1

  (For example, f2 is the AND rule and f8 is the OR rule.)
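The counting behind this table can be sketched in a few lines of Python (an illustrative sketch; the variable names and the OR example are my own, not from the slides):

```python
from itertools import product

N = 2                                          # number of local binary decisions
patterns = list(product([0, 1], repeat=N))     # the 2^N input patterns
# A fusion rule assigns one output bit to every input pattern,
# so there are 2^(2^N) distinct rules: 16 when N = 2.
rules = list(product([0, 1], repeat=len(patterns)))
print(len(rules))  # -> 16

# f8 in the table is the OR rule: decide H1 if any u_i = 1.
or_rule = {p: int(any(p)) for p in patterns}
```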

  16. Decision Criteria • In the binary hypothesis testing problem, we know that either H0 or H1 is true. Each time the experiment is conducted, one of four outcomes can occur: deciding H0 or H1 when H0 is true, and deciding H0 or H1 when H1 is true.

  17. Types of Errors in Detection • False alarm (Type I error): deciding H1 when H0 is true. • Missed detection (Type II error): deciding H0 when H1 is true.

  18. Approaches for Signal Detection • Bayesian Framework • Neyman-Pearson Framework

  19. Bayesian Framework [Figure: inference network with sensors S-1 … S-N sending decisions u1 … uN to the fusion center, which outputs u0.] • Let P0 = P(H0) and P1 = P(H1) denote the prior probabilities. • Bayes risk: the total average cost of making decisions, R = Σi Σj Cij P(decide Hi | Hj true) P(Hj), where Cij is the cost of deciding Hi when Hj is true. • Then, the optimal fusion rule is given by the MAP (maximum a posteriori probability) rule.

  20. Bayesian Framework (cont.) • Let C10 = C01 = 1, C11 = C00 = 0 (the minimum probability of error criterion). • Then, for a given set of sensor quantizers, the MAP rule reduces to a likelihood ratio test on the received decision vector. • For identical sensors, the fusion rule simplifies to a "K out of N" rule: decide H1 if at least K of the N sensors report 1. • If {y1, …, yN} are conditionally independent, then the optimal sensor decision rules are likelihood-ratio tests.
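The "K out of N" fusion rule admits a simple closed-form performance expression via the binomial tail; here is an illustrative sketch (the values of N, K, pd, and pf are my own, not from the slides):

```python
from math import comb

def k_out_of_n(p, N, K):
    """P(at least K of N independent sensors report 1), each reporting 1 w.p. p."""
    return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(K, N + 1))

N, K = 10, 6               # illustrative network size and fusion threshold
pd, pf = 0.8, 0.1          # per-sensor detection and false-alarm probabilities
PD = k_out_of_n(pd, N, K)  # global probability of detection
PF = k_out_of_n(pf, N, K)  # global probability of false alarm
```

Sweeping K trades off PD against PF, which is how the fusion threshold is chosen in practice.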

  21. Neyman-Pearson Framework • Maximize the probability of detection PD subject to a constraint on the probability of false alarm: max PD s.t. PF ≤ α. • Under the conditional independence assumption, the optimal local sensor decision rules are likelihood-ratio tests. • The optimal fusion rule is again a likelihood ratio test, with the threshold chosen so that the false-alarm constraint is satisfied.

  22. Asymptotic Results • It has been shown that the use of identical thresholds is asymptotically optimal. • Asymptotic performance measures: • N-P setup: Kullback-Leibler distance (KLD) • Bayesian setup: Chernoff information

  23. Asymptotic Results (cont.) • Stein's Lemma: if u is a random vector having N statistically independent and identically distributed components under both hypotheses, the optimal (likelihood ratio) detector yields an error probability that decays exponentially in N, with the Kullback-Leibler distance between the per-sensor distributions under the two hypotheses as the error exponent.
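For one-bit sensor decisions the per-sensor distributions are Bernoulli, so the error exponent is easy to evaluate; a brief sketch (the operating points p0, p1 and N = 100 are illustrative assumptions, not from the slides):

```python
from math import log, exp

def kld_bernoulli(p, q):
    """Kullback-Leibler distance D(Ber(p) || Ber(q)) in nats."""
    return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

# Per Stein's lemma, with N i.i.d. one-bit decisions the error probability
# of the optimal (likelihood ratio) detector decays roughly as exp(-N * D).
p0, p1 = 0.1, 0.8                 # P(u = 1) under H0 and H1 (illustrative)
D = kld_bernoulli(p0, p1)
approx_error = exp(-100 * D)      # rough decay with N = 100 sensors
```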

  24. Security Threats on Distributed Inference • Attacks are of the following types: • Threats from external sources • Threats from within • Attacks can impact multiple layers simultaneously [Burbank, 2008].

  25. Byzantine Attacks • Malicious nodes – attack from within. • Byzantines send false information to the fusion center (FC). • Impact on performance of • PHY (Spectrum Sensing) • MAC (Spectrum Management & Handoff) • NET (Power Control & Routing) A. Vempaty, L. Tong, and P. K. Varshney, "Distributed Inference with Byzantine Data: State-of-the-Art Review on Data Falsification Attacks," IEEE Signal Process. Mag., vol. 30, no. 5, pp. 65-75, Sept. 2013

  26. Distributed Detection with Byzantines (Parallel Topology) • Distributed detection in the presence of Byzantine attacks: an instance of distributed inference. A. S. Rawat, P. Anand, H. Chen, P. K. Varshney, "Collaborative Spectrum Sensing in the Presence of Byzantine Attacks in Cognitive Radio Networks," IEEE Trans. Signal Process., vol. 59, no. 2, pp. 774-786, Feb. 2011. A. Vempaty, K. Agrawal, H. Chen, and P. K. Varshney, "Adaptive learning of Byzantines' behavior in cooperative spectrum sensing," in Proc. IEEE Wireless Comm. and Networking Conf. (WCNC), Cancun, Mexico, Mar. 2011, pp. 1310-1315.

  27. System Model • Let N be the number of nodes. • The fraction of Byzantine attackers is α. • Nodes decide on the presence of the primary transmitter and send a one-bit decision u to the FC. • Operating points of the nodes: (PFB, PDB) for Byzantines and (PFH, PDH) for honest nodes. [Fig.: a cognitive radio network (CRN), an example of such detection problems.]

  28. Performance Metrics • Let vi be the true local decision of node i and ui the local message it transmits to the FC. • For error-free channels, the FC receives z = [u1, …, uN]. • Performance metric: the KL distance between the distributions of the received data under the two hypotheses. • Byzantines minimize the KLD by flipping their local decisions.

  29. Binary Hypothesis Testing at the FC • H0: signal is absent. • H1: signal is present. • Byzantines degrade the performance by flipping their local decisions according to their flipping probabilities.

  30. Questions to be investigated… • What is the minimum fraction of Byzantine nodes, αblind, required to blind the FC? • What are the optimal flipping probabilities at the Byzantines to cause maximal damage to the performance at FC? • If modeled as a minimax game, what are the Nash equilibrium strategies? • How can we mitigate the impact of the Byzantine nodes? • Given a mitigation scheme, how will the Byzantine node behave if it does not want to be detected?

  31. Distributed Detection in the Presence of Byzantine Attacks • What is the minimum fraction, αb, of Byzantines needed to totally blind the fusion center? • If the Byzantine attacks are independent of each other, αb = 0.5 (with PDB = PDH, PFB = PFH); if they cooperate, αb < 0.5.
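The blinding result can be seen numerically: as the Byzantine fraction α grows, the KLD between the FC's observed distributions shrinks, vanishing at α = 0.5. An illustrative sketch (the honest operating point pd, pf is my assumption, not from the slides; Byzantines are modeled as deterministic bit-flippers with the same sensing quality):

```python
from math import log

def kld(p, q):
    """D(Ber(p) || Ber(q)) in nats."""
    return p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))

pd, pf = 0.8, 0.1    # honest operating point (illustrative values)
for alpha in [0.0, 0.25, 0.4, 0.5]:
    # Byzantines keep the same sensing quality but flip their reported bit,
    # so the FC observes a mixture of honest and flipped decisions.
    p1 = (1 - alpha) * pd + alpha * (1 - pd)   # P(u = 1 | H1) seen at the FC
    p0 = (1 - alpha) * pf + alpha * (1 - pf)   # P(u = 1 | H0) seen at the FC
    print(alpha, kld(p1, p0))
# The KLD falls to exactly 0 at alpha = 0.5: the FC is blind.
```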

  32. Minimax Game Formulation • Sensors/FC are built upon an intelligent platform with the capability of changing their parameters. • The Byzantines choose their thresholds to cause the maximum damage, no matter what strategy the FC chooses. • The FC chooses its parameters to minimize the worst-case damage by the Byzantines, no matter what strategy they choose. • This is a zero-sum game: the best strategy for both players is to operate at the saddle point.

  33. Minimax Approach

  34. Performance Metric: KLD

  35. Reputation-based Byzantine Identification and Removal from the Fusion Rule • A reputation index ni is defined for each CR i over a time window T, e.g., by counting how often node i's decision disagrees with the global decision. • If this index reaches a threshold η, then the fusion center does not consider the node's decisions in the subsequent stages of fusion.
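A minimal simulation of this identify-and-remove scheme, under simplifying assumptions of my own (the global decision u0 is taken as correct, and Byzantines always flip; the window T, threshold η, and node counts are illustrative):

```python
import random
random.seed(0)

N, T, eta = 10, 20, 12        # nodes, time window, removal threshold (illustrative)
byzantine = {0, 1, 2}         # assume nodes 0-2 always flip their decisions

mismatch = [0] * N            # reputation index: disagreements with u0
for t in range(T):
    u0 = random.randint(0, 1)               # global decision, assumed correct here
    for i in range(N):
        u = 1 - u0 if i in byzantine else u0
        mismatch[i] += (u != u0)            # accumulate disagreements over the window

removed = [i for i in range(N) if mismatch[i] >= eta]
print(removed)  # -> [0, 1, 2]
```

With probabilistic flipping the separation is noisier, and η trades off false exclusion of honest nodes against missed Byzantines.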

  36. Adaptive Distributed Detection by Learning Byzantines’ Behavior • Identify Byzantines, learn their behavior, and use this information to improve the global performance. • Three-tier system • Local processing of data at each node for transmission to the FC. • Byzantine identification and estimation of their parameters at the FC. • Adaptive fusion rule. Note: Byzantine Parameters can be learnt for any fraction of Byzantine nodes.

  37. Estimation of Probabilities • To learn the behavior of the Byzantines, the FC estimates the Byzantines' operating point. • The idea is to compare the behavior of the nodes with the expected behavior of honest nodes to form these estimates. • The final decision can then be made by using the estimated probabilities in the Chair-Varshney rule. • It can be shown that these estimates converge to the true values as the number of observations grows.
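The comparison idea can be sketched for a single node: the observed report rate is a mixture of honest behavior and flips, and inverting the mixture recovers the flipping probability. All numbers below (pd, p_flip, window length T) are illustrative assumptions, not from the slides:

```python
import random
random.seed(1)

T = 10_000        # observation window
pd = 0.8          # honest P(u = 1 | H1), assumed known at the FC
p_flip = 0.3      # Byzantine flipping probability, unknown to the FC

# Simulate one Byzantine node's reports while H1 is true.
reports = []
for _ in range(T):
    u = 1 if random.random() < pd else 0    # honest sensing decision
    if random.random() < p_flip:
        u = 1 - u                           # Byzantine flip
    reports.append(u)

# Observed rate is the mixture q = (1 - p_flip)*pd + p_flip*(1 - pd);
# inverting it gives an estimate of the flipping probability.
q_hat = sum(reports) / T
p_flip_hat = (pd - q_hat) / (2 * pd - 1)
```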

  38. Distributed Detection with Byzantines (Tree Topology) • Distributed detection in tree topologies with Byzantines. B. Kailkhura, S. Brahma, Y. S. Han, and P. K. Varshney, "Distributed Detection in Tree Topologies with Byzantines," IEEE Trans. Signal Process., vol. 62, no. 12, pp. 3208-3219, June 2014.

  39. Network Architecture: Tree Topology

  40. Distributed Detection: N-P Setup • Research problem: given the performance of both the honest nodes and the Byzantines, what is the condition on the attack configuration that totally blinds the fusion center?

  41. Distributed Detection in Tree Topologies with Byzantines • Challenges: the attack is more severe, since multiple attack configurations {Bk} can blind the FC.

  42. The Distributed Estimation Problem Requires Design of an ESTIMATOR

  43. Example: Distributed Localization • Parameter of interest: the location vector θ. • The MSE is intractable: analyze bounds on the MSE instead. • Cramér-Rao bound: the MSE of any unbiased estimator is bounded below by the inverse of the Fisher information matrix J(θ). • Example estimator: the maximum likelihood (ML) estimator.
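As a one-dimensional warm-up for the Cramér-Rao bound (my own toy setup, not the localization model from the slides): for N i.i.d. Gaussian observations of an unknown mean, the Fisher information is N/σ², the CRLB is σ²/N, and the ML estimator (the sample mean) attains it:

```python
import random
random.seed(3)

# Toy model: y_i = theta + n_i with n_i ~ N(0, sigma^2), i = 1..N.
theta, sigma, N = 5.0, 2.0, 100
trials = 1000

sq_errors = []
for _ in range(trials):
    samples = [random.gauss(theta, sigma) for _ in range(N)]
    ml_est = sum(samples) / N           # ML estimate: the sample mean
    sq_errors.append((ml_est - theta) ** 2)

mse = sum(sq_errors) / trials           # empirical MSE over Monte Carlo trials
crlb = sigma**2 / N                     # Cramer-Rao lower bound = 0.04
```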

  44. Distributed Estimation with Byzantines • Target Localization in Sensor Networks with Quantized Data in the Presence of Byzantine Attacks A. Vempaty, O. Ozdemir, K. Agrawal, H. Chen, and P. K. Varshney, "Localization in Wireless Sensor Networks: Byzantines and Mitigation Techniques," IEEE Trans. Signal Process., vol. 61, no. 6, pp. 1495-1508, Mar. 15, 2013.

  45. Problem Formulation • Let N sensors be randomly deployed (not necessarily in a regular grid). • Estimate the unknown target location θ = (xt, yt), where xt and yt denote the coordinates of the target. • The signal amplitude at the ith sensor decays with its distance from the target, and the received signal is this amplitude plus noise. • Sensors send quantized data to the FC.

  46. Problem Formulation • Assumptions: ideal channels between sensors and FC; identical sensor quantizers. • Minimum mean square error (MMSE) estimation, where u = [u1, …, uN] is the received observation vector. • Performance metrics: PCRLB, posterior FIM. • Monte Carlo based target localization.

  47. Attack Model • An α fraction of the nodes in the network are Byzantine. • An honest node transmits its true quantized measurement, ui = Di. • Byzantines flip their quantized binary measurements with probability p.
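This attack model is easy to simulate; a short sketch (α, p, and N are illustrative values of my own, and the honest bits are drawn uniformly just for demonstration):

```python
import random
random.seed(2)

alpha, p = 0.3, 0.8     # Byzantine fraction and flipping probability (illustrative)
N = 1000
true_bits = [random.randint(0, 1) for _ in range(N)]    # honest quantized data D_i

received = []
for i, d in enumerate(true_bits):
    if i < alpha * N and random.random() < p:
        received.append(1 - d)      # a Byzantine node flips its bit w.p. p
    else:
        received.append(d)          # otherwise the bit D_i arrives unchanged

# The expected fraction of corrupted bits at the FC is alpha * p.
flipped = sum(r != d for r, d in zip(received, true_bits)) / N
```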

  48. Blinding the FC • Blinding: making the FC incapable of using the data from the local sensors to estimate the target location. • The FC is blind when the data's contribution to the posterior Fisher information matrix approaches zero; this occurs when the fraction of flipped bits αp reaches 1/2, for example α = 0.5 with p = 1.
