This research investigates consensus protocols in networked systems, covering cooperative formation and non-formation control, convergence behavior, and a model of binary consensus problems. The study examines the impact of noise, probability distributions, and decision-making strategies on achieving consensus under noisy network communication.
Consensus Problems in Networks
Aman Agarwal, EYES 2007 intern
Advisor: Prof. Mostofi
ECE, University of New Mexico
July 5, 2007
Background
• Cooperative control for multi-agent systems has many applications:
  • Formation control
  • Non-formation cooperative control
• The key issue is shared information.
Consensus Protocols
• Let $x_i$ be the information state of agent $i$.
• Continuous time:
  $\dot{x}_i(t) = \sum_j a_{ij}\,(x_j - x_i)$, i.e. $\dot{x}(t) = -Lx(t)$, so $x(t) = e^{-Lt}x(0)$.
  $\lim_{t\to\infty} e^{-Lt} = \mathbf{1}v^T$, where $v^T\mathbf{1} = 1$ and $v^T L = 0$.
  $x^* = \mathbf{1}v^T x(0)$, where $v$ is the left eigenvector of $L$ corresponding to the eigenvalue 0.
• Discrete time:
  $x_i[k+1] = \sum_j d_{ij}[k]\,x_j[k]$, i.e. $x[k+1] = D[k]\,x[k]$; for constant $D$, $x[k] = D^k x[0]$.
  $\lim_{k\to\infty} D^k = \mathbf{1}v^T$, where $v^T\mathbf{1} = 1$ and $v^T D = v^T$.
  $x^* = \mathbf{1}v^T x(0)$, where $v$ is the left eigenvector of $D$ corresponding to the eigenvalue 1.
• $L$ has an eigenvalue 0, and $D$ has an eigenvalue 1, corresponding to the consensus solution.
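For illustration, a minimal NumPy sketch of the discrete-time protocol (the four-node path graph, step size, and initial state are assumptions of this sketch, not from the slides):

```python
import numpy as np

# Adjacency matrix of a 4-node path graph (an assumed example topology).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A   # graph Laplacian
eps = 0.4                        # step size; needs eps < 1/(max degree)
D = np.eye(4) - eps * L          # row-stochastic update matrix D = I - eps*L

x = np.array([1.0, 3.0, 5.0, 7.0])   # initial information states x(0)
for k in range(200):
    x = D @ x                        # x[k+1] = D x[k]

print(x)  # every entry approaches the same consensus value v^T x(0)
```

For this undirected graph, $D$ is doubly stochastic, so the nodes converge to the plain average of $x(0)$.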
Convergence of Consensus Protocols
• The equilibrium value is a function of the initial state.
• Only agents that can pass information to all the other vehicles have a say in the final value.
• The second-smallest eigenvalue of $L$ (equivalently, the second-largest eigenvalue of $-L$), called the Fiedler eigenvalue $\lambda_2$:
  • Determines the speed of convergence.
  • Dense graphs: $\lambda_2$ is relatively large. Sparse graphs: $\lambda_2$ is relatively small.
• The third-smallest eigenvalue of $L$ should be far from $\lambda_2$ for faster convergence.
• A repeated $\lambda_2$ also affects the speed of convergence; ideally $\lambda_2$ is a simple eigenvalue for fast convergence. (See the sketch below.)
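To make the $\lambda_2$ discussion concrete, a short sketch (NumPy-based; the six-node graphs are assumed examples) comparing the Fiedler eigenvalue of a sparse and a dense graph:

```python
import numpy as np

def fiedler_eigenvalue(A):
    """Second-smallest eigenvalue of the Laplacian of adjacency matrix A."""
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

n = 6
path = np.diag(np.ones(n - 1), 1); path += path.T   # sparse: path graph
full = np.ones((n, n)) - np.eye(n)                  # dense: complete graph

print(fiedler_eigenvalue(path))  # small lambda_2 -> slow convergence
print(fiedler_eigenvalue(full))  # large lambda_2 (= n) -> fast convergence
```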
Binary Consensus Problems
• In most consensus applications, the agents communicate their status wirelessly.
• At the bit level there is receiver noise.
• Since the noise is not bounded, there is no transition point beyond which consensus is guaranteed, so we take a probabilistic approach to characterize and understand the behavior of the network.
• To examine this effect we look at binary consensus problems.
• Assume that the network is fully connected.
• Each node takes a majority poll to assess whether the majority of the nodes are in consensus, and updates its own information state accordingly.
Binary Consensus Problems: Model 1 (noise → decision)
Block diagram: $b_i(k)$ → [channel noise] → $b_{j,i}(k)$ → [majority decision] → $b_j(k+1)$.
• $b_j(k+1) = \mathrm{Dec}\!\left(\frac{1}{M}\sum_i b_{j,i}(k)\right)$, where $\mathrm{Dec}(x) = 1$ if $x \ge 0.5$ and $0$ if $x < 0.5$
  $= \mathrm{Dec}\!\left(\frac{1}{M}\sum_i b_i(k) + \frac{1}{M}\sum_i n_{j,i}(k)\right) = \mathrm{Dec}\!\left(\frac{S(k)}{M} + w_j(k)\right)$
• $S(k) = \sum_i b_i(k)$: state of the system at time $k$.
• $\pi_i(k) = \Pr[S(k) = i]$; $\pi(k) = [\pi_0(k)\ \pi_1(k)\ \cdots\ \pi_M(k)]$: probability vector.
• $P_{ij} = \Pr[S(k+1) = j \mid S(k) = i] = \binom{M}{j} k_i^{\,j} (1-k_i)^{M-j}$, where $k_i = \Pr[\,i/M + w_j(k) \ge 0.5 \mid S(k) = i\,]$.
• $\pi(k+1) = P^T \pi(k)$ with $P = [P_{ij}]$, so $\pi(k) = (P^T)^k \pi(0)$ gives the asymptotic behavior of the probabilities.
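A sketch of the Model 1 transition-matrix computation (assuming i.i.d. $N(0,\sigma^2)$ receiver noise per link, so the averaged noise is $w \sim N(0, \sigma^2/M)$; uses NumPy/SciPy, and the variable names are mine):

```python
import numpy as np
from scipy.stats import norm, binom

def model1_transition_matrix(M, sigma):
    """P[i, j] = P[S(k+1) = j | S(k) = i] for Model 1."""
    P = np.zeros((M + 1, M + 1))
    for i in range(M + 1):
        # k_i = P[i/M + w >= 0.5], where w = (1/M) * sum of M N(0, sigma^2) terms
        k_i = norm.sf(0.5 - i / M, scale=sigma / np.sqrt(M))
        P[i, :] = binom.pmf(np.arange(M + 1), M, k_i)
    return P

M, sigma = 4, 0.5
P = model1_transition_matrix(M, sigma)
pi = np.zeros(M + 1); pi[3] = 1.0   # x(0) = [0 1 1 1] means S(0) = 3
for k in range(50):
    pi = P.T @ pi                   # pi(k+1) = P^T pi(k)
print(pi)                           # asymptotic state probabilities
```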
Model 1: M = 4 and x(0) = [0 1 1 1].
[Figure: probability plots for Model 1 with σ = 0.5, 0.75, 1, and 2.]
Binary Consensus Problems: Models 2(a) & 2(b) (noise → noise filtering → decision)
Block diagram: $b_i(k)$ → [channel noise] → $b_{j,i}(k)$ → [threshold] → $b_{j,i}^D(k)$ → [majority decision] → $b_j(k+1)$.
• $b_{j,i}^D(k) = \mathrm{Dec}_{th}(b_{j,i}(k))$, where $\mathrm{Dec}_{th}(x) = 1$ if $x \ge th$ (normally 0.5) and $0$ if $x < th$.
• $b_j(k+1) = \mathrm{Dec}\!\left(\frac{1}{M}\sum_i b_{j,i}^D(k)\right) = \mathrm{Dec}\!\left(\frac{1}{M}\sum_i \mathrm{Dec}_{th}(b_{j,i}(k))\right)$
• $P_{ij} = \Pr[S(k+1) = j \mid S(k) = i] = \binom{M}{j} k_i^{\,j} (1-k_i)^{M-j}$, where
  $k_i = \Pr\!\left[\frac{1}{M}\sum_i \mathrm{Dec}_{th}(b_{j,i}(k)) \ge 0.5 \,\middle|\, S(k) = i\right] = \sum_{l \ge M/2} \binom{M}{l} \Pr[b_{j,i}^D(k)=1]^{\,l}\,\Pr[b_{j,i}^D(k)=0]^{\,M-l}$
• $\Pr[b_{j,i}^D(k)=1] = \Pr[b_{j,i}^D(k)=1 \mid b_i(k)=1]\Pr[b_i(k)=1] + \Pr[b_{j,i}^D(k)=1 \mid b_i(k)=0]\Pr[b_i(k)=0] = \frac{i}{M} + Q(0.5/\sigma)\left(1 - \frac{2i}{M}\right)$
• $\Pr[b_{j,i}^D(k)=0] = 1 - \frac{i}{M} - Q(0.5/\sigma)\left(1 - \frac{2i}{M}\right)$
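A sketch of the Model 2(a) computation (assuming the $Q$-function expressions above with threshold 0.5 and a majority at $l \ge M/2$; SciPy-based, names are mine):

```python
import numpy as np
from scipy.stats import norm, binom

def model2a_transition_matrix(M, sigma, th=0.5):
    """Transition matrix when each received bit is thresholded before the vote."""
    Q = norm.sf(th / sigma)               # Q(th/sigma): P[noise alone exceeds th]
    P = np.zeros((M + 1, M + 1))
    for i in range(M + 1):
        p1 = i / M + Q * (1 - 2 * i / M)  # P[thresholded bit = 1 | S(k) = i]
        l_min = int(np.ceil(M / 2))
        # k_i = P[at least M/2 of the M thresholded bits equal 1]
        k_i = binom.sf(l_min - 1, M, p1)
        P[i, :] = binom.pmf(np.arange(M + 1), M, k_i)
    return P

print(model2a_transition_matrix(4, 0.75).round(3))
```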
Model 2(a): The noise is filtered first by thresholding the received values at a threshold level of 0.5, so that the majority decision is made on the cleaned data only. M = 4 and x(0) = [0 1 1 1].
[Figure: probability plots for Model 2(a) with σ = 0.5, 0.75, 1, and 2.]
Model 2(b): In this case the threshold for the communication noise is chosen dynamically: the values the nodes are sending are monitored, and the threshold is updated based on the differential probabilities of sending a 1 or a 0 (one way to realize such a threshold is sketched below).
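The slides do not give the update rule, but one standard realization of a probability-driven threshold is the MAP decision boundary between the two bit hypotheses; this sketch is an assumption of mine, not the authors' exact scheme:

```python
import numpy as np

def dynamic_threshold(p1, sigma):
    """MAP threshold separating N(1, sigma^2) ('1' sent) from N(0, sigma^2)
    ('0' sent), given the current estimate p1 = P[a node sends 1].
    A hypothetical realization of the slide's dynamic-threshold idea."""
    p1 = np.clip(p1, 1e-6, 1 - 1e-6)              # guard against log(0)
    return 0.5 + sigma**2 * np.log((1 - p1) / p1)

print(dynamic_threshold(0.75, 0.5))  # p1 > 0.5 pulls the threshold below 0.5
```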
Model 2(b): M = 4 and x(0) = [0 1 1 1].
[Figure: probability plots for Model 2(b) with σ = 0.5, 0.75, and 1.]
Binary Consensus Problems: Model 3 (noise → noise filtering → soft information → decision)
Block diagram: $b_i(k)$ → [channel noise] → $b_{j,i}(k)$ → [threshold] → $b_{j,i}^D(k)$ → [soft information] → $E[b_i(k) \mid b_{j,i}(k)]$ → [majority decision] → $b_j(k+1)$.
• $b_{j,i}^D(k) = \mathrm{Dec}_{th}(b_{j,i}(k))$, with $\mathrm{Dec}_{th}$ as in Model 2 ($th$ normally 0.5).
• $b_j(k+1) = \mathrm{Dec}\!\left(\frac{1}{M}\sum_i E[\,b_i(k) \mid b_{j,i}(k)\,]\right)$, where $\mathrm{Dec}(x) = 1$ if $x \ge 0.5$ and $0$ if $x < 0.5$, and
  $E[\,b_i(k) \mid b_{j,i}(k)\,] = \dfrac{f(b_{j,i}(k) - 1)\,\Pr[b_i(k)=1]}{f(b_{j,i}(k) - 1)\,\Pr[b_i(k)=1] + f(b_{j,i}(k))\,\Pr[b_i(k)=0]}$,
  with $f(x)$ the pdf of $N(0, \sigma^2)$.
• $P_{ij} = \Pr[S(k+1) = j \mid S(k) = i]$: finding the transition probability becomes very tedious and complex in this case, so we simulate it and estimate the probability statistically from a large number of samples (at least 1000).
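Since the Model 3 transition probabilities are computed by simulation, here is a minimal Monte Carlo sketch (assuming the prior $\Pr[b_i(k)=1] = i/M$, consistent with the earlier models; NumPy/SciPy, names are mine):

```python
import numpy as np
from scipy.stats import norm

def model3_k(i, M, sigma, samples=1000):
    """Monte Carlo estimate of k_i = P[a node decides 1 | S(k) = i] for Model 3."""
    rng = np.random.default_rng(0)
    p1 = i / M                                   # prior P[b_i(k) = 1]
    hits = 0
    for _ in range(samples):
        b = np.zeros(M); b[:i] = 1               # i of the M nodes hold a 1
        y = b + rng.normal(0, sigma, M)          # noisy receptions b_{j,i}(k)
        num = norm.pdf(y - 1, scale=sigma) * p1  # f(y - 1) * P[b = 1]
        den = num + norm.pdf(y, scale=sigma) * (1 - p1)
        soft = num / den                         # E[b_i(k) | b_{j,i}(k)]
        hits += soft.mean() >= 0.5               # Dec(average of soft info)
    return hits / samples

print(model3_k(3, 4, 1.0))
```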
Model 3: M = 4 and x(0) = [0 1 1 1].
[Figure: probability plots for Model 3 with σ = 0.5, 0.75, 1, and 2.]
Comparison of Models
• Model 1: performance degrades sharply for larger noise variances (σ > 0.5).
• Model 2(a): better than Model 1, but cannot handle large noise variances (σ > 1).
• Model 2(b): better than Model 2(a). The dynamic threshold works, but only if σ < 1, because for larger noise a threshold between 0 and 1 will not work.
• Model 3: very robust and performs even with large noise (σ > 1), but we trade off speed of convergence for the ability to handle larger noise.
Detection & Estimation
• In a group of nodes where each node has limited sensing capability, every node relies on the group to improve its estimation/detection quality.
• Estimation: each agent has an estimate of the parameter of interest, which can take values over an infinite set or a known finite set.
• Detection: the parameter of interest takes values from a known finite set.
Binary Detection (sensing noise → noise filtering → decision; communication noise → noise filtering → decision)
Block diagram: $S$ → [sensing noise] → $S_j(k)$ → [threshold] → $\hat{S}_j(k)$; opinions $O_j(k)$ → [communication noise] → $O_{j,i}(k)$ → [threshold] → $O_i^D(k)$ → $O_j(k+1)$.
For k ≥ 1:
• $\hat{S}_j(k)$: event sensed at time $k$.
• $O_j(k)$: opinion formed at time $k$.
• $O_{j,i}(k) = O_i(k) + n_{j,i}$, with $n_{j,i} \sim N(0, \sigma_n^2)$.
• $O_i^D(k) = \mathrm{Dec}(O_{j,i}(k))$, where $\mathrm{Dec}(x) = 1$ if $x \ge 0.5$ and $0$ if $x < 0.5$.
• $O_j(k+1) = \mathrm{Dec}\!\left(\dfrac{\sum_{i \ne j} O_i^D(k) + O_j(k) + \hat{S}_j(k)}{M+1}\right)$
• Parameters: $\sigma_n = 0.5$, $\sigma_s = 1$, and $S = 1$.
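A rough simulation of this update rule (a sketch under the stated parameters σ_n = 0.5, σ_s = 1, S = 1; the network size and thresholding of the sensed value are my assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
M, sigma_n, sigma_s, S = 8, 0.5, 1.0, 1.0
dec = lambda x: (x >= 0.5).astype(float)     # Dec(x): threshold at 0.5

O = np.zeros(M)                              # initial opinions O_j(0)
for k in range(20):
    S_hat = dec(S + rng.normal(0, sigma_s, M))           # sense, then threshold
    O_rx = O[None, :] + rng.normal(0, sigma_n, (M, M))   # O_{j,i}: i's opinion at node j
    O_D = dec(O_rx)                                      # thresholded receptions
    # Each node averages the M-1 other bits, its own opinion, and its sensed bit.
    others = O_D.sum(axis=1) - O_D.diagonal()
    O = dec((others + O + S_hat) / (M + 1))
print(O)  # opinions should converge toward the true event S = 1
```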
Binary Detection
• Every node has M + 1 different values to weigh at every step.
• Nodes with better communication or better sensing can be weighted differently.
• Define a trust factor for each node.
• Trust factors can be either time invariant or time variant.
• They should update themselves over time.
Binary Detection: Trust Factors
One way of implementing this is with different weights in place of the plain average consensus, as sketched below.
[Figure: different weights vs. average consensus, showing how nodes with good sensing and good communication affect the consensus; x(0) = [0 0 0 0 0 1 1 1].]
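One hypothetical way to realize the trust factors (my illustration, not the slides' exact scheme): replace the uniform majority with a trust-weighted majority, so reliable nodes carry more influence:

```python
import numpy as np

rng = np.random.default_rng(2)
M = 8
x = np.array([0, 0, 0, 0, 0, 1, 1, 1], dtype=float)  # x(0) from the slide
trust = np.ones(M)
trust[5:] = 3.0   # assumed: the last three nodes sense/communicate well

for k in range(10):
    rx = x[None, :] + rng.normal(0, 0.3, (M, M))      # noisy receptions
    bits = (rx >= 0.5).astype(float)                  # threshold each reception
    # Trust-weighted majority instead of a plain average:
    x = ((bits * trust).sum(axis=1) / trust.sum() >= 0.5).astype(float)
print(x)
```

With these weights the trusted minority holding a 1 outweighs the five untrusted nodes, so the network settles on 1; with uniform trust the same initial state would settle on 0.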