SIGNALING GAMES: Dynamics and Learning NASSLLI 2016 Tuesday
“In the beginning was information. The word came later.” Fred Dretske Knowledge and the Flow of Information
My gloss in Signals (2010) “Dretske was calling for a reorientation in epistemology. He did not think that epistemologists should spend their time on little puzzles or on rehashing ancient arguments about skepticism. Rather, he held that epistemology would be better served by studying the flow of information.”
Sender-Receiver Games • Nature picks a state with some probability • Sender picks a signal with probability conditional on the state observed • Receiver picks an act with probability conditional on the signal received • At any state of the system, equilibrium or not, there is a well-defined joint distribution over state, signal, and act. There are straightforward generalizations to signaling networks.
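The joint distribution mentioned above can be sketched directly: multiply nature's prior, the sender's conditional signal probabilities, and the receiver's conditional act probabilities. All the numbers below are illustrative assumptions, not taken from the lecture.

```python
# A minimal sketch of the joint distribution in a 2-state, 2-signal, 2-act
# sender-receiver game. All probabilities here are made-up illustrations.
prior = {"s1": 0.5, "s2": 0.5}                       # nature: pr(state)
sender = {"s1": {"m1": 0.9, "m2": 0.1},              # pr(signal | state)
          "s2": {"m1": 0.2, "m2": 0.8}}
receiver = {"m1": {"a1": 1.0, "a2": 0.0},            # pr(act | signal)
            "m2": {"a1": 0.0, "a2": 1.0}}

# Joint distribution on (state, signal, act), well-defined at any state
# of the system, equilibrium or not:
joint = {(s, m, a): prior[s] * sender[s][m] * receiver[m][a]
         for s in prior for m in sender[s] for a in receiver[m]}

assert abs(sum(joint.values()) - 1.0) < 1e-12        # a proper distribution
```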
How to measure information about a state? • Key quantity: pr_sig(state)/pr(state) (The numerator is the probability of the state conditional on the signal.)
If the signal tells us nothing, the information should be zero: log[pr_sig(state)/pr(state)] is zero exactly when the signal leaves the probability of the state unchanged. Aczél and Daróczy (1975) On Measures of Information and Their Characterizations
Informational Content of a Signal • Informational content is a vector. • Quantity of information is a scalar.
Quantity of information in a signal about the states: I_states(signal) = ∑_i pr_sig(state_i) log[pr_sig(state_i)/pr(state_i)]
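This quantity is easy to compute. A minimal sketch (the function name and the probability vectors are my own illustrations), using log base 2 so the answer comes out in bits:

```python
import math

def info_about_states(prior, posterior, base=2):
    """Quantity of information a signal carries about the states:
    the Kullback-Leibler divergence of posterior from prior.

    posterior[i] is pr_sig(state_i), the probability of state i given
    the signal; prior[i] is pr(state_i). Terms with posterior 0 are
    dropped, following the usual 0 * log 0 = 0 convention."""
    return sum(q * math.log(q / p, base)
               for q, p in zip(posterior, prior) if q > 0)

# A signal that leaves the probabilities unchanged carries zero information:
print(info_about_states([0.6, 0.4], [0.6, 0.4]))  # 0.0

# A signal sent only in state 1, of two equiprobable states, carries 1 bit:
print(info_about_states([0.5, 0.5], [1.0, 0.0]))  # 1.0
```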
This is the Kullback-Leibler Divergence Solomon Kullback Kullback and Leibler 1951, Kullback 1959, Lindley 1956
[Figure 3.2: Information as a function of the probability of state 1 given the signal; states initially have probabilities .6, .4]
Extension to a little Network • Nature chooses one of four states by independently flipping two fair coins. Coin 1 determines up or down, let us say, and coin 2 determines left or right. The four states, up-left etc., are equiprobable. There are now two senders. Sender 1 can only observe whether nature has chosen up or down; sender 2 observes whether it is left or right. Each sends one of two signals to the receiver [(R,G),(B,Y)]. •→•←•
Suppose the senders have deterministic strategies: Sender 1: Up => Red, Down => Green; Sender 2: Left => Blue, Right => Yellow. Each signal carries 1 bit of information; the combination of signals carries two bits.
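The bit counts above can be checked with the KL-divergence measure. A minimal sketch, with states ordered up-left, up-right, down-left, down-right (the function name and orderings are my own):

```python
import math

def bits(prior, posterior):
    """Information about the states carried by a signal, in bits
    (Kullback-Leibler divergence, log base 2)."""
    return sum(q * math.log2(q / p) for q, p in zip(posterior, prior) if q > 0)

# States ordered: up-left, up-right, down-left, down-right, all equiprobable.
prior = [0.25] * 4

after_red = [0.5, 0.5, 0.0, 0.0]        # Red is sent exactly when coin 1 is up
after_blue = [0.5, 0.0, 0.5, 0.0]       # Blue is sent exactly when coin 2 is left
after_red_and_blue = [1.0, 0.0, 0.0, 0.0]  # together they pin down up-left

print(bits(prior, after_red))           # 1.0
print(bits(prior, after_blue))          # 1.0
print(bits(prior, after_red_and_blue))  # 2.0
```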
Information about the Act • Defined in an entirely parallel manner: I_acts(signal) = ∑_i pr_sig(act_i) log[pr_sig(act_i)/pr(act_i)] This can differ from the quantity of information about states.
Informational Content of a Signal Informational content about states: < log[pr_sig(state 1)/pr(state 1)], log[pr_sig(state 2)/pr(state 2)], ... > (Likewise for acts)
Example • Suppose that there are four states, initially equiprobable, and signal 2 is sent only in state 2. Then the informational content about states of signal 2 (using log base 2) is: • I_states(Signal 2) = < -∞, 2, -∞, -∞ > • The -∞ components tell you that those states end up with probability zero.
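The content vector in this example can be computed componentwise; -∞ appears wherever the signal drives a state's probability to zero. A minimal sketch (function name is my own):

```python
import math

def content_vector(prior, posterior):
    """Informational content about states: componentwise
    log2(posterior/prior), with -inf where the signal rules a state out."""
    return [math.log2(q / p) if q > 0 else float("-inf")
            for q, p in zip(posterior, prior)]

prior = [0.25] * 4                  # four equiprobable states
after_signal_2 = [0.0, 1.0, 0.0, 0.0]  # signal 2 is sent only in state 2

print(content_vector(prior, after_signal_2))  # [-inf, 2.0, -inf, -inf]
```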
Philosopher’s Objection • “But shouldn’t the content – at least the declarative content – of a signal be a proposition? And isn’t a proposition a set of possible worlds or situations?”
Reply • States may be individuated as finely as you please. • Proposition can be specified by the “possible worlds” ruled out. • That is just what the -∞ components of the information vector do.
Example • 4 states • The signal “tells you” that it is state 2 or state 4 • Content vector: I_states(Signal) = < -∞, __, -∞, __ > The content vector gives a richer account of meaning.
Objective and Subjective Information • The information so far is objective. • The probabilities are propensities of nature and of sender and receiver in some state of the system. • If senders and/or receivers have degrees of belief about all this, there are correlative notions of subjective information.
Flow of Information •→•→• Suppose: S1 => R => B => A1 S2 => G => Y => A2 Players 1 and 2 use different languages. The informational content of B (said by player 2) is the same as the informational content of R (said by player 1).
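The translation chain above can be checked numerically: assuming two equiprobable states and the deterministic strategies shown, the content vector of B (player 2's signal) equals the content vector of R (player 1's signal). The variable names are my own illustration.

```python
import math

def content(prior, posterior):
    """Componentwise log2(posterior/prior); -inf where a state is ruled out."""
    return [math.log2(q / p) if q > 0 else float("-inf")
            for q, p in zip(posterior, prior)]

prior = [0.5, 0.5]      # two equiprobable states S1, S2
given_R = [1.0, 0.0]    # player 1: S1 => R, so conditional on R it is S1
given_B = [1.0, 0.0]    # player 2 translates R => B, so B is sent iff R was

# Different "languages", same informational content:
assert content(prior, given_R) == content(prior, given_B)  # both [1.0, -inf]
```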
Deception A naturalistic account
Is deception possible? “I can by no means will that lying should be a universal law. For with such a law there would be no promises at all, since it would be in vain to allege my intention in regard to my future actions to those who would not believe this allegation …” - Immanuel Kant Groundwork for the Metaphysics of Morals
Deception in Nature • Female Photuris firefly devours a Photinus
What is Deception? • A signal that raises the probability of a state that is not the true state carries misinformation. • A signal that is systematically sent to the benefit of the sender and the detriment of the receiver is deception.
Deception by “half-truth” • 3 states: • Receptive mate • Predator • Nothing shaking The mating signal raises the probabilities of states 1 & 2, and lowers the probability of state 3.
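With illustrative, made-up prior probabilities for the three states, the content vector of the mating signal shows the half-truth pattern: positive components for states 1 and 2 (probabilities raised), a negative component for state 3 (probability lowered but not zero). A minimal sketch:

```python
import math

def content(prior, posterior):
    """Componentwise log2(posterior/prior); -inf where a state is ruled out."""
    return [math.log2(q / p) if q > 0 else float("-inf")
            for q, p in zip(posterior, prior)]

# Made-up numbers; states ordered: receptive mate, predator, nothing shaking.
prior = [0.1, 0.1, 0.8]
after_mating_signal = [0.4, 0.4, 0.2]   # raises states 1 & 2, lowers state 3

print(content(prior, after_mating_signal))  # [2.0, 2.0, -2.0]
```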
Universal Deception can be Good for You • Suppose you are half the time in the role of sender and half the time in the role of receiver in the preceding game. • Then you would prefer deception as universal law.
Deception is Impossible in Equilibrium (Kant’s revenge) but…
Chaos (structurally stable) Wagner BJPS 2012; Sato, Akiyama, and Farmer PNAS 2002.
Lexical Content? Ruth Millikan, Peter Godfrey-Smith (2012), Jonathan Birch (2014)