(Deterministic) Communication amid Uncertainty
Madhu Sudan, Microsoft Research New England
Based on joint works with: (1) Adam Kalai (MSR), Sanjeev Khanna (U.Penn), Brendan Juba (Harvard) and (2) Elad Haramaty (Technion)
MSR-I: Deterministic Communication Amid Uncertainty
Classical Communication
• The Shannon setting
• Alice gets x chosen from distribution P.
• Sends some compression y = E(x) to Bob.
• Bob computes D(y) (with knowledge of P).
• Hope D(E(x)) = x, while minimizing the expected length of E(x).
• Classical assumption: no uncertainty — Bob knows P exactly.
• Today's talk: Bob knows Q ≈ P.
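As a concrete illustration of the classical setting (my addition, not from the talk): when Bob knows the same P that Alice compresses against, Huffman coding gets expected length within one bit of the entropy H(P). A minimal sketch, with an arbitrary dyadic example distribution:

```python
import heapq, math
from itertools import count

def huffman_code(P):
    """Build a Huffman code (symbol -> bitstring) for distribution P."""
    tiebreak = count()  # unique ints so heap entries stay comparable on ties
    heap = [(p, next(tiebreak), {x: ""}) for x, p in P.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {x: "0" + w for x, w in c1.items()}
        merged.update({x: "1" + w for x, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # dyadic example
code = huffman_code(P)
H = -sum(p * math.log2(p) for p in P.values())       # entropy H(P)
avg = sum(P[x] * len(code[x]) for x in P)            # expected code length
# For a dyadic P, Huffman is exactly optimal: avg == H(P) == 1.75 bits.
```

Decoding needs the same P (to rebuild the same tree) — exactly the shared-knowledge assumption the talk goes on to relax.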
Outline
• Part 1: Motivation
• Part 2: Formalism
• Part 3: Randomized Solution
• Part 4: Issues with Randomized Solution
• Part 5: Deterministic Issues
Motivation: Human Communication
• Human communication vs. designed communication:
• Human comm. dictated by languages, grammars …
• Grammar: rules, often violated.
• Dictionary: multiple meanings to a word.
• Redundant: but not an error-correcting code.
• Theory for human communication?
• Information theory?
• Linguistics? (Universal grammars etc.)?
Behavioral aspects of natural communication
• (Vast) implicit context.
• Sender sends increasingly long messages to receiver till receiver "gets" (the meaning of) the message.
• Where do the options come from?
• They come from language/dictionary — but how/why?
• Sender uses feedback from receiver if available, or estimates receiver's knowledge if not.
• How does this estimate influence the message?
Model:
• Reason to choose short messages: compression.
• The channel is still a scarce resource; still want to use it optimally.
• Reason to choose long messages (when short ones are available): reducing ambiguity.
• Sender is unsure of receiver's prior (context).
• Sender wishes to ensure receiver gets the message, no matter what its prior (within reason).
Back to Problem
• Design encoding/decoding schemes (E, D) so that:
• Sender has distribution P on universe U
• Receiver has distribution Q on U
• Sender gets x ~ P
• Sends y = E(P, x) to receiver.
• Receiver receives y
• Decodes to x̂ = D(Q, y)
• Want: x̂ = x (provided P, Q close),
• while minimizing the expected length of E(P, x).
Contrast with some previous models
• Universal compression?
• Doesn't apply: P, Q are not finitely specified.
• Don't have a sequence of samples from P; just one!
• K-L divergence?
• Measures inefficiency of compressing for Q if the real distribution is P.
• But assumes encoding/decoding according to the same distribution Q.
• Semantic Communication:
• Uncertainty of sender/receiver; but no special goal.
Closeness of distributions:
P is Δ-close to Q if for all x: 2^{-Δ} ≤ P(x)/Q(x) ≤ 2^{Δ}.
P Δ-close to Q ⇒ Q Δ-close to P (a symmetrized, "worst-case" KL-divergence).
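A quick sketch of this closeness notion in Python (my own illustration; the log-ratio form |log2 P(x)/Q(x)| ≤ Δ is assumed equivalent to the two-sided ratio bound above):

```python
import math

def delta_close(P, Q, delta):
    """P is delta-close to Q iff |log2(P(x)/Q(x))| <= delta for every x
    in the support of either distribution (dicts symbol -> probability)."""
    for x in set(P) | set(Q):
        p, q = P.get(x, 0.0), Q.get(x, 0.0)
        if (p == 0) != (q == 0):
            return False   # one-sided support => unbounded ratio
        if p and abs(math.log2(p / q)) > delta:
            return False
    return True

P = {"a": 0.5, "b": 0.25, "c": 0.25}
Q = {"a": 0.4, "b": 0.3, "c": 0.3}
# Largest log-ratio here is log2(0.5/0.4) ~ 0.32, so P, Q are 1-close
# but not 0.2-close; the relation is symmetric, matching the slide.
```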
Dictionary = Shared Randomness?
• Modelling the dictionary: what should it be?
• Simplifying assumption — it is shared randomness, so …
• Assume sender and receiver have some shared randomness R, and P, Q are independent of R.
• Want: D(Q, R, E(P, R, x)) = x (w.h.p. over R).
Solution (variant of Arith. Coding)
• Use R to give each x ∈ U an infinite random bit sequence h(x) = h_1(x) h_2(x) …
• Alice sends the first L bits of h(x),
• where L is chosen s.t. for every y ≠ x:
• Either h(y) disagrees with h(x) within the first L bits,
• Or P(y) < 2^{-2Δ} · P(x).
• Bob outputs the y maximizing Q(y) among those whose first L hash bits match.
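A toy simulation of this hash-prefix scheme (an illustrative sketch, not the talk's exact construction: seeded hashing stands in for the shared randomness R, and the 2^{-2Δ} rival threshold follows the rule above):

```python
import hashlib

def hbits(x, L, seed="sharedR"):
    """First L pseudo-random bits of the shared random string h(x)."""
    raw = hashlib.sha256(f"{seed}:{x}".encode()).digest()
    return "".join(f"{byte:08b}" for byte in raw)[:L]

def encode(P, x, delta):
    """Alice: shortest hash prefix of x that separates it from every
    'rival' y with P(y) >= 2^{-2*delta} * P(x)."""
    rivals = [y for y in P if y != x and P[y] >= 2 ** (-2 * delta) * P[x]]
    L = 1
    while any(hbits(y, L) == hbits(x, L) for y in rivals):
        L += 1
    return hbits(x, L)

def decode(Q, msg):
    """Bob: maximum-likelihood candidate under Q among matching prefixes."""
    matches = [y for y in Q if hbits(y, len(msg)) == msg]
    return max(matches, key=lambda y: Q[y])

P = {"a": 0.5, "b": 0.3, "c": 0.2}
Q = {"a": 0.45, "b": 0.25, "c": 0.3}   # Q is 1-close to P
```

Why Bob is always right when P, Q are Δ-close: any y with Q(y) ≥ Q(x) has P(y) ≥ 2^{-Δ}Q(y) ≥ 2^{-Δ}Q(x) ≥ 2^{-2Δ}P(x), so y is a rival and its hash prefix already differs from x's.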
Performance
• Obviously decoding is always correct.
• Easy exercise: expected length ≤ H(P) + 2Δ + O(1).
• (H(P) = Σ_x P(x) log 1/P(x), the "binary entropy".)
• Limits:
• No scheme can achieve …
• Can reduce the randomness needed.
Implications
• Reflects the tension between ambiguity resolution and compression.
• The larger Δ (the (estimated) gap in context), the larger the encoding length.
• Coding scheme reflects the nature of the human process (extend messages till they feel unambiguous).
• The "shared randomness" is a convenient starting point for discussion:
• Dictionaries do have more structure.
• But they have plenty of entropy too.
• Still … should try to do without it.
Deterministic Compression?
• Randomness is fundamental to the solution above.
• Needs R independent of P, Q to work.
• Can there be a deterministic solution?
• Technically: hard to come up with a single scheme that compresses consistently for all close pairs (P, Q).
• Conceptually: nicer to know the "dictionary" and context can be interdependent.
Challenging special case
• Alice has a permutation π on [n]
• i.e., a 1-1 function mapping [n] → [n]
• Bob has a permutation σ
• Know both are close: |π(i) − σ(i)| ≤ Δ for all i (say).
• Alice and Bob know Δ (say Δ = 2).
• Alice wishes to communicate π to Bob.
• Can we do this with few bits?
• Say … bits if …?
Model as a graph coloring problem
• Consider a family of graphs:
• Vertices = permutations on [n]
• Edges = close permutations with distinct messages (two potential Alices).
• A proper coloring is a valid encoding: a message must distinguish any two Alices some Bob can't tell apart.
• Central question: what is the chromatic number χ?
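To make the coloring view concrete, a small experiment for tiny n (my own illustration; the edge rule — coordinatewise distance at most 2, so that two Alices could plausibly share a common close Bob — is an assumed instantiation of "close permutations"):

```python
from itertools import permutations

def linf(p, q):
    """Coordinatewise (l-infinity) distance between two permutations."""
    return max(abs(a - b) for a, b in zip(p, q))

def uncertainty_graph(n, d=2):
    """Vertices: permutations of {0..n-1}; edges: distinct permutations
    within coordinatewise distance d (two indistinguishable Alices)."""
    verts = list(permutations(range(n)))
    adj = {v: set() for v in verts}
    for i, u in enumerate(verts):
        for v in verts[i + 1:]:
            if linf(u, v) <= d:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def greedy_color(adj):
    """A proper coloring = a deterministic encoding: Alice sends her
    color; Bob searches his close neighborhood for that color."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        color[v] = next(c for c in range(len(adj)) if c not in used)
    return color

adj = uncertainty_graph(4)
col = greedy_color(adj)
# Colors used upper-bound chi; log2(#colors) bits suffice for the message.
```

Greedy only gives an upper bound on χ, of course; pinning down the true chromatic number is exactly the slide's central question.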
Main Results [with Elad Haramaty]
• Claim: compression length for the toy problem = ⌈log χ⌉ for the uncertainty graph.
• Thm 1: …
• Thm 2: ∃ uncertain comm. schemes with … (0-error) and … (ε-error).
• Rest of the talk: graph coloring.
Restricted Uncertainty Graphs
• Will look at restricted graphs:
• Vertices: restrictions of permutations to the first k coordinates.
• Edges: …
Homomorphisms
• G is homomorphic to H (G → H) if ∃ φ: V(G) → V(H) s.t. (u, v) ∈ E(G) ⇒ (φ(u), φ(v)) ∈ E(H).
• Homomorphisms and colorings: G → K_c ⇔ G is c-colorable; so G → H ⇒ χ(G) ≤ χ(H).
• Homomorphisms and uncertainty graphs: …
• Suffices to upper bound χ of the restricted graph.
Chromatic number of the restricted graph
• For …, let …
• Claim: … is an independent set of the restricted graph.
• Claim: …
• Corollary: χ ≤ …
Better upper bounds:
• Lemma: for …
• Corollary: …
• Aside: can show …
• Implies we can't expect a simple derandomization of the randomized compression scheme.
Future work?
• Open Questions:
• Is …?
• Can we compress arbitrary distributions to …? Or even …?
• On the conceptual side:
• Better mathematical understanding of the forces on language:
• Information-theoretic
• Computational
• Evolutionary
Thank You