
Influential People

Analysis of Boolean Functions and Complexity Theory, Economics, Combinatorics, Etc. Slides prepared with the help of Ricky Rosen.


Presentation Transcript


  1. Analysis of Boolean Functions and Complexity Theory, Economics, Combinatorics, Etc. Slides prepared with the help of Ricky Rosen

  2. Influential People • The theory of the Influence of Variables on Boolean Functions [KKL, BL, R, M] was introduced to tackle Social Choice problems and distributed computing. • It has motivated a magnificent body of work, related to: • Sharp Thresholds [F, FK] • Percolation [BKS] • Economics: Arrow's Theorem [K] • Hardness of Approximation [DS] • Utilizing Harmonic Analysis of Boolean functions… • And the really important question:

  3. Where to go for Dinner? • The alternatives • Diners would cast their vote in an (electronic) envelope • The system would decide – not necessarily according to majority… • And what if someone (in Florida?) can flip some votes? Influence. Power.

  4. Boolean Functions • Def: a Boolean function f: {-1,1}ⁿ → {-1,1} • Equivalently, identifying an input with the set of its -1 coordinates, f: P([n]) → {-1,1}: choosing the locations of the -1's (a subset of [n]) is the same as choosing a sequence of -1's and 1's

  5. Noise Sensitivity • The values of the variables may each, independently, flip with probability ε • It turns out: one cannot design an f that would be robust to such noise – that is, would, on average, change value with probability o(1) – unless the outcome is determined according to very few of the voters

  6. Voting and influence • Def: the influence of i on f is the probability, over a random input x, that f changes its value when coordinate i is flipped: Influence_i(f) = Pr_x[f(x) ≠ f(x⊕i)], where x⊕i denotes x with its i-th coordinate flipped
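For small n, the influence definition can be checked directly by enumerating all inputs. A minimal sketch in Python (the function names below are illustrative, not from the slides):

```python
from itertools import product

def influence(f, n, i):
    """Exact influence of coordinate i on f: Pr_x[f(x) != f(x with bit i flipped)],
    computed by enumerating all 2^n inputs."""
    count = 0
    for x in product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]
        count += f(x) != f(tuple(y))
    return count / 2 ** n

# Illustrative functions:
majority = lambda x: 1 if sum(x) > 0 else -1            # n odd
parity = lambda x: 1 if x.count(-1) % 2 == 0 else -1
dictator = lambda x: x[0]                               # Dictatorship_1

n = 5
print(influence(dictator, n, 0))  # 1.0: the dictator always decides
print(influence(dictator, n, 1))  # 0.0: everyone else is ignored
print(influence(parity, n, 2))    # 1.0: any single flip changes parity
print(influence(majority, n, 0))  # 0.375 = C(4,2)/2^4: the tied-vote probability
```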

  7. -1 1 -1 -1 ? 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 • Majority:{1,-1}n {1,-1} • Theinfluence of i on Majority is the probability, over a random input x, Majority changes with i • this happens when half of the n-1 coordinate (people) vote -1 and half vote 1. • i.e.

  8. -1 1 -1 -1 1 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 1 • Parity: {1,-1}n {1,-1} Always changes the value of parity

  9. -1 1 -1 -1 1 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 1 • Dictatorshipi:{1,-1}20 {1,-1} • Dictatorshipi(x)=xi • influence of i on Dictatorshipi= 1. • influence of ji on Dictatorshipi=0.

  10. Average Sensitivity (Total Influence) • Def: the Average Sensitivity of f, as(f), is the sum of the influences of all coordinates i ∈ [n]: as(f) = Σᵢ Influence_i(f) • as(Majority) = O(√n) • as(Parity) = n • as(Dictatorship) = 1
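The three values above can be reproduced by summing exact influences, again by brute force for a small n. A sketch (names illustrative):

```python
from itertools import product

def influence(f, n, i):
    """Pr_x[f(x) != f(x with bit i flipped)], by full enumeration."""
    count = 0
    for x in product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]
        count += f(x) != f(tuple(y))
    return count / 2 ** n

def avg_sensitivity(f, n):
    """as(f): the sum of the influences of all n coordinates."""
    return sum(influence(f, n, i) for i in range(n))

majority = lambda x: 1 if sum(x) > 0 else -1
parity = lambda x: 1 if x.count(-1) % 2 == 0 else -1
dictator = lambda x: x[0]

n = 5
print(avg_sensitivity(parity, n))    # 5.0: every flip matters, as(Parity) = n
print(avg_sensitivity(dictator, n))  # 1.0: only the dictator matters
print(avg_sensitivity(majority, n))  # 1.875 = n * C(n-1,(n-1)/2) / 2^(n-1), i.e. O(sqrt(n))
```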

  11. When as(f) = 1 • Def: f is a balanced function if it equals -1 exactly half of the time: E_x[f(x)] = 0 • Can a balanced f have as(f) < 1? What about as(f) = 1, besides dictatorships? • Prop: if f is balanced and as(f) = 1, then f is a dictatorship.
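The proposition can be verified exhaustively for n = 3 by enumerating all 2^(2^3) = 256 truth tables. Note that f = -xᵢ (an "anti-dictatorship") is also balanced with as(f) = 1, so the check below counts dictatorships up to sign: exactly the 6 functions ±xᵢ survive, and each depends on a single coordinate. A brute-force sketch:

```python
from itertools import product

n = 3
inputs = list(product([-1, 1], repeat=n))
index = {x: k for k, x in enumerate(inputs)}

def avg_sensitivity(table):
    """as(f) for f given as a truth table over `inputs`."""
    total = 0
    for k, x in enumerate(inputs):
        for i in range(n):
            y = list(x)
            y[i] = -y[i]
            total += table[k] != table[index[tuple(y)]]
    return total / 2 ** n

def relevant_vars(table):
    """The set of coordinates f actually depends on."""
    rel = set()
    for k, x in enumerate(inputs):
        for i in range(n):
            y = list(x)
            y[i] = -y[i]
            if table[k] != table[index[tuple(y)]]:
                rel.add(i)
    return rel

# All 256 Boolean functions on 3 variables, filtered to balanced with as = 1:
hits = [t for t in product([-1, 1], repeat=len(inputs))
        if sum(t) == 0 and avg_sensitivity(t) == 1.0]

print(len(hits))                                      # 6: x_i and -x_i for each i
print(all(len(relevant_vars(t)) == 1 for t in hits))  # True: each is a dictatorship up to sign
```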

  12. Representing f as a Polynomial • What would be the monomials over x ∈ P([n])? • Since xᵢ² = 1, all powers except 0 and 1 cancel out! • Hence, one monomial for each character χ_S, S ⊆ [n]: χ_S(x) = ∏_{i∈S} xᵢ • These are all the multiplicative functions

  13. Fourier-Walsh Transform • Consider all characters χ_S(x) = ∏_{i∈S} xᵢ • Given any function f: {-1,1}ⁿ → ℝ, let the Fourier-Walsh coefficients of f be f̂(S) = E_x[f(x)·χ_S(x)] • Thus f can be described as f = Σ_{S⊆[n]} f̂(S)·χ_S
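The coefficients can be computed naively, one expectation per subset S. A small sketch (names illustrative) that also checks Parseval's identity for a Boolean function:

```python
from itertools import product, combinations

n = 3
inputs = list(product([-1, 1], repeat=n))
subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]

def chi(S, x):
    """Character chi_S(x) = prod over i in S of x_i."""
    out = 1
    for i in S:
        out *= x[i]
    return out

def fourier_coeff(f, S):
    """f_hat(S) = E_x[f(x) * chi_S(x)]."""
    return sum(f(x) * chi(S, x) for x in inputs) / len(inputs)

dictator = lambda x: x[1]
parity = lambda x: x[0] * x[1] * x[2]

coeffs = {S: fourier_coeff(dictator, S) for S in subsets}
print(coeffs[(1,)])                          # 1.0: a dictatorship's weight sits on its singleton
print(fourier_coeff(parity, (0, 1, 2)))      # 1.0: Parity's weight sits on S = [n]
print(sum(c ** 2 for c in coeffs.values()))  # 1.0: Parseval for a Boolean function
```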

  14. Norms • Def: expectation norm on the function: ‖f‖₂² = E_x[f(x)²] • Def: summation norm on the transform: ‖f̂‖₂² = Σ_S f̂(S)² • Thm [Parseval]: ‖f‖₂² = ‖f̂‖₂² • Hence, for a Boolean f: Σ_S f̂(S)² = 1

  15. Simple Observations • Def: the inner product ⟨f,g⟩ = E_x[f(x)·g(x)] • Claim: for any function f whose range is {-1,0,1}: ‖f‖₂² = Σ_S f̂(S)² = Pr_x[f(x) ≠ 0]

  16. Variables' Influence • Recall: the influence of an index i ∈ [n] on a Boolean function f: {1,-1}ⁿ → {1,-1} is Influence_i(f) = Pr_x[f(x) ≠ f(x⊕i)] • Claim: it can be expressed in terms of the Fourier coefficients of f: Influence_i(f) = Σ_{S∋i} f̂(S)² • And the average sensitivity: as(f) = Σ_S |S|·f̂(S)²
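Both identities can be confirmed numerically, e.g. for Majority on 3 voters, where each influence is ½ and as = 3/2. A sketch:

```python
from itertools import product, combinations

n = 3
inputs = list(product([-1, 1], repeat=n))
subsets = [S for k in range(n + 1) for S in combinations(range(n), k)]
maj = lambda x: 1 if sum(x) > 0 else -1

def chi(S, x):
    out = 1
    for i in S:
        out *= x[i]
    return out

# Fourier coefficients of Majority: f_hat(S) = E_x[maj(x) * chi_S(x)]
fhat = {S: sum(maj(x) * chi(S, x) for x in inputs) / len(inputs) for S in subsets}

def influence_direct(i):
    """Influence by definition: Pr_x[f changes when bit i is flipped]."""
    count = 0
    for x in inputs:
        y = list(x)
        y[i] = -y[i]
        count += maj(x) != maj(tuple(y))
    return count / len(inputs)

def influence_fourier(i):
    """Claim: Influence_i(f) = sum of f_hat(S)^2 over all S containing i."""
    return sum(c ** 2 for S, c in fhat.items() if i in S)

print(influence_direct(0), influence_fourier(0))      # 0.5 0.5: the two agree
print(sum(len(S) * c ** 2 for S, c in fhat.items()))  # 1.5 = as(Majority_3)
```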

  17. Fourier Representation of Influence • Proof: consider the influence function fᵢ(x) = (f(x) - f(x⊕i))/2, which in Fourier representation is fᵢ = Σ_{S∋i} f̂(S)·χ_S • Since fᵢ has range {-1,0,1}: Influence_i(f) = Pr_x[fᵢ(x) ≠ 0] = ‖fᵢ‖₂² = Σ_{S∋i} f̂(S)²

  18. If s s.t |s|>1 and then as(f)>1 Balanced f s.t. as(f)=1 is Dict. • Since f is balanced and • So f is linear • For any i s.t. Only i has changed

  19. Expectation and Variance • Claim: E[f] = f̂(∅) and Var[f] = Σ_{S≠∅} f̂(S)² • Hence, for any Boolean f: Var[f] ≤ as(f)

  20. First Passage Percolation [BKS] • Each edge costs a with probability ½ and b with probability ½

  21. First Passage Percolation • Consider the grid ℤ² • For each edge e of ℤ², choose independently wₑ = 1 or wₑ = 2, each with probability ½ • This induces a shortest-path metric on ℤ² • Thm [BKS]: the variance of the shortest-path distance from the origin to a vertex v is bounded from above by O(|v| / log |v|) • Proof idea: the average sensitivity of the shortest-path function is bounded by that term
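The concentration the theorem predicts is easy to see in a Monte-Carlo sketch: on a finite grid with 1-or-2 edge weights as on the slide, the sampled distances cluster tightly around their mean (the grid size, seed, and number of trials below are my own choices):

```python
import heapq
import random

def fpp_distance(L, rng):
    """First-passage distance from (0,0) to (L,L) on the (L+1)x(L+1) grid;
    each edge independently costs 1 or 2 with probability 1/2 (drawn lazily)."""
    weights = {}
    def weight(u, v):
        key = frozenset((u, v))
        if key not in weights:
            weights[key] = rng.choice((1, 2))
        return weights[key]
    dist = {(0, 0): 0}
    heap = [(0, (0, 0))]
    while heap:                       # Dijkstra
        d, u = heapq.heappop(heap)
        if u == (L, L):
            return d
        if d > dist[u]:
            continue
        x, y = u
        for v in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= v[0] <= L and 0 <= v[1] <= L:
                nd = d + weight(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))

rng = random.Random(0)
L = 10
samples = [fpp_distance(L, rng) for _ in range(200)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # the graph distance is 2L, so every sample lies in [2L, 4L]
```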

  22. Graph Properties • Def: a graph property is a subset of graphs invariant under isomorphism • Def: a monotone graph property is a graph property P s.t. if P(G), then for every super-graph H of G (namely, a graph on the same set of vertices which contains all edges of G), P(H) as well • P is in fact a Boolean function P: {-1,1}^(V choose 2) → {-1,1}

  23. Examples of Graph Properties • G is connected • G is Hamiltonian • G contains a clique of size t • G is not planar • The clique number of G is larger than that of its complement • The diameter of G is at most s • … etc. • What is the influence of different edges e on P?

  24. Erdős–Rényi G(n,p) Graphs • The Erdős–Rényi distribution of random graphs: put an edge between any two vertices, independently, with probability p

  25. Definitions • P – a graph property • μ_p(P) – the probability that a random graph on n vertices with edge probability p satisfies P • G ∼ G(n,p) – G is a random graph on n vertices with edge probability p

  26. Def: Sharp Threshold • A monotone graph property has a sharp threshold if the transition from the property being very unlikely (G does not satisfy P) to it being very likely (G satisfies P) is very swift

  27. Thm [FK]: Every Monotone Graph Property Has a Sharp Threshold • Let P be any monotone property of graphs on n vertices. If μ_p(P) > ε, then μ_q(P) > 1-ε for q = p + c₁·log(1/2ε)/log n • Proof idea: show that as_p′(P), for p′ > p, is high
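The swiftness of the transition is easy to observe empirically for the monotone property "G is connected", whose threshold sits near ln(n)/n. A simulation sketch (n, the p values, trial count, and seed are my own choices):

```python
import random

def is_connected_gnp(n, p, rng):
    """Sample G(n, p) and test connectivity with union-find."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    components = n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:           # include edge {i, j} w.p. p
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
                    components -= 1
    return components == 1

rng = random.Random(1)
n, trials = 60, 200
freq = {}
for p in (0.01, 0.05, 0.10, 0.30):
    freq[p] = sum(is_connected_gnp(n, p, rng) for _ in range(trials)) / trials
    print(p, freq[p])
# The threshold is around ln(n)/n (about 0.07 for n = 60): well below it the
# property essentially never holds, well above it essentially always.
```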

  28. Thm [Margulis-Russo]: for monotone f, dμ_p(f)/dp = as_p(f)

  29. Proof [Margulis-Russo]: write μ_p(f) as a sum over inputs x of f's value weighted by p^|x|·(1-p)^(n-|x|), differentiate with respect to p, and group the terms by the coordinate on which neighboring inputs differ; what remains is the sum of the p-biased influences, as_p(f)

  30. Mechanism Design Problem • N agents; each agent i has a private input tᵢ ∈ T. All other information is public knowledge • Each agent i has a valuation for all items; each agent wishes to optimize her own utility • Objective: minimize the objective function, e.g. the total payment • Means: a protocol between the agents and the auctioneer

  31. Vickrey-Clarke-Groves (VCG) • Sealed-bid auction • A truth-revealing protocol, namely one in which each agent might as well reveal her valuation to the auctioneer • Whereby each agent gets the best (for her) price she could have bid and still win the auction

  32. Shortest Path using VCG • Problem definition: • A communication network modeled by a directed graph G and two vertices, source s and target t • Agents = edges in G • Each agent has a cost for sending a single message on her edge, denoted tₑ • Objective: find the shortest (cheapest) path from s to t • Means: a protocol between the agents and the auctioneer

  33. 10$ 10$ 50$ 50$ VCG for Shortest-Path Always in the shortest path

  34. How Much Will We Overpay? • [Figure: a network whose edges cost 1$ or 2$ each; in it, every winning agent gets an extra 1$] • Thm [Mahdian, Saberi, S]: the expected extra payment is as_SP(G), the average sensitivity of the shortest-path function

  35. Learning Functions • To learn a function f: Gⁿ → G is to sample it on poly(n·|G|/ε) points and come up with some f′ that differs from f on at most an ε fraction of the points • Membership queries: the learner chooses the points • Random samples: w.h.p. over the values of a small random set of points • Applications (when learnable): bioinformatics, economics, etc. • Also [Akavia, Goldwasser, S.]: cryptography – hardcore predicates via list-decoding

  36. Concentrated Functions • [Figure: the weight of f on the characters, by level …-5 -3 -1 1 3 5…] • Def: the restriction of f to a set of characters Γ is f|_Γ = Σ_{S∈Γ} f̂(S)·χ_S • Def: f is a concentrated function if ∀ε > 0 there exists Γ of poly(n/ε) size s.t. ‖f − f|_Γ‖₂² ≤ ε • Thm [Kushilevitz, Mansour]: a concentrated f: {0,1}ⁿ → {0,1} is learnable • Thm [Akavia, Goldwasser, S.]: the same holds over any Abelian group, f: Gⁿ → G

  37. -1 1 -1 -1 1 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 1 -1 Juntas • A function is a J-junta if its value depends on only J variables. 1 -1 -1 1 1 1 -1 -1 1 -1 1 1 -1 1 -1 1 • A Dictatorship is 1-junta -1 1 -1 -1 1 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 1

  38. -1 1 -1 -1 1 1 -1 1 -1 1 -1 1 -1 1 -1 1 1 1 -1 1 Juntas • A function is a J-junta if its value depends on only J variables. 1 -1 -1 1 1 1 -1 -1 1 -1 1 1 -1 1 -1 1 • Thm [Fischer, Kindler, Ron, Samo., S]: Juntas are testable • Thm [Kushilevitz, Mansour; Mossel, Odonel]: Juntas are learnable

  39. ε-Noise Sensitivity • Choose a subset I of the variables; each variable enters I independently with probability ε • Flip each value in the subset I with probability p • The noise sensitivity of a function f is the probability that f changes its value when a subset of its variables is flipped according to this distribution (what is the new value of f?)
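For small n the noise sensitivity can be computed exactly by summing over all inputs and all flip patterns. The sketch below uses a simplified model in which each coordinate flips independently with a single probability eps (folding the slide's two steps, selection and flipping, into one parameter):

```python
from itertools import product

def noise_sensitivity(f, n, eps):
    """Exact Pr[f(x) != f(y)], where x is uniform and y flips each
    coordinate of x independently with probability eps."""
    total = 0.0
    for x in product([-1, 1], repeat=n):
        for flips in product([0, 1], repeat=n):
            y = tuple(-v if b else v for v, b in zip(x, flips))
            pr = 1.0
            for b in flips:
                pr *= eps if b else 1 - eps
            if f(x) != f(y):
                total += pr
    return total / 2 ** n

parity = lambda x: 1 if x.count(-1) % 2 == 0 else -1
dictator = lambda x: x[0]
majority = lambda x: 1 if sum(x) > 0 else -1

n, eps = 5, 0.1
ns_dict = noise_sensitivity(dictator, n, eps)
ns_par = noise_sensitivity(parity, n, eps)
ns_maj = noise_sensitivity(majority, n, eps)
print(ns_dict)  # eps: the dictator's bit flips with probability eps
print(ns_par)   # (1 - (1 - 2*eps)**n) / 2: parity sees every flip
print(ns_maj)   # in between: stabler than parity, less stable than a dictator
```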

  40. Noise Sensitivity and Juntas • Juntas are noise-insensitive (stable): when a random subset I is flipped, w.h.p. the flipped variables miss the junta and the value of f stays the same • Thm [Bourgain; Kindler & S]: noise-insensitive (stable) Boolean functions are Juntas

  41. Friedgut's Theorem • Thm: any Boolean f is an [ε, j]-junta for j = 2^O(as(f)/ε) • Proof: • Specify the junta J • Show the complement of J has little influence

  42. Long-Code • In the long-code, the set of legal codewords consists of all monotone dictatorships • This is the most extensive binary code, as its bits represent all possible binary values over n elements

  43. Long-Code • Encoding an element e ∈ [n]: Eₑ assigns to every function f the value f(e) • Eₑ legally-encodes an element e if Eₑ = fₑ, the e-th dictatorship

  44. Open Questions • Hardness of approximation: • MAX-CUT • Coloring a 3-colorable graph with the fewest colors • Graph properties: find sharp thresholds for properties • Circuit complexity: switching lemmas • Mechanism design: show a non-truth-revealing protocol in which the payment is smaller (Nash equilibrium when all agents tell the truth?) • Analysis: show the weakest condition for a function to be a Junta • Learning: by random queries • Apply concentration-of-measure techniques to other problems in complexity theory

  45. Of course they'll have to discuss it over dinner… • Where to go for Dinner? The alternatives • Diners would cast their vote in an (electronic) envelope • The system would decide – not necessarily according to majority… • And what if someone (in Florida?) can flip some votes? Influence. Power. • Solution: form a committee

  46. Low-Degree Boolean Functions are Juntas • Corollary: fix a p-biased distribution μ_p over P([n]). Let ε > 0 be any parameter. Set k = log₁₋ε(½). Then ∃ a constant γ > 0 s.t. any Boolean function f: P([n]) → {-1,1} whose Fourier weight is concentrated below level k is an [ε, j]-junta for j = O(ε⁻²·k·3^2k)

  47. Noise-Sensitivity • Def (ε, p, x ∈ P([n])): let 0 < ε < 1 and x ∈ P([n]); then y ∼ ε,p,x if y = (x \ I) ∪ z, where • I ∼ ε[n] is a noise subset (each coordinate enters I independently w.p. ε), and • z ∼ μ_p restricted to I is a replacement • Def (ε-noise-sensitivity): ns_ε(f) = Pr_{x, y∼ε,p,x}[f(x) ≠ f(y)] • [When p = ½ this is equivalent to flipping each coordinate in x independently w.p. ε/2]

  48. Noise-Sensitivity – Cont. • Advantage: very efficiently testable (using only two queries) by a perturbation-test • Def (perturbation-test): choose x ∼ μ_p and y ∼ ε,p,x; check whether f(x) = f(y). The success probability is proportional to the noise-sensitivity of f • Prop: the ε-noise-sensitivity is given by ns_ε(f) = ½ − ½·Σ_S (1−ε)^|S|·f̂(S)²

  49. Relation Between the Parameters • Prop: small ns ⇒ small high-freq weight • Proof: 1 − 2·ns_ε(f) = Σ_S (1−ε)^|S|·f̂(S)²; therefore, if ns is small, Σ_S (1−ε)^|S|·f̂(S)² is close to 1. Hence the high frequencies must have small weights (as (1−ε)^|S| → 0 when |S| grows) • Prop: small as ⇒ small high-freq weight • Proof: as(f) = Σ_S |S|·f̂(S)² ≥ k·Σ_{|S|>k} f̂(S)²
