Brun’s Sieve

  1. Brun’s Sieve • Let B1, …, Bm be events, Xi the indicator random variable for Bi and X = X1 + … + Xm the number of Bi that hold. • Let there be a hidden parameter n (so that actually m = m(n), Bi = Bi(n), X = X(n)) which will define the following o, O notation. • Define S(r) = ∑ Pr[Bi1 ∧ … ∧ Bir], the sum over all r-element subsets {i1,…,ir} ⊆ {1,…,m}.
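
To make the definition concrete: a minimal Python sketch (not from the slides; the events and probabilities are toy assumptions) checking numerically that S(r) equals E[C(X, r)], the expected number of r-subsets of events that all hold — the identity used on the next slide.

```python
# Toy check: for independent events B_i with Pr[B_i] = p[i], compare the
# exact S(r) = sum over r-subsets of Pr[B_{i1} ^ ... ^ B_{ir}] against a
# Monte Carlo estimate of E[ C(X, r) ].
import itertools
import math
import random

p = [0.3, 0.5, 0.2, 0.4, 0.6]      # Pr[B_i] for m = 5 independent events (toy)
m, r = len(p), 2

# Exact S(r): for independent events the conjunction probability factors.
s_r = sum(math.prod(p[i] for i in idx)
          for idx in itertools.combinations(range(m), r))

# Monte Carlo estimate of E[ binom(X, r) ].
trials = 200_000
acc = 0
for _ in range(trials):
    x = sum(random.random() < pi for pi in p)   # X = number of B_i that hold
    acc += math.comb(x, r)
print(s_r, acc / trials)   # the two numbers should be close
```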

  2. Theorem 8.3.1 • Suppose there is a constant μ so that E[X] = S(1) → μ and such that for every fixed r, E[C(X, r)] = E[X(X−1)⋯(X−r+1)]/r! = S(r) → μ^r/r!. Then Pr[X = 0] → e^{−μ} and indeed for every t, Pr[X = t] → (μ^t/t!) e^{−μ}.
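
A numeric sketch of the simplest instance of this limit (my toy example, not in the slides): m independent events each of probability μ/m, so that X is Binomial(m, μ/m), S(r) = C(m, r)(μ/m)^r → μ^r/r!, and the point probabilities converge to the Poisson values.

```python
# Binomial(m, mu/m) point probabilities converge to e^{-mu} mu^t / t!.
import math

mu, t = 2.0, 3
for m in (10, 100, 10_000):
    pm = mu / m
    binom_pt = math.comb(m, t) * pm**t * (1 - pm)**(m - t)
    print(m, binom_pt)
print("Poisson limit:", math.exp(-mu) * mu**t / math.factorial(t))
```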

  3. Pr[X = r] ≤ S(r) = ∑ Pr[Bi1 ∧ … ∧ Bir], where the sum is over all sets {i1,…,ir} ⊆ {1,…,m}. • The Inclusion-Exclusion Principle gives that Pr[X = 0] = Pr[¬B1 ∧ … ∧ ¬Bm] = 1 − S(1) + S(2) − … + (−1)^r S(r) + … • Bonferroni’s inequality: Let P(Ei) be the probability that Ei is true, and P(E1 ∨ … ∨ En) the probability that at least one of E1,…,En is true. Then P(E1 ∨ … ∨ En) ≤ P(E1) + … + P(En).
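
A quick numeric sketch (my illustration) of the alternating behavior exploited in the proof below: replacing each S(r) by its limit μ^r/r!, the truncated inclusion-exclusion sums bracket e^{−μ}, an even cutoff from above and an odd cutoff from below.

```python
# Partial sums of sum_r (-mu)^r / r! alternately over- and underestimate
# the limit e^{-mu}.
import math

mu = 2.0
target = math.exp(-mu)
for cutoff in range(8):
    partial = sum((-mu)**r / math.factorial(r) for r in range(cutoff + 1))
    side = "over " if cutoff % 2 == 0 else "under"
    print(cutoff, side, partial, target)
```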

  4. Proof. We do only the case t = 0. • Fix ε > 0. Choose s so that the partial sums ∑_{r=0}^{2s} (−1)^r μ^r/r! and ∑_{r=0}^{2s+1} (−1)^r μ^r/r! both lie within ε/2 of e^{−μ}. • The Bonferroni Inequalities state that, in general, the inclusion-exclusion formula alternately over- and underestimates Pr[X = 0]. In particular, Pr[X = 0] ≤ ∑_{r=0}^{2s} (−1)^r S(r). • Select n0 (the hidden variable) so that for n ≥ n0, |S(r) − μ^r/r!| ≤ ε/(4s+4) for 0 ≤ r ≤ 2s.

  5. Proof (cont.) • For such n, Pr[X = 0] ≤ ∑_{r=0}^{2s} (−1)^r S(r) ≤ ∑_{r=0}^{2s} (−1)^r μ^r/r! + ε/2 ≤ e^{−μ} + ε. • Similarly, taking the sum to 2s+1 we find n0 so that for n ≥ n0, Pr[X = 0] ≥ e^{−μ} − ε. • As ε was arbitrary, Pr[X = 0] → e^{−μ}. ∎

  6. Let G ~ G(n,p), the random graph, and let EPIT represent the statement that every vertex lies in a triangle. • Theorem 8.3.2 Let c > 0 be fixed and let p = p(n), μ = μ(n) satisfy C(n−1, 2) p^3 = μ and n e^{−μ} = c. Then Pr[G(n,p) ⊨ EPIT] → e^{−c}.
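
A small numeric sketch of this parameterization (my arithmetic, using only the two displayed relations): given n and c it recovers μ and p, and shows p is n^{−2/3} up to a logarithmic factor, which the proof uses below.

```python
# From n * exp(-mu) = c and C(n-1, 2) * p^3 = mu, recover mu and p.
import math

n, c = 10_000, 1.0
mu = math.log(n / c)                       # from n * exp(-mu) = c
p = (mu / math.comb(n - 1, 2)) ** (1 / 3)  # from C(n-1, 2) * p^3 = mu
print(mu, p, n ** (-2 / 3))                # p = n^(-2/3) up to a log factor
```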

  7. Proof. • First fix x ∈ V(G). • For each unordered pair {y, z} ⊆ V(G) − {x} let Bxyz be the event that {x, y, z} is a triangle of G. • Let Cx be the event ∧y,z ¬Bxyz that x lies in no triangle, and let Xx be the corresponding indicator random variable. • We use Janson’s Inequality to bound E[Xx] = Pr[Cx]. Here Pr[Bxyz] = p^3 = o(1) so ε = o(1), and ∑ Pr[Bxyz] = C(n−1, 2) p^3 = μ as defined above.

  8. Proof (cont.) • Dependency xyz ~ xuv occurs if and only if the sets overlap (other than in x). Hence Δ = ∑_{xyz ~ xuv} Pr[Bxyz ∧ Bxuv] = O(n^3 p^5), since there are O(n^3) such pairs and each joint probability is p^5. Since n e^{−μ} = c gives μ = ln(n/c), we have p = n^{−2/3+o(1)} and so Δ = o(1). Thus E[Xx] = Pr[Cx] = e^{−μ+o(1)} = (1 + o(1)) c/n. • Now define X = ∑x Xx, the number of vertices x not lying in a triangle. Then from Linearity of Expectation, E[X] = ∑x E[Xx] = (1 + o(1)) c → c.
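
A Monte Carlo sketch (my illustration, not part of the proof): sample G(n, p) with the parameterization of Theorem 8.3.2 and count the vertices in no triangle; the sample mean of X should be near c, with the fit improving as n grows.

```python
# Sample G(n, p) and count vertices lying in no triangle; compare E[X] to c.
import itertools
import math
import random

n, c = 200, 1.0
mu = math.log(n / c)
p = (mu / math.comb(n - 1, 2)) ** (1 / 3)

def sample_X():
    adj = [set() for _ in range(n)]
    for u, v in itertools.combinations(range(n), 2):
        if random.random() < p:
            adj[u].add(v); adj[v].add(u)
    # x lies in a triangle iff two of its neighbours are adjacent
    def in_triangle(x):
        nb = sorted(adj[x])
        return any(v in adj[u] for u, v in itertools.combinations(nb, 2))
    return sum(not in_triangle(x) for x in range(n))

samples = [sample_X() for _ in range(50)]
print(sum(samples) / len(samples), "vs c =", c)
```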

  9. Proof (cont.) • We need to show that the Poisson Paradigm applies to X. Fix r. Then S(r) = ∑ Pr[Cx1 ∧ … ∧ Cxr], the sum over all sets of vertices {x1,…,xr}. All r-sets look alike, so S(r) = C(n, r) Pr[Cx1 ∧ … ∧ Cxr], where x1,…,xr are some particular vertices. But Cx1 ∧ … ∧ Cxr = ∧ ¬Bxiyz, the conjunction over 1 ≤ i ≤ r and all y, z.

  10. Proof (cont.) • We apply Janson’s Inequality to this conjunction. Again ε = p^3 = o(1). • The number of sets {xi, y, z} is r·C(n−1, 2) − O(n), the overcount coming from those triangles containing two (or three) of the xi. (Here it is crucial that r is fixed.) Thus ∑ Pr[Bxiyz] = rμ − O(n p^3) = rμ + o(1). As before, Δ is p^5 times the number of pairs xiyz ~ xjy′z′. There are O(rn^3) = O(n^3) terms with i = j and O(r^2 n^2) = O(n^2) terms with i ≠ j, so again Δ = o(1). Therefore Pr[Cx1 ∧ … ∧ Cxr] = e^{−rμ+o(1)} = (1 + o(1)) (c/n)^r and S(r) = C(n, r) (1 + o(1)) (c/n)^r → c^r/r!, so Theorem 8.3.1 applies and Pr[G(n,p) ⊨ EPIT] = Pr[X = 0] → e^{−c}. ∎
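
The convergence S(r) → c^r/r! at the end rests on the elementary limit C(n, r)(c/n)^r → c^r/r!; a quick numeric check (my arithmetic):

```python
# C(n, r) * (c/n)^r -> c^r / r! as n grows with r fixed.
import math

c, r = 1.0, 3
for n in (10**2, 10**4, 10**6):
    print(n, math.comb(n, r) * (c / n)**r)
print("limit c^r/r! =", c**r / math.factorial(r))
```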

  11. Large Deviations • Given a point in the probability space (i.e., a selection of R) we call an index set J ⊆ I a disjoint family (abbreviated disfam) if • Bj holds for every j ∈ J, and • for no j, j′ ∈ J is j ~ j′. If, in addition, • whenever j′ ∉ J and Bj′ holds, then j ~ j′ for some j ∈ J, then we call J a maximal disjoint family (abbreviated maxdisfam).
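
The definition can be read algorithmically: greedily scanning the indices whose events hold yields a maxdisfam. A minimal sketch (my own helper, not from the text), with a hypothetical interval-overlap dependency playing the role of ~:

```python
# Greedy extraction of a maximal disjoint family J: pairwise non-dependent
# holding indices such that every other holding index depends on some j in J.
def maxdisfam(holding, dependent):
    """holding: iterable of indices i with B_i true;
       dependent(j, jp): the symmetric relation j ~ jp."""
    J = []
    for i in holding:
        if all(not dependent(i, j) for j in J):
            J.append(i)
    return J

# Toy usage: indices are intervals; i ~ j when the intervals overlap.
intervals = [(0, 2), (1, 3), (4, 6), (5, 7), (8, 9)]
overlap = lambda a, b: (intervals[a][0] < intervals[b][1]
                        and intervals[b][0] < intervals[a][1])
print(maxdisfam(range(len(intervals)), overlap))   # -> [0, 2, 4]
```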

  12. Lemma 8.4.1 With the above notation and for any integer s, Pr[there exists a disfam J, |J| = s] ≤ μ^s/s!. • Proof. • Let ∑* denote the sum over all s-sets J ⊆ I with no j ~ j′. • Let ∑** denote the sum over ordered s-tuples (j1,…,js) with {j1,…,js} forming such a J. • Let ∑*** denote the sum over all ordered s-tuples (j1,…,js).

  13. Proof (cont.) • Pr[there exists a disfam J, |J| = s] ≤ ∑* Pr[Bj1 ∧ … ∧ Bjs] = ∑* ∏ Pr[Bjl] (the events are mutually independent, since no jl ~ jl′) = (1/s!) ∑** ∏ Pr[Bjl] ≤ (1/s!) ∑*** ∏ Pr[Bjl] = (1/s!) (∑_{i∈I} Pr[Bi])^s = μ^s/s!. ∎
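
The final two steps of this chain are the crude bounds: passing from disfams to all ordered s-tuples, whose product sum factors exactly as μ^s. A quick numeric check of that identity (toy probabilities, assumed):

```python
# The sum over ALL ordered s-tuples of prod Pr[B_{j_l}] equals
# (sum_i Pr[B_i])^s = mu^s.
import itertools
import math

p = [0.1, 0.25, 0.05, 0.2]   # Pr[B_i] (toy)
s = 3
mu = sum(p)
ordered = sum(math.prod(t) for t in itertools.product(p, repeat=s))
print(ordered, mu**s)        # identical up to floating-point error
```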

  14. For smaller s we look at the further condition of J being a maxdisfam. To that end we let μs denote the minimum, over all j1,…,js, of ∑ Pr[Bi], the sum taken over all i ∈ I except those i with i ~ jl for some 1 ≤ l ≤ s. • In applications s will be small (otherwise we use Lemma 8.4.1) and μs will be close to μ. For some applications it is convenient to set ν = max_{j∈I} ∑_{i~j} Pr[Bi] and note that μs ≥ μ − sν.
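
A small sketch of the bound μs ≥ μ − sν on toy data (the probabilities and the dependency relation ~ here are assumptions of mine), computing μs exactly by enumeration:

```python
# Toy check of mu_s >= mu - s*nu, with mu_s computed exactly by enumerating
# all choices of {j_1, ..., j_s}.
import itertools

p = [0.1, 0.25, 0.05, 0.2]                   # Pr[B_i] (toy)
dep = {0: {1}, 1: {0, 2}, 2: {1}, 3: set()}  # symmetric relation i ~ j (toy)
s = 2

mu = sum(p)
nu = max(sum(p[i] for i in dep[j]) for j in dep)
mu_s = min(sum(p[i] for i in range(len(p))
               if all(i not in dep[j] for j in J))
           for J in itertools.combinations(range(len(p)), s))
print(mu_s, mu - s * nu)   # here 0.2 >= 0.1, as the inequality promises
```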

  15. Lemma 8.4.2 With the above notation and for any integer s, Pr[there exists a maxdisfam J, |J| = s] ≤ (μ^s/s!) e^{−μs + Δ/2}. • Proof. • As in Lemma 8.4.1 we bound this probability by ∑* Pr[J = {j1,…,js} is a maxdisfam]. For this to occur J must first be a disfam and then ∧ ¬Bi must hold, where the conjunction is over all i ∈ I except those with i ~ jl for some 1 ≤ l ≤ s.

  16. Proof (cont.) • We apply Janson’s Inequality to give an upper bound to Pr[∧ ¬Bi]. The associated values μ′, Δ′ satisfy μ′ ≥ μs and Δ′ ≤ Δ, the latter since Δ′ has simply fewer addends. Thus Pr[∧ ¬Bi] ≤ e^{−μ′ + Δ′/2} ≤ e^{−μs + Δ/2} and Pr[there exists a maxdisfam J, |J| = s] ≤ ∑* ∏ Pr[Bjl] · e^{−μs + Δ/2} ≤ (μ^s/s!) e^{−μs + Δ/2}. ∎

  17. When Δ = o(1) and νμ = o(1) or, more generally, μ3μ = μ + o(1) (i.e., μs = μ + o(1) for all s ≤ 3μ), Lemma 8.4.2 gives a close approximation to the Poisson Distribution, since Pr[there exists a maxdisfam J, |J| = s] ≤ (1 + o(1)) e^{−μ} μ^s/s! for s ≤ 3μ, and the probability is quite small for larger s by Lemma 8.4.1.
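
A numeric sketch of this closing remark, with assumed small values for Δ and ν: the bound of Lemma 8.4.2 then tracks the Poisson point probabilities e^{−μ} μ^s/s! for s ≤ 3μ.

```python
# With Delta and nu*mu small, mu_s is close to mu and the Lemma 8.4.2 bound
# (mu^s/s!) e^{-mu_s + Delta/2} is (1+o(1)) times e^{-mu} mu^s / s!.
import math

mu, Delta, nu = 4.0, 0.01, 0.001      # assumed values
for s in range(int(3 * mu) + 1):
    mu_s = mu - s * nu                # lower bound from slide 14
    bound = mu**s / math.factorial(s) * math.exp(-mu_s + Delta / 2)
    poisson = math.exp(-mu) * mu**s / math.factorial(s)
    print(s, round(bound, 6), round(poisson, 6), round(bound / poisson, 4))
```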
