Network Theory and Dynamic Systems: Information Cascades - Bayes
Prof. Dr. Steffen Staab
Terminology
• P[A]: prior probability (also called the marginal probability)
• P[A|B]: posterior probability
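The rule itself did not survive the slide extraction; for reference, a standard statement of Bayes' rule in the P[.] notation used on these slides:

```latex
% Bayes' rule, with the marginal P[B] expanded over the cases A'
\[
  P[A \mid B] \;=\; \frac{P[B \mid A]\,P[A]}{P[B]},
  \qquad
  P[B] \;=\; \sum_{A'} P[B \mid A']\,P[A']
\]
```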
Example for Bayes' rule
• Eyewitness to an accident involving a taxi
• 80% of taxis are black: P[true=B] = 0.8
• 20% of taxis are yellow: P[true=Y] = 0.2
• The eyewitness is unreliable:
  • P[report=Y|true=Y] = 0.8
  • implies P[report=B|true=Y] = 0.2
  • P[report=B|true=B] = 0.8
  • implies P[report=Y|true=B] = 0.2
Putting it together
Computing the marginal probability P[report=Y]:
P[report=Y] = P[report=Y|true=Y]·P[true=Y] + P[report=Y|true=B]·P[true=B]
            = 0.8·0.2 + 0.2·0.8 = 0.32
Putting everything together with Bayes' rule:
P[true=Y|report=Y] = P[report=Y|true=Y]·P[true=Y] / P[report=Y]
                   = (0.8·0.2) / 0.32 = 0.5
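A minimal Python sketch of this computation, assuming the priors and reliabilities from the taxi example (the function and parameter names are illustrative, not from the slides):

```python
def posterior_yellow(prior_y, prior_b, p_rep_y_given_y, p_rep_y_given_b):
    """Bayes' rule: P[true=Y | report=Y] for the taxi example."""
    # Marginal probability that the witness reports "yellow"
    marginal = p_rep_y_given_y * prior_y + p_rep_y_given_b * prior_b
    return (p_rep_y_given_y * prior_y) / marginal

# 80% black taxis, 20% yellow; the witness is right 80% of the time
print(posterior_yellow(prior_y=0.2, prior_b=0.8,
                       p_rep_y_given_y=0.8,   # P[report=Y | true=Y]
                       p_rep_y_given_b=0.2))  # P[report=Y | true=B]
# -> 0.5
```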
Side remarks
P[true=Y | report=Y] is only 0.5.
False positives heavily influence this result, and most people do not expect such a heavy influence by false positives. Especially in medical treatment this has been shown to be highly problematic, because doctors are equally bad at handling such conditional probabilities (cf. research by psychologists, Gerd Gigerenzer and team).
Bayes' rule in the herding experiment
• Individual objective: to be rewarded
• Guess "blue" if and only if
  P[majority-blue | what was seen and heard] > ½
• How?
• Priors:
  P[majority-blue] = P[majority-red] = 0.5
• Posteriors:
  P[blue|majority-blue] = P[red|majority-red] = 2/3
First student
P[majority-blue | blue] = P[blue|majority-blue]·P[majority-blue] / P[blue]
                        = (2/3 · 1/2) / (1/2) = 2/3
Prior: 1/2; marginal: P[blue] = 2/3·1/2 + 1/3·1/2 = 1/2; posterior: 2/3
Second student – assuming the first said "blue"
• Trusting that student 1 behaves rationally
• New priors:
  P[majority-blue] = 2/3
  P[majority-red] = 1/3
• Posteriors remain unchanged:
  P[blue|majority-blue] = P[red|majority-red] = 2/3
Prior: 2/3; marginal: P[blue] = 2/3·2/3 + 1/3·1/3 = 5/9
P[majority-blue | blue] = (2/3 · 2/3) / (5/9) = (4/9) / (5/9) = 4/5 = 0.8
Second student – assuming the first reported truthfully
• Alternative way of modeling the same problem
• Looking for P[majority-blue | blue, blue]
The two draws are conditionally independent events, so:
P[blue, blue | majority-blue] = 2/3 · 2/3 = 4/9, with prior P[majority-blue] = 1/2
Marginal: P[blue, blue] = 4/9 · 1/2 + 1/9 · 1/2 = 5/18
P[majority-blue | blue, blue] = (4/9 · 1/2) / (5/18) = (4/18) / (5/18) = 4/5
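Both modeling routes can be checked with a few lines of Python. This sketch (the names are illustrative, not from the slides) computes the posterior for any sequence of observed colors under the 2/3-majority urn:

```python
from functools import reduce

# P[blue | majority-blue] = 2/3, P[blue | majority-red] = 1/3
P_SIGNAL = {"majority-blue": {"blue": 2/3, "red": 1/3},
            "majority-red":  {"blue": 1/3, "red": 2/3}}

def posterior_majority_blue(signals, prior_blue=0.5):
    """P[majority-blue | signals], with conditionally independent signals."""
    like_blue = reduce(lambda acc, s: acc * P_SIGNAL["majority-blue"][s], signals, 1.0)
    like_red  = reduce(lambda acc, s: acc * P_SIGNAL["majority-red"][s], signals, 1.0)
    joint_blue = like_blue * prior_blue
    joint_red  = like_red * (1 - prior_blue)
    return joint_blue / (joint_blue + joint_red)

print(posterior_majority_blue(["blue"]))          # first student: 2/3
print(posterior_majority_blue(["blue", "blue"]))  # second student: 4/5
```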
Third student – assuming red (after two blue)
• Looking for P[majority-blue | blue, blue, red]
P[blue, blue, red | majority-blue] = 2/3 · 2/3 · 1/3 = 4/27
P[blue, blue, red | majority-red] = 1/3 · 1/3 · 2/3 = 2/27
P[majority-blue | blue, blue, red] = (4/27 · 1/2) / (4/27 · 1/2 + 2/27 · 1/2) = 4/6 = 2/3 > ½
So the third student guesses "blue" even though their own signal is red: a cascade has started.
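The same illustrative helper from the sketch above reproduces this step:

```python
print(posterior_majority_blue(["blue", "blue", "red"]))  # third student: 2/3
```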
Simple, General Cascade Model
• Group of people (numbered 1, 2, 3, ...) sequentially making decisions
• Each person: accepting or rejecting an option
  • Adopt a technology
  • Wear a new fashion
  • Eat in a specific restaurant
  • Commit a crime
  • Vote for a political party
  • Choose a holiday destination
  • ...
Simple, General Cascade Model - Ingredients
• State of the world: an initial random, unobservable event determines whether accepting or rejecting is better
  • G: accepting is good
  • B: accepting is bad
• Priors: P[G] = p, P[B] = 1-p
Simple, General Cascade Model - Ingredients
• State of the world: G, B; priors: P[G] = p, P[B] = 1-p
• Payoffs:
  • Payoff for rejecting: 0
  • Payoff for accepting:
    • If G then the payoff is vg, where vg > 0
    • If B then the payoff is vb, where vb < 0
  • Expected payoff is initially 0: p·vg + (1-p)·vb = 0
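As a small worked consequence of the zero-expected-payoff assumption (not spelled out on the slide), the prior is pinned down by the payoffs:

```latex
% p * vg + (1-p) * vb = 0, solved for p
\[
  p\,v_g + (1-p)\,v_b = 0
  \quad\Longrightarrow\quad
  p = \frac{-v_b}{v_g - v_b},
  \qquad\text{e.g. } v_g = 1,\; v_b = -1 \;\Rightarrow\; p = \tfrac{1}{2}.
\]
```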
Simple, General Cascade Model - Ingredients
• State of the world: G, B; priors: P[G] = p, P[B] = 1-p
• Payoffs: vg, vb, with p·vg + (1-p)·vb = 0
• Signals: modeling private, but uncertain information
  • High signal: H, suggesting that accepting is good
  • Low signal: L, suggesting that accepting is bad
  • If G then high signals are more frequent than low signals:
    P[H|G] = q > ½ and P[L|G] = 1-q < ½
  • If B then high signals are less frequent than low signals:
    P[L|B] = q > ½ and P[H|B] = 1-q < ½
• Probability matrix:
        H     L
  G     q     1-q
  B     1-q   q
Individual Decisions - General
• State of the world: G, B, P[G] = p
• Payoffs: vg, vb
• Signals: H, L
Individual decision after the first signal:
P[G|H] = P[H|G]·P[G] / P[H] = p·q / (p·q + (1-p)·(1-q)) > p, since q > ½
So after a single high signal, accepting has positive expected payoff and the person follows their signal.
Individual Decisions – Multiple Signals
• State of the world: G, B, P[G] = p
• Payoffs: vg, vb
• Signals: H, L
S: a sequence with a many H signals and b many L signals
Hypotheses to be verified:
• If a > b then P[G|S] > p
• If a < b then P[G|S] < p
• If a = b then P[G|S] = p
Individual Decisions – Multiple Signals
• State of the world: G, B, P[G] = p
• Payoffs: vg, vb
• Signals: H, L
S: a sequence with a many H signals and b many L signals
Because of conditional independence, the probabilities multiply:
P[S|G] = q^a · (1-q)^b
P[S|B] = (1-q)^a · q^b
Individual Decisions – Multiple Signals
• State of the world: G, B, P[G] = p
• Payoffs: vg, vb
• Signals: H, L
S: a sequence with a many H signals and b many L signals
Looking for P[G|S]. Using the previous slide:
P[G|S] = P[S|G]·P[G] / P[S]
       = p·q^a·(1-q)^b / (p·q^a·(1-q)^b + (1-p)·(1-q)^a·q^b)
Individual Decisions – Multiple Signals
• State of the world: G, B, P[G] = p
• Payoffs: vg, vb
• S with a H signals and b L signals
Compare p = P[G] with P[G|S]:
• If a > b then p < P[G|S], because q > ½ > 1-q, i.e., P[G|S] > p = P[G]
• If a < b then p > P[G|S], i.e., P[G|S] < p = P[G]
• If a = b then P[G|S] = p = P[G]
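A short Python check of these three cases, assuming the signal model above (the function name and default values are illustrative):

```python
def p_good_given_signals(a, b, p=0.5, q=2/3):
    """P[G | S] for a sequence S with a high (H) and b low (L) signals."""
    # Conditionally independent signals, so the likelihoods multiply
    like_g = q**a * (1 - q)**b      # P[S | G]
    like_b = (1 - q)**a * q**b      # P[S | B]
    return like_g * p / (like_g * p + like_b * (1 - p))

print(p_good_given_signals(3, 1))  # a > b: posterior above the prior 0.5
print(p_good_given_signals(1, 3))  # a < b: posterior below the prior
print(p_good_given_signals(2, 2))  # a = b: posterior equals the prior
```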
Sequential Decision Making and Cascades
• Person 1 follows their private signal
• Person 2 gets two signals:
  • a clear one from person 1
  • their own
• Person 3 has three independent, clear signals
• Person 3 will follow the majority vote
Long-term implications
For a cascade never to start, there may never be three identical signals in a row. However, the probability of having three identical signals goes to 1 as we have more and more decisions:
• For three people in a row, the probability of having three identical signals is q³ + (1-q)³
• For 3N people, the aggregated probability of never having three identical signals in a row is (1 - q³ - (1-q)³)^N, which gets as small as you want if you make N large enough
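A quick numeric illustration (the sample values are chosen for illustration, not from the slides): the probability that no block of three consecutive people shares the same signal shrinks geometrically in N:

```python
def p_no_triple(q, n_blocks):
    """(1 - q^3 - (1-q)^3)^N: probability that none of N disjoint
    blocks of three consecutive people gets three identical signals."""
    p_triple = q**3 + (1 - q)**3
    return (1 - p_triple) ** n_blocks

for n in (1, 10, 100):
    print(n, p_no_triple(q=2/3, n_blocks=n))
# For q = 2/3: p_triple = 1/3, so the probability falls like (2/3)^N
```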
Lessons from Cascades
• Cascades can be wrong:
  wrong choices made initially because of randomly incorrect signals may start a cascade
• Cascades can be based on very little information:
  people ignore their private information once a cascade starts
• Cascades are fragile:
  as they start with little information, they can also be stopped with little information
  e.g., someone receiving two private signals may decide to let them overrule the two other signals that started the cascade