An overview of the Golden Ratio: its history, its connection to logarithmic spirals and golden rectangles, and its role in neuron models, spike-train coding, and optimal neural information processing.
Golden Ratio and Optimal Neural Information
Reprint/Preprint download at: http://www.math.unl.edu/~bdeng
Intro
• φ + φ² = 1, where φ is the Golden Ratio.
• (Figure: a segment of length 1 divided into pieces of length φ and φ², so that 1 : φ = φ : φ².)
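For completeness, solving the defining relation gives the numerical value of φ (a standard derivation, not shown on the original slide):

```latex
\varphi + \varphi^2 = 1
\;\Longrightarrow\;
\varphi = \frac{\sqrt{5}-1}{2} \approx 0.6180,
\qquad
\frac{1}{\varphi} = \frac{\varphi}{1-\varphi} = \frac{1+\sqrt{5}}{2}.
```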
Intro (continued)
(Figure: a golden rectangle subdivided into pieces of length 1, φ, φ², φ³.)
• The Pythagoreans (570 – 500 B.C.) were the first to know that the Golden Ratio is an irrational number.
• Euclid (300 B.C.) gave it its first clear definition as 'the extreme and mean ratio'.
• Pacioli (1445 – 1517) popularized the Golden Ratio outside the math community with his book 'The Divine Proportion'.
• Kepler (1571 – 1630) discovered that the ratios of consecutive Fibonacci numbers converge to the Golden Ratio.
• Jacques Bernoulli (1654 – 1705) made the connection between the logarithmic spiral and the golden rectangle.
• Binet (1786 – 1856) is credited with the closed-form Binet formula F_n = [((1+√5)/2)^n − ((1−√5)/2)^n] / √5 for the Fibonacci numbers.
• Ohm (1835) was the first to use the term 'Golden Section'.
Neuron Models
(Figure: spike-train time series from neuron models, showing bursts of durations 1T and 3T; horizontal axis: time.)
References: Rinzel & Wang (1997); Bechtereva & Abdullaev (2000).
SEED Tuning — Spike Excitation Encoding & Decoding (SEED) Implementation
(Diagram: Signal → Encode → Channel → Decode; example burst counts 3 2 4 3 3 2 2 1 1 3 …; a mistuned case is shown for comparison.)
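A minimal sketch of the encode/decode idea as read from the diagram (my own illustration, not the authors' implementation; the function names are hypothetical): each symbol k is sent as a burst of k isospikes, and the decoder recovers the symbol by counting spikes per burst.

```python
def seed_encode(symbols):
    """Encode each symbol k as a burst of k spikes ('1'), with a silent gap ('0') between bursts."""
    return "0".join("1" * k for k in symbols)

def seed_decode(train):
    """Decode by counting the spikes in each burst."""
    return [len(burst) for burst in train.split("0")]

message = [3, 2, 4, 3, 3, 2, 2, 1, 1, 3]   # burst counts from the slide
spike_train = seed_encode(message)
assert seed_decode(spike_train) == message
```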
Bit Rate — Entropy

Information System
• Alphabet: A = {0, 1}. Message: s = 11100101…
• Information system: an ensemble of messages, characterized by the symbol probabilities P({0}) = p_0, P({1}) = p_1.
• In general, if A = {0, …, n−1} with P({0}) = p_0, …, P({n−1}) = p_{n−1}, then each symbol on average contains
  E(p) = (−p_0 ln p_0 − … − p_{n−1} ln p_{n−1}) / ln 2
  bits of information, called the entropy.
• Example: A = {0, 1} with equal probabilities P({0}) = P({1}) = 0.5, message …011100101… Then each symbol contains E = ln 2 / ln 2 = 1 bit of information.

Entropy
• The probability of a particular message s_0 … s_{n−1} is p_{s_0} ⋯ p_{s_{n−1}} = p_0^(# of 0s) · p_1^(# of 1s), where (# of 0s) + (# of 1s) = n.
• The average symbol probability of a typical message is
  (p_{s_0} ⋯ p_{s_{n−1}})^(1/n) = p_0^((# of 0s)/n) · p_1^((# of 1s)/n) ≈ p_0^p_0 · p_1^p_1.
• Write p_0 = (1/2)^(log_{1/2} p_0) = (1/2)^(−ln p_0 / ln 2) and p_1 = (1/2)^(log_{1/2} p_1) = (1/2)^(−ln p_1 / ln 2). Then the average symbol probability of a typical message is
  (p_{s_0} ⋯ p_{s_{n−1}})^(1/n) ≈ p_0^p_0 · p_1^p_1 = (1/2)^((−p_0 ln p_0 − p_1 ln p_1)/ln 2) := (1/2)^E(p).
• By definition, the entropy of the system is E(p) = (−p_0 ln p_0 − p_1 ln p_1) / ln 2, in bits per symbol.
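A small Python check of these formulas (my own illustration, not part of the slides): it computes the entropy E(p) and verifies that the average symbol probability of a typical message equals (1/2)^E(p).

```python
import math

def entropy_bits(p):
    """Shannon entropy E(p) = (-sum p_k ln p_k) / ln 2, in bits per symbol."""
    return sum(-q * math.log(q) for q in p if q > 0) / math.log(2)

# Binary alphabet A = {0, 1} with equal probabilities: 1 bit per symbol.
print(entropy_bits([0.5, 0.5]))                     # 1.0

# Typical-message check: p0**p0 * p1**p1 equals (1/2)**E(p), as derived above.
p0, p1 = 0.2, 0.8
avg_symbol_prob = p0**p0 * p1**p1
print(avg_symbol_prob, 0.5**entropy_bits([p0, p1]))  # the two values agree
```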
Bit Rate — Golden Ratio Distribution
(Figure: SEED isospike bursts of durations 1T and 3T along the time axis.)

SEED Encoding
• Sensory input alphabet: S_n = {A_1, A_2, …, A_n} with probabilities {p_1, …, p_n}.
• Isospike encoding: E_n = {burst of 1 isospike, …, burst of n isospikes}.
• Message: SEED isospike trains …
• Ideal situation: (1) each spike takes up the same amount of time, T; (2) zero inter-spike transition.
• Then the average time per symbol is T_ave(p) = T p_1 + 2T p_2 + … + nT p_n, and the bit rate per unit time is r_n(p) = E(p) / T_ave(p).

Theorem (Golden Ratio Distribution). For each n ≥ 2,
  r_n* = max{ r_n(p) : p_1 + p_2 + … + p_n = 1, p_k ≥ 0 } = −ln p_1 / (T ln 2),
attained when p_k = p_1^k and p_1 + p_1^2 + … + p_1^n = 1. In particular, for n = 2, p_1 = φ and p_2 = φ². In addition, p_1(n) → 1/2 as n → ∞.
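A numerical sanity check of the n = 2 case (my own sketch, not from the slides): scan p_1 for the maximum of r_2(p) and compare against the theorem's values φ, φ², and −ln φ / (T ln 2).

```python
import math

PHI = (math.sqrt(5) - 1) / 2      # golden ratio, phi + phi**2 = 1
T = 1.0                           # time per spike (arbitrary units)

def bit_rate(p1, T=T):
    """r_2(p) = E(p) / T_ave(p) for n = 2, with p2 = 1 - p1."""
    p2 = 1.0 - p1
    E = (-p1 * math.log(p1) - p2 * math.log(p2)) / math.log(2)
    T_ave = T * p1 + 2 * T * p2
    return E / T_ave

# Grid search for the maximizing distribution over p1 in (0, 1).
best_p1 = max((i / 100000 for i in range(1, 100000)), key=bit_rate)
print(best_p1, PHI)                                        # ~0.618 = phi
print(1 - best_p1, PHI**2)                                 # ~0.382 = phi^2
print(bit_rate(best_p1), -math.log(PHI) / (T * math.log(2)))  # r_2* = -ln(phi)/(T ln 2)
```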
Bit Rate — Generalized Golden Ratio Distribution
• The same optimization can be carried out when the symbol encoded by k isospikes occupies an arbitrary duration T_k.
• Special case: T_k / T_1 = k (equal spike times, zero transitions), which recovers the Golden Ratio Distribution above.
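The generalized statement is only partially recoverable from the slide. Assuming it follows the classical noiseless-channel capacity result (the maximizing distribution has p_k = 2^(−r* T_k), with r* the root of Σ_k 2^(−r* T_k) = 1), a numerical sketch is:

```python
import math

def optimal_distribution(durations, tol=1e-12):
    """Maximize E(p) / sum(p_k * T_k) over distributions p for symbol durations T_k.
    Under the classical noiseless-channel result, the optimum is p_k = 2**(-r * T_k),
    with r chosen by bisection so that the p_k sum to 1."""
    lo, hi = 0.0, 64.0
    while hi - lo > tol:
        r = (lo + hi) / 2
        if sum(2 ** (-r * t) for t in durations) > 1.0:
            lo = r          # probabilities sum to more than 1: increase r
        else:
            hi = r
    r = (lo + hi) / 2
    return r, [2 ** (-r * t) for t in durations]

# Special case T_k = k*T recovers the Golden Ratio Distribution: p_k = p1**k.
r, p = optimal_distribution([1, 2])
print(r, p)                       # p ~ [0.618, 0.382] = [phi, phi^2]
r, p = optimal_distribution([1, 2, 3, 4, 5])
print(p[0])                       # p1 moves toward 1/2 as n grows
```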
Golden Sequence
(Figure: tiling with P{fat tile} → φ and P{thin tile} → φ².)

Golden Sequence (rule: 1 → 10, 0 → 1):

  sequence                   # of 1s (F_n)   # of 0s (F_{n-1})   total (F_n + F_{n-1} = F_{n+1})
  1                          1               0                   1
  10                         1               1                   2
  101                        2               1                   3
  10110                      3               2                   5
  10110101                   5               3                   8
  1011010110110              8               5                   13
  101101011011010110101      13              8                   21

• (# of 1s) / (# of 0s) = F_n / F_{n-1} → 1/φ, with F_{n+1} = F_n + F_{n-1}.
• Distribution: 1 = F_n / F_{n+1} + F_{n-1} / F_{n+1}, hence p_1 → φ and p_0 → φ².
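A short Python sketch of the substitution rule and the limiting distribution (my own illustration; the function name is hypothetical):

```python
import math

PHI = (math.sqrt(5) - 1) / 2

def golden_sequence(iterations):
    """Generate the golden (Fibonacci) sequence by the rule 1 -> 10, 0 -> 1."""
    s = "1"
    for _ in range(iterations):
        s = "".join("10" if c == "1" else "1" for c in s)
    return s

s = golden_sequence(20)
ones, zeros = s.count("1"), s.count("0")
print(zeros / ones, PHI)       # (# of 0s)/(# of 1s) -> phi, i.e. (# of 1s)/(# of 0s) -> 1/phi
print(ones / len(s), PHI)      # p_1 = F_n / F_{n+1} -> phi
print(zeros / len(s), PHI**2)  # p_0 = F_{n-1} / F_{n+1} -> phi^2
```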