The Liquid Brain Chrisantha Fernando & Sampsa Sojakka
Motivations • Only 30,000 genes, ≈10¹¹ neurons • Attractor neural networks, Turing machines • Problems with classical models • Often depend on synchronization by a central clock • Particular recurrent circuits need to be constructed for each task • Recurrent circuits are often unstable and difficult to regulate • Lack parallelism • Real organisms cannot wait for convergence to an attractor • Wolfgang Maass invented the Liquid State Machine (a model of the cortical microcircuit), in which he viewed the network as a liquid (or liquid-like dynamical system).
Liquid State Machine (LSM) • Maass' LSM is a spiking recurrent neural network that satisfies two properties • Separation property (liquid) • Approximation property (readout) • LSM features • Only attractor is rest • Temporal integration • Memoryless linear readout map • Universal computational power: can approximate any time-invariant filter with fading memory • Requires no a priori decision about the "neural code" by which information is represented within the circuit.
Maass' Definition of the Separation Property: the current state x(t) of the microcircuit at time t must hold all information about preceding inputs. Approximation Property: the readout must be able to approximate any continuous function f that maps current liquid states x(t) to outputs v(t).
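To make the two properties concrete, here is a minimal toy sketch in Python. It is not Maass' spiking circuit: a fixed random tanh network stands in for the liquid (separation), and a memoryless linear readout fit by least squares stands in for the approximation stage. The network size, the tanh dynamics, and the delay-3 target filter are all illustrative assumptions.

```python
# Toy illustration of the LSM idea (not Maass' spiking model): a fixed
# random recurrent "liquid" provides separation, and a memoryless
# linear readout, fit by least squares, provides approximation.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 500                      # liquid size, number of time steps

# Fixed random liquid: recurrent weights scaled for stable, fading dynamics
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 1, N)

u = rng.uniform(-1, 1, T)            # input stream
x = np.zeros(N)                      # liquid state
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t]) # liquid integrates input over time
    states[t] = x

# Target: a time-invariant filter with fading memory, here a 3-step delay
target = np.roll(u, 3)

# Memoryless linear readout v(t) = w . x(t), fit by least squares
w_out, *_ = np.linalg.lstsq(states[10:], target[10:], rcond=None)
pred = states @ w_out
print("readout MSE:", np.mean((pred[10:] - target[10:]) ** 2))
```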
We took the metaphor seriously and made the real liquid brain shown below. WHY?
BECAUSE. • Real water is computationally efficient: Maass et al. used a small recurrent network of leaky integrate-and-fire neurons, but it was computationally expensive to model and required quite a bit of parameter tweaking. • Exploits real physical properties of water. • Simple local rules, complex dynamics. • Potential for parallel computation applications. • Educational aid: a demonstration of a physical system that does computation. • Contributes to current work on computation in non-linear media, e.g. Adamatzky's database search.
Pattern Recognition in a Bucket • 8 motors, glass tray, overhead projector • Web cam to record footage at 320x240, 5 fps • Frames Sobel-filtered to find edges and averaged to produce 700 outputs • 50 perceptrons in parallel, trained using the p-delta rule
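A sketch of the frame preprocessing above. The slide does not say how 700 outputs were obtained from a 320x240 frame, so block-averaging the Sobel edge image over a 25x28 grid (25 x 28 = 700) is my assumption:

```python
# Sobel edge detection on a grayscale webcam frame, then block
# averaging down to a 700-element feature vector (25x28 grid assumed).
import numpy as np
from scipy.ndimage import sobel

def frame_to_features(frame, grid=(25, 28)):
    """frame: 2-D uint8 array (a 240x320 grayscale webcam frame)."""
    f = frame.astype(float)
    # Gradient magnitude from horizontal and vertical Sobel responses
    edges = np.hypot(sobel(f, axis=0), sobel(f, axis=1))
    gy, gx = grid
    h, w = edges.shape
    # Mean edge strength in each cell of a gy x gx grid -> 700 outputs
    cells = edges[: h - h % gy, : w - w % gx]
    cells = cells.reshape(gy, h // gy, gx, w // gx).mean(axis=(1, 3))
    return cells.ravel()  # feature vector of length gy*gx = 700

features = frame_to_features(
    np.random.randint(0, 256, (240, 320), dtype=np.uint8))
print(features.shape)  # (700,)
```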
• 2 motors, 1 minute of footage of each case, 3400 frames • Readouts could utilize wave interference patterns
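The readouts are the 50 parallel perceptrons from the setup slide, trained with the p-delta rule. Below is a simplified sketch of that scheme: it keeps the committee vote and the error-driven update of the p-delta rule (Auer, Burgsteiner & Maass), but omits the margin-widening and weight-normalization terms of the full rule.

```python
# Simplified parallel-perceptron readout in the spirit of the p-delta
# rule: 50 perceptrons vote, and on a committee error each perceptron
# that voted the wrong way is nudged toward the correct label.
import numpy as np

rng = np.random.default_rng(1)
n_perceptrons, n_features, eta = 50, 700, 0.01
W = rng.normal(0, 0.1, (n_perceptrons, n_features))

def predict(W, x):
    votes = np.sign(W @ x)           # each perceptron votes -1 or +1
    return 1 if votes.sum() >= 0 else -1

def pdelta_step(W, x, y):
    """One update on sample x with label y in {-1, +1}."""
    if predict(W, x) != y:
        wrong = np.sign(W @ x) != y  # perceptrons voting against the label
        W[wrong] += eta * y * x      # nudge them toward the correct side
    return W

# Usage on toy data; x would be the 700 Sobel-averaged frame features
for _ in range(100):
    x = rng.normal(size=n_features)
    y = 1 if x[0] > 0 else -1        # toy target
    W = pdelta_step(W, x, y)
```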
Objective: Robust spatiotemporal pattern recognition in a noisy environment • 20+20 samples of 12 kHz pulse-code-modulated wave files ("zero" and "one"), 1.5-2 seconds in length • Short-Time Fourier Transform on the active frequency range (1-3000 Hz) to create an 8x8 matrix of inputs from each sample (8 motors, 8 time slices) • Each sample drives the motors for 4 seconds, one after the other
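A sketch of this preprocessing, assuming SciPy's STFT with a 256-sample window and uniform binning of the magnitude spectrogram into the 8x8 grid; the slide fixes only the 1-3000 Hz band and the 8x8 shape. The filename in the usage line is hypothetical.

```python
# STFT of a 12 kHz "zero"/"one" recording, magnitudes in the 1-3000 Hz
# band binned into 8 frequency bands x 8 time slices, one row per motor.
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft

def sample_to_motor_matrix(path, bands=8, slices=8, fmax=3000.0):
    fs, audio = wavfile.read(path)            # e.g. 12 kHz PCM wave file
    if audio.ndim > 1:
        audio = audio.mean(axis=1)            # mix to mono if stereo
    f, t, Z = stft(audio.astype(float), fs=fs, nperseg=256)
    mag = np.abs(Z)[(f > 0) & (f <= fmax)]    # keep the active band
    # Mean magnitude in each of bands x slices cells
    out = np.zeros((bands, slices))
    fi = np.linspace(0, mag.shape[0], bands + 1).astype(int)
    ti = np.linspace(0, mag.shape[1], slices + 1).astype(int)
    for i in range(bands):
        for j in range(slices):
            out[i, j] = mag[fi[i]:fi[i+1], ti[j]:ti[j+1]].mean()
    return out  # out[i, j] drives motor i during time slice j

# m = sample_to_motor_matrix("one_01.wav")  # hypothetical filename
```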
[Figure: liquid wave patterns produced by the "One" and "Zero" inputs]
Conclusion • Properties of a natural dynamical system (water) can be harnessed to solve non-linear pattern recognition problems. • A set of simple linear readouts suffices. • No tweaking of parameters required. • Further work will explore neural networks that exploit the epigenetic self-organising physical properties of materials.
Acknowledgements • Inman Harvey • Phil Husbands • Ezequiel Di Paolo • Emmet Spier • Bill Bigge • Aisha Thorn • Hanneke De Jaegher • Mike Beaton • Sally Milwidsky