Computers - Using your Brains
Jim Austin, Professor of Neural Computation
So how complex is it?
• 10¹² neurons … 1,000,000,000,000
• About 1,000 connections per neuron.
• One brain can hold ... 1,000,000,000,000,000 numbers (10¹² neurons × 1,000 connections = 10¹⁵)!
What do 10¹² neurons look like?
• About 160 times the population of the world (6,100,000,000)
• 78,125 times the complexity of the Pentium III
• Equal to the number of stars in our galaxy
[Figure: a 4 meter × 4 meter × 4 meter cube of sand]
The good and the bad
What are computers good at?
• Adding up fast
• Storing data - numbers and facts
• Pushing data around
What are computers bad at?
• Being reliable
• Finding information - knowledge
• Doing very complex things - recognizing images
• Learning to do the job themselves!
Neurons versus Gates
[Figure: a NAND gate with Input 1 and Input 2 feeding one Output]
Boolean logic - when both inputs are OK, the output is not OK.
Gates - NAND
[Figure: NAND truth table - ALL inputs must be OK for the output to be NOT OK]
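For readers who want the Boolean view spelled out concretely, here is a minimal NAND gate in Python; the function name and truth-table loop are purely illustrative.

```python
def nand(a: bool, b: bool) -> bool:
    """NAND: the output is 'not OK' (False) only when BOTH inputs are OK (True)."""
    return not (a and b)

# Truth table: the output is False only for the all-True input.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", nand(a, b))
```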
Evolution? Should have picked a NAND gate for the brain...
Neuron
Output = threshold(input A × weight A + input B × weight B)
[Figure: inputs A and B feed the output through their "weights"]
Threshold logic - with a threshold of 1, one or more inputs OK gives an output OK.
Neuron
[Figure: threshold examples - at least one input OK for the output to be OK; at least three OKs for the output to be OK]
Can also alter the connections/importance of inputs using the weights on the inputs.
[Figure: inputs 1, 0, 1, 1 feed the neuron through weights 3.5, 0.5, 1, 1]
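A minimal Python sketch of the threshold-logic neuron described above. The function name is illustrative; the four weights come from the slide's figure, while the threshold values used in the calls are assumed for the sake of the example.

```python
def neuron(inputs, weights, threshold):
    """Threshold logic: output 1 when the weighted sum of the inputs reaches the threshold.
    Equivalent to: output = threshold(input A x weight A + input B x weight B + ...)."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# With all weights equal to 1 and a threshold of 1 the neuron acts like an OR gate:
# one or more inputs OK makes the output OK.
print(neuron([0, 1], [1, 1], threshold=1))                    # -> 1

# Weights alter the importance of each input (weights taken from the slide's figure;
# the threshold of 4 is an assumed value for illustration).
print(neuron([1, 0, 1, 1], [3.5, 0.5, 1, 1], threshold=4))    # 3.5 + 1 + 1 = 5.5 -> 1
```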
Why did this difference develop?
• "The analysis of the operation of a machine using two indication elements and signals can conveniently be expressed in terms of a diagrammatic notation introduced, in this context, by von Neumann and extended by Turing. This was adopted from a notation used by Pitts and McCulloch as a possible way of analyzing the operation of the nervous system, …" Calculating Instruments & Machines, D. Hartree, 1950, Cambridge University Press.
• Probably dropped due to the development of the silicon chip - simpler to build Boolean logic gates than neuron units.
Functional elements
[Figure: a threshold-n gate with k inputs and output z; with threshold 1 it gives excitation "OR", with threshold 2 excitation "AND"]
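The functional element above generalises the earlier examples: a threshold-n gate with k inputs reduces to excitation "OR" when n = 1 and to "AND" when n equals the number of inputs. A small illustrative sketch (names are assumptions):

```python
def threshold_gate(inputs, n):
    """Threshold-n gate: output z is 1 when at least n of the k inputs are 1."""
    return 1 if sum(inputs) >= n else 0

x = [1, 0, 1]                         # k = 3 inputs, two of them on
print(threshold_gate(x, n=1))         # excitation "OR": any input on   -> 1
print(threshold_gate(x, n=len(x)))    # excitation "AND": all inputs on -> 0 here
```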
ICT Orion Computer • Used ‘Neuron’ logic - 1962
Learning!
• Learning at the neuron level = adjustment of which inputs are important.
• Conventional computers have no implicit learning ability.
[Figures: a learning example - a neuron with inputs "Happy" and "Hungry" and threshold = 2]
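One simple way to realise "adjustment of which inputs are important" is a perceptron-style weight update. The sketch below is an illustrative assumption, not the training rule used in the slides or on the Cortex hardware; the Happy/Hungry inputs and the threshold of 2 come from the figures, while the target behaviour (fire only when both are set) is assumed for the example.

```python
def output(inputs, weights, threshold=2):
    """Fire (1) when the weighted sum of the inputs reaches the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# Inputs are (Happy, Hungry). Assume we want the neuron to fire only when both are 1.
training = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]
for _ in range(10):                                   # a few passes over the examples
    for inputs, target in training:
        error = target - output(inputs, weights)
        # Perceptron-style rule: raise the weights of inputs that should have
        # contributed, lower those that fired when they should not have.
        weights = [w + 0.5 * error * i for w, i in zip(weights, inputs)]

print(weights)                                        # learned importance of Happy, Hungry
print([output(x, weights) for x, _ in training])      # -> [0, 0, 0, 1]
```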
Can we build useful systems with neurons?
• Better tolerance to failure - parallelism / use of threshold logic / distributed memory
• Faster operation - massive parallelism
• Better access to uncertain information - threshold logic / neurons
• Where the inputs are uncertain - threshold logic / neurons
• Where we want low power - asynchronous systems
• Adaptability - use of weights and learning methods
So what have we done with these?
Cortex-1
• 28 processor cards, each holding 128 hardware neurons.
• Each with 1,000,000,000 weights.
• 16 MHz.
• PCI-based card.
Complete machine:
• 400,000,000 neuron evaluations per second
• 28,000 inputs
• 30 bits set on input
• 1,000,000 neurons
Cortex-1 node 5,120,000,000 neuron weights, 640 neurons.
Text search engines • Tolerant to spelling errors. • Finds similar words to those supplied, for example chair, seat, bench. • Learns these similarities automatically from text. • Uses neural engine for document storage. • Estimated 400,000,000 documents searched per second.
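The slides do not spell out how the spelling-error tolerance is achieved; one hypothetical way to illustrate the idea is to compare words by overlapping character bigrams rather than exact equality. The sketch below is an assumption for illustration only, not the engine described above.

```python
def bigrams(word):
    """Character bigrams of a word, e.g. 'chair' -> {'ch', 'ha', 'ai', 'ir'}."""
    w = word.lower()
    return {w[i:i + 2] for i in range(len(w) - 1)}

def similarity(a, b):
    """Dice coefficient on bigram sets: 1.0 for identical words, 0.0 for no overlap."""
    ba, bb = bigrams(a), bigrams(b)
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# A misspelled query still scores highly, so 'chaire' can still find 'chair'.
print(round(similarity("chair", "chaire"), 2))   # ~0.89
print(round(similarity("chair", "bench"), 2))    # 0.25, an unrelated word scores low
```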
Molecular Databases
• One of the few systems that deals with the full 3D molecule
[Figure: a query molecule alongside good matches and one bad match]
Thanks... Aaron Turner, Mick Turner, Vicky Hodge, Julian Young, Anthony Moulds, Zyg Ulanowski, Ken Lees, Michael Weeks, Sujeewa Alwis, John Kennedy, David Lomas and many others ….
(It's Brains from Thunderbirds!)