NEU Neural Computing
MSc Natural Computation
Department of Computer Science, University of York
Module description
• “provides a foundation of theoretical and practical knowledge in the subject of neural systems”
• Algorithms inspired by natural neural systems
• Biological (natural) neural systems and the principal artificial neural architectures
• Emphasis will be on the characterisation of the artificial systems, rather than the analysis of their properties in statistical terms
• …so no statistical learning theory!
Learning outcomes
On completion of this module students will be able to:
• Identify which neural system is suitable for a particular task.
• Design, implement and experiment with neural architectures for a particular task.
• Design appropriate encodings of data.
• Evaluate the application of a particular architecture to a given problem.
Who is it aimed at?
• A basic computer science background in algorithms and complexity will be assumed
• No biological background is necessary
• There will be some discussion later of “realistic” neuron models, but not in depth
Level of mathematics required
• Calculus, matrices and vectors
• If you can follow these (for instance, the sort of expression sketched below), that’s good
• If you can’t, you’ll miss out on some bits of the theory
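To give a feel for that level – this is an illustrative example, not material from the module – the kind of expression you should be comfortable reading is a layer written with a weight matrix, and a gradient-descent weight update obtained from a partial derivative:

```latex
% Illustrative only -- representative of the level assumed, not the module's own notation
\[
  \mathbf{y} = f(W\mathbf{x} + \mathbf{b}),
  \qquad
  E = \tfrac{1}{2}\sum_k (t_k - y_k)^2,
  \qquad
  w_{ij} \leftarrow w_{ij} - \eta\,\frac{\partial E}{\partial w_{ij}}
\]
```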
Content 1: Biological networks
[Figure: the cerebellar network]
Content 2: Feed-forward networks
• Start with the simplest system – one neuron performing one operation – what can it do? (A minimal sketch follows below.)
• We can make more complex arrangements of neurons, in which we have layers with connections from one layer to the next – what does this add to their capabilities?
• We can also change the operation of the neuron
• How do we decide on the architecture for a given problem?
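As a purely illustrative sketch of that starting point (not code from the module, and with arbitrary weights), a single sigmoid neuron and a small two-layer feed-forward pass take only a few lines of plain MATLAB:

```matlab
% Illustrative sketch only: one sigmoid neuron, then a 3-4-1 feed-forward pass.
x = [0.5; -1.2; 0.3];            % input vector
w = [0.4; 0.1; -0.7];            % weights of a single neuron
b = 0.2;                         % bias
sigmoid = @(a) 1 ./ (1 + exp(-a));
y_single = sigmoid(w' * x + b);  % output of one neuron

W1 = randn(4, 3); b1 = randn(4, 1);   % layer 1: 3 inputs -> 4 hidden units
W2 = randn(1, 4); b2 = randn(1, 1);   % layer 2: 4 hidden units -> 1 output
h = sigmoid(W1 * x + b1);             % hidden-layer activations
y = sigmoid(W2 * h + b2);             % network output
```

The single neuron is just a weighted sum followed by a nonlinearity; adding a layer means repeating the same matrix–vector pattern.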
Content 3: Recurrent networks
• Instead of a flow from inputs to outputs, we can have more arbitrary (or complete) connections – the flow of information can go around a loop = recurrent or dynamic
• Designate some nodes as inputs and others as outputs, or treat all nodes as inputs at one time and outputs at a later time
• What sort of behaviour do we get from recurrent networks?
• What are the issues with storage and stability? (A Hopfield-style sketch follows below.)
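To make the storage and stability questions concrete – again a minimal illustrative sketch, not the module’s own example – here is a Hopfield-style recurrent network in plain MATLAB: two bipolar patterns are stored with the Hebbian outer-product rule, and a corrupted probe is recalled by repeated asynchronous updates:

```matlab
% Illustrative Hopfield-style sketch: store two +1/-1 patterns, recall one.
P = [ 1 -1  1 -1  1 -1;          % stored patterns, one per row
     -1 -1  1  1 -1  1];
N = size(P, 2);
W = (P' * P) / N;                % Hebbian outer-product weights
W(1:N+1:end) = 0;                % zero the diagonal (no self-connections)

s = [1  1  1 -1  1 -1]';         % noisy version of the first pattern
for sweep = 1:5                  % asynchronous updates, one unit at a time
    for i = 1:N
        h = W(i, :) * s;
        if h ~= 0
            s(i) = sign(h);      % flip the unit to match its local field
        end
    end
end
disp(s')                         % settles on the first stored pattern
```

Asynchronous updates are used here because updating every unit at once can cycle between states rather than settle – exactly the kind of stability issue flagged above.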
Content 4: Spiking networks
• So far we have thought about signals in and out – voltage, current, or just numbers
• In reality, neurons are not quite like that. One difference is their spiking behaviour (a minimal integrate-and-fire sketch follows below)
[Figure: spatio-temporal pulse pattern – the spikes of 30 neurons (A1–E6, plotted along the vertical axis) shown as a function of time (horizontal axis, total time 4000 ms), with firing times marked by short vertical bars. From Krüger and Aiple (1988).]
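As a taste of that difference (an illustrative sketch with arbitrary parameters, not module code), a leaky integrate-and-fire neuron integrates its input current, leaks back towards its resting potential, and emits a spike and resets whenever the membrane potential crosses threshold:

```matlab
% Illustrative leaky integrate-and-fire neuron (parameters chosen arbitrarily).
dt = 1e-3;  T = 0.5;             % 1 ms Euler steps, 500 ms of simulated time
tau = 20e-3;                     % membrane time constant (s)
v_rest = -70e-3;  v_reset = -80e-3;  v_th = -54e-3;   % potentials (V)
R = 1e7;  I = 2e-9;              % membrane resistance (ohm), input current (A)

steps = round(T / dt);
v = v_rest * ones(1, steps);
spikes = false(1, steps);
for k = 2:steps
    dv = (-(v(k-1) - v_rest) + R * I) / tau;   % tau dv/dt = -(v - v_rest) + RI
    v(k) = v(k-1) + dt * dv;
    if v(k) >= v_th
        spikes(k) = true;        % record a spike...
        v(k) = v_reset;          % ...and reset the membrane potential
    end
end
fprintf('%d spikes in %.0f ms\n', sum(spikes), T * 1e3);
```

The output of such a neuron is a train of spike times, like the raster pattern in the figure, rather than a single continuous number.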
Practical elements
• In addition to the lecture material, there will be exercises to do in your own time
• These will usually require some work with MATLAB to model simple neural systems
• We won’t be using the MATLAB Neural Network toolbox, because that hides the details of the algorithms (a hand-rolled sketch of the sort of thing involved follows below)
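The kind of exercise this implies – a hypothetical example, not an actual assignment – is writing a learning rule by hand rather than calling a toolbox function, for instance perceptron learning of the AND function:

```matlab
% Illustrative only: hand-rolled perceptron learning of logical AND.
X = [0 0; 0 1; 1 0; 1 1]';       % the four inputs, one per column
t = [0 0 0 1];                   % target outputs (AND)
w = zeros(2, 1);  b = 0;  eta = 0.1;
for epoch = 1:100
    for n = 1:size(X, 2)
        y = double(w' * X(:, n) + b > 0);    % threshold unit
        w = w + eta * (t(n) - y) * X(:, n);  % perceptron weight update
        b = b + eta * (t(n) - y);
    end
end
disp(double(w' * X + b > 0))     % should print 0 0 0 1
```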
Assessment
• The assessment for the module is open
• The assessment will consist of some or all of the following:
• Demonstration of understanding of lecture material
• Selection and application of algorithms to given datasets
• Analysis of the output of specific algorithms
• Review of the literature on a particular topic