Lecture series: Data analysis
Thomas Kreuz, ISC, CNR
thomas.kreuz@cnr.it
http://www.fi.isc.cnr.it/users/thomas.kreuz/
Lectures: Each Tuesday at 16:00 (first lecture: May 21, last lecture: June 25)
Schedule
• Lecture 1: Example (Epilepsy & spike train synchrony), Data acquisition, Dynamical systems
• Lecture 2: Linear measures, Introduction to non-linear dynamics
• Lecture 3: Non-linear measures
• Lecture 4: Measures of continuous synchronization
• Lecture 5: Measures of discrete synchronization (spike trains)
• Lecture 6: Measure comparison & Application to epileptic seizure prediction
Overview of lecture series
• Introduction to data / time series analysis
• Univariate: Measures for individual time series
 - Linear time series analysis: Autocorrelation, Fourier spectrum
 - Non-linear time series analysis: Entropy, Dimension, Lyapunov exponent
• Bivariate: Measures for two time series
 - Measures of synchronization for continuous data (e.g., EEG): cross correlation, coherence, mutual information, phase synchronization, non-linear interdependence
 - Measures of directionality: Granger causality, transfer entropy
 - Measures of synchronization for discrete data (e.g., spike trains): Victor-Purpura distance, van Rossum distance, event synchronization, ISI-distance, SPIKE-distance
• Applications to electrophysiological signals (in particular single-unit data and EEG from epilepsy patients)
• Epilepsy – “window to the brain”
First lecture • Example: Epileptic seizure prediction • Data acquisition • Introduction to dynamical systems
Second lecture • Non-linear model systems • Linear measures • Introduction to non-linear dynamics • Non-linear measures • - Introduction to phase space reconstruction • - Lyapunov exponent
Third lecture
Non-linear measures
- Dimension [Excursion: Fractals]
- Entropies
- Relationships among non-linear measures
Fourth lecture • Motivation • Measures of synchronization for continuous data • Linear measures: Cross correlation, coherence • Mutual information • Phase synchronization (Hilbert transform) • Non-linear interdependences • Measure comparison on model systems • Measures of directionality • Granger causality • Transfer entropy
Fifth lecture • Motivation and examples • Measures of synchronization for discrete data • (here: spike trains, but in principle can be any other kind of discrete data) • Victor-Purpura distance • Van Rossum distance • Schreiber correlation measure • ISI-distance • SPIKE-distance (& Applications)
Spikes / Spike trains
• Spike: Action potential (event in which the membrane potential of a neuron rapidly rises and falls)
• Spike train: Temporal sequence of spikes
• Basic assumptions:
 - All-or-none law: “There is no such thing as half a spike.” Either full response or no response at all (depending on whether the firing threshold is crossed or not)
 - Spikes are stereotypical; their shape does not carry information
 - Background activity carries minimal information
 - Only spike times matter (see the sketch below)
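Since only the spike times matter, a spike train can be handled simply as a sorted array of event times. A minimal Python sketch; the function name and the homogeneous Poisson model are illustrative assumptions, not part of the lecture material:

import numpy as np

def poisson_spike_train(rate_hz, duration_s, seed=None):
    """A spike train as a sorted array of spike times (homogeneous Poisson model)."""
    rng = np.random.default_rng(seed)
    n_spikes = rng.poisson(rate_hz * duration_s)
    return np.sort(rng.uniform(0.0, duration_s, n_spikes))

train = poisson_spike_train(10.0, 2.0, seed=42)  # roughly 20 spike times in [0, 2] s
print(train)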
Motivation: Spike train (dis)similarity • Three different scenarios: • 1. Simultaneous recording of population • Neuronal correlations, pathology (e.g. epilepsy) • 2. Repeated presentation of just one stimulus • Reliability • 3. Repeated presentation of different stimuli • Stimulus discrimination, neural coding
1. Simultaneous recording: Example • Monkey retina (functioning in vitro for ~ 15h) • Multi-Electrode Array (MEA) recordings (512 electrodes) • Complete populations of retinal ganglion cells (~ 100 RGCs)
2. Repeated stimulus presentation: Example
One neuron, 60 repetitions: High reliability
[Figure: spike raster across repetitions; axis label: # Trial]
3. Different stimuli: Neural coding
• Neural coding: Relationship between the stimulus and the individual or ensemble neuronal responses
• Neural encoding: Map from stimulus to response. Aim: Response prediction
• Neural decoding: Map from response to stimulus. Aim: Stimulus reconstruction
[Diagram: Stimulus → Response via encoding; Response → Stimulus via decoding]
Neural coding schemes
Labelled line coding: Individual neurons code on their own. The identity of the neuron that fires a spike matters.
Population coding: Joint activity of a number of neurons. The identity of the neuron is irrelevant; all that matters is that the spike is fired as part of the population response, not which neuron fired it.
Advantages: Individual neurons are noisy, but the summed population response is robust. Multi-coding is possible. Faster.
See also: Sparseness vs. distributed representation in memory and recognition.
Extreme sparseness: grandmother cell / Jennifer Aniston neuron (concept cell)
Jennifer Aniston neuron [Quian Quiroga et al., Nature (2005)]
Sensory-motor system: Cortical homunculus
[Figure: homunculi of the primary somatosensory cortex and the primary motor cortex]
[Wilder Penfield: Epilepsy and the Functional Anatomy of the Human Brain, 1954]
Neural coding schemes
Rate coding: Most (if not all) information about the stimulus is contained in the firing rate of the neuron.
Edgar Adrian 1929 (Nobel Prize 1932): The firing rate of stretch receptor neurons in the muscles is related to the force applied to the muscle.
Temporal coding: Precise spike timing carries information.
Many studies: Temporal resolution on the millisecond time scale.
There is no absolute time reference in the nervous system: timing is relative to stimulus onset or to other spikes, but also with respect to ongoing brain oscillations.
(Special cases: latency code, pattern code, coincidence code; see the small illustration below.)
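To make the two readouts concrete, here is a small, hypothetical Python illustration (all spike times and numbers are made up): the rate-code readout keeps only a spike count per window, while temporal-code readouts such as a latency code use the precise times.

import numpy as np

spikes = np.array([0.012, 0.051, 0.053, 0.120, 0.455, 0.462])  # spike times in s
duration = 0.5

rate = len(spikes) / duration     # rate-code readout: one number, 12 Hz here
latency = spikes[0]               # temporal-code readout: e.g. first-spike latency after stimulus onset at t = 0

# coarse binning discards the timing information a temporal code would exploit
counts, edges = np.histogram(spikes, bins=np.arange(0.0, duration + 0.1, 0.1))
print(rate, latency, counts)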
Measures of spike train (dis)similarity
• Victor-Purpura distance (Victor & Purpura, 1996)
• van Rossum distance (van Rossum, 2001)
• Event synchronization (Quian Quiroga et al., 2002)
• Schreiber correlation measure (Schreiber et al., 2003)
• Hunter-Milton similarity (Hunter & Milton, 2003)
• ISI-distance (ISI = inter-spike interval) (Kreuz et al., 2007)
• SPIKE-distance (Kreuz et al., 2013)
Overview and comparison:
Kreuz T, Haas J, Morelli A, Abarbanel HDI, Politi A: Measuring spike train synchrony. J Neurosci Methods 165, 151 (2007)
Kreuz T, Chicharro D, Houghton C, Andrzejak RG, Mormann F: Monitoring spike train synchrony. J Neurophysiol 109, 1457 (2013)
Motivation: SPIKE-distance
[Figure: ISI-distance vs. SPIKE-distance]
Representations
• Dissimilarity matrix of size N^2 x #(t): time-resolved dissimilarities for all pairs of spike trains
• Full representation (as seen in the movie)
• Instantaneous dissimilarity (one frame of the movie)
• Temporal averaging (selective, triggered)
• Spatial averaging: synchronization among spike train groups (or the full population) -> measure profile
• Temporal and spatial averaging: overall synchrony
(A minimal sketch of these averagings follows below.)
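A minimal sketch of the different representations, assuming the time-resolved pairwise dissimilarities are already available as a NumPy array; the array and its random values are placeholders, not the output of an actual SPIKE-distance implementation:

import numpy as np

N, T = 40, 1000
D = np.random.rand(T, N, N)            # placeholder: D[t, i, j] = dissimilarity of trains i, j at frame t
D = (D + D.transpose(0, 2, 1)) / 2.0   # symmetrize, since pairwise dissimilarities are symmetric
iu = np.triu_indices(N, k=1)           # all pairs i < j

profile = D[:, iu[0], iu[1]].mean(axis=1)  # spatial averaging -> measure profile over time
matrix = D.mean(axis=0)                    # temporal averaging -> one N x N dissimilarity matrix
overall = profile.mean()                   # temporal + spatial averaging -> overall synchrony value
print(profile.shape, matrix.shape, overall)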
Advantages • Perfect time resolution, no binning, no parameter • Not invariant to shuffling of spikes among spike trains • (in contrast to peri-stimulus time histogram, PSTH) • Time-scale independence • Computational efficiency • Online monitoring (Real-time SPIKE-distance) • Applications: - Epilepsy • - Brain-machine interfacing • Application to continuous data (e.g. EEG) • Papers and Matlab source codes: • http://www.fi.isc.cnr.it/users/thomas.kreuz/sourcecode.html
Today’s lecture • Comparison of spike train distances • Capability to reproduce known clustering • Comparison of continuous measure of synchronization • Application to epileptic seizure prediction • Predictive performance • Statistical validation • Secondary time series analysis / Analysis of measure profiles • The method of measure profile surrogates
Measure comparison
Validation: Hindmarsh-Rose simulations
- Simulated neuronal network (“black box”)
- Time series from 29 neurons (each 32768 points)
- Two synaptically coupled clusters of 13 neurons each (1 and 2); the remaining 3 neurons are coupled to all others (shared, S)
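For reference, a minimal Euler-integration sketch of a single Hindmarsh-Rose neuron with standard textbook parameters; the coupled 29-neuron network actually used in the lecture is not reproduced here, and the threshold in the spike detection is an illustrative assumption:

import numpy as np

def hindmarsh_rose(n_steps=50000, dt=0.01, a=1.0, b=3.0, c=1.0, d=5.0,
                   s=4.0, x_rest=-1.6, r=0.006, I=3.2):
    """Euler integration of a single Hindmarsh-Rose neuron (bursting regime)."""
    x, y, z = -1.6, -8.0, 2.0                 # arbitrary initial conditions
    trace = np.empty(n_steps)
    for k in range(n_steps):
        dx = y - a * x**3 + b * x**2 - z + I  # membrane potential
        dy = c - d * x**2 - y                 # fast recovery variable
        dz = r * (s * (x - x_rest) - z)       # slow adaptation variable
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace[k] = x
    return trace

x = hindmarsh_rose()
# crude spike detection: local maxima of x above a threshold
is_peak = (x[1:-1] > x[:-2]) & (x[1:-1] >= x[2:]) & (x[1:-1] > 1.0)
spike_indices = np.where(is_peak)[0] + 1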
Reminder: Victor & Purpura distance D_V
• Minimum cost D_V of transforming one spike train into the other
• Only three possible transformations:
 - Adding a spike (cost 1)
 - Deleting a spike (cost 1)
 - Shifting a spike (parameter: cost c_V per unit time shift)
• Low c_V: D_V ~ difference in spike count (rate code distance)
• High c_V: D_V ~ number of non-aligned spikes (coincidence distance)
[Victor & Purpura, J Neurophysiol 76, 1310 (1996)]
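The minimum-cost transformation can be computed with a simple edit-distance-style dynamic program. A short Python sketch (the reference implementations are the Matlab codes linked earlier; the function name here is illustrative):

def victor_purpura_distance(t1, t2, cost):
    """Minimum-cost transformation of spike train t1 (list of spike times) into t2."""
    n, m = len(t1), len(t2)
    # G[i][j]: cost of transforming the first i spikes of t1 into the first j spikes of t2
    G = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        G[i][0] = float(i)                    # delete all i spikes
    for j in range(1, m + 1):
        G[0][j] = float(j)                    # add all j spikes
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i][j] = min(G[i - 1][j] + 1.0,                                   # delete spike i
                          G[i][j - 1] + 1.0,                                   # add spike j
                          G[i - 1][j - 1] + cost * abs(t1[i - 1] - t2[j - 1])) # shift spike i onto spike j
    return G[n][m]

print(victor_purpura_distance([0.1, 0.5, 0.9], [0.12, 0.8], cost=10.0))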
Hierarchical cluster tree (dendrogram): D_I
• Single linkage algorithm
• First, the closest pair of spike trains is identified and linked by a Π-shaped line, where the height of the connection measures their mutual distance.
• These two spike trains are merged into a single element, and the next closest pair of elements is then identified and connected.
• The procedure is repeated iteratively until a single cluster remains.
• Distance between a pair of clusters: the minimum distance between any element of one cluster and any element of the other (single linkage).
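Single-linkage dendrograms of this kind can be obtained directly from a precomputed distance matrix, for example with SciPy; a short sketch with placeholder distances standing in for the pairwise spike-train distances:

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# placeholder: symmetric 29 x 29 matrix of pairwise spike-train distances, zero diagonal
rng = np.random.default_rng(0)
D = rng.random((29, 29))
D = (D + D.T) / 2.0
np.fill_diagonal(D, 0.0)

Z = linkage(squareform(D), method='single')      # single linkage: cluster distance = closest pair of elements
labels = fcluster(Z, t=3, criterion='maxclust')  # cut the dendrogram into 3 clusters (1, 2, shared)
# scipy.cluster.hierarchy.dendrogram(Z) would draw the Pi-shaped tree (requires matplotlib)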
Assessing cluster quality
• Confusion matrix: number of spike trains from cluster i classified as belonging to cluster j
• Correct clustering: only the diagonal entries are non-zero
• Quantification: Normalized confusion entropy H
• For H = 1: cluster separation
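A sketch of one plausible reading of this quantity: the confusion matrix plus the mean normalized entropy of its rows, which is 0 for a perfectly diagonal matrix. The exact normalization and polarity of the H used in the lecture may differ (e.g. it may be defined so that H = 1 indicates perfect separation); all labels below are placeholders.

import numpy as np

def confusion_matrix(true_labels, assigned_labels, k):
    """C[i, j] = number of spike trains from cluster i assigned to cluster j."""
    C = np.zeros((k, k))
    for i, j in zip(true_labels, assigned_labels):
        C[i, j] += 1
    return C

def normalized_row_entropy(C):
    """Mean entropy of the rows of C, normalized to [0, 1]; 0 for a purely diagonal matrix."""
    P = C / C.sum(axis=1, keepdims=True)         # per-cluster assignment probabilities
    row_H = [-(p[p > 0] * np.log2(p[p > 0])).sum() for p in P]
    return np.mean(row_H) / np.log2(C.shape[1])

true = [0] * 13 + [1] * 13 + [2] * 3             # ground-truth clusters of the simulated network
print(normalized_row_entropy(confusion_matrix(true, true, 3)))  # 0.0 for perfect clustering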
Epileptic seizure prediction