Seminar, University of Lancaster, September 2005
Evolving Connectionist Systems: Methods and Applications
Nik Kasabov, nkasabov@aut.ac.nz
Knowledge Engineering and Discovery Research Institute (KEDRI), www.kedri.info, AUT, NZ
Content • Adaptation and interaction in biological and artificial systems • Evolving Connectionist Systems - ECOS • ECOS for Bioinformatics • ECOS for Neuroinformatics • ECOS for Medical Decision Support • Computational Neurogenetic Modelling (CNGM) • ECOS for adaptive signal processing, speech and image • ECOS for adaptive mobile robots • ECOS for adaptive financial time-series prediction • Conclusions and future research • References
1. Adaptation and interaction in biological and artificial systems
An evolving process is a process that unfolds, develops, reveals and changes over time in a continuous way. Evolving processes in living organisms occur at five levels, each performing different information-processing operations. Dynamic models may need to take all five levels into account together, not just one of them (e.g. neural networks, evolutionary computation). The goal is to develop and apply such models to solving complex problems in bioinformatics, brain study and engineering.
1. Molecular level. Function examples: DNA transcription into RNA, RNA translation into proteins
2. Single cell (e.g. neuron) level. Function examples: neuron activation
3. Neural network ensemble level. Function examples: sound perception, signal processing
4. Brain level. Function examples: cognition, speech and language
5. Evolutionary development. Function examples: genome evolution, creation of new individuals and species
Intelligent systems for adaptive modeling and knowledge discovery • Modelling complex processes is a difficult task • Most existing techniques may not be appropriate for modelling complex, dynamic processes • A variety of new methods needs to be developed and applied to challenging real-world problems
NeuCom: A Neuro-Computing Environment for Intelligent Decision Support Systems • 60 new and classical methods for data mining, data modelling and knowledge discovery; prognostic systems; web-based decision support systems • Free copy of a student version: www.theneucom.com • Applications: Bioinformatics, Neuroinformatics, Robotics, Signal processing, speech and image, …
2. Evolving COnnectionist Systems - ECOS • ECOS are modular connectionist-based systems that evolve their structure and functionality in a continuous, self-organised, on-line, adaptive, interactive way from incoming information; they can process both data and knowledge in a supervised and/or unsupervised way. • N. Kasabov, Evolving connectionist systems: methods and applications in bioinformatics, brain study and intelligent machines, Springer Verlag, 2002 • First publications: Iizuka '98 and ICONIP'98 • 'Throw the "chemicals" and let the system grow, is that what you are talking about, Nik?' Walter Freeman, UC Berkeley, a comment at the Iizuka 1998 conference • 'This is a powerful method! Why don't you apply it to challenging real-world problems from the Life sciences?' Prof. R. Hecht-Nielsen, UC San Diego, a comment at the Iizuka 1998 conference • Early examples of ECOS: RAN (J. Platt), an evolving RBF NN; RAN with a long-term memory (Abe et al.); Incremental FuzzyARTMAP; Growing Neural Gas; etc.
Evolving intelligence (EI): an information system that develops its structure and functionality in a continuous, self-organised, adaptive, interactive way from incoming information, possibly from many sources, and performs intelligent tasks typical for the human brain (e.g. adaptive pattern recognition, decision making, concept formation, language, …). EI is characterised by: • Adaptation in an on-line, incremental, life-long mode • Fast learning from large amounts of data, e.g. 'one-pass' training • Open, extendable, adjustable structure • Memory-based operation (add, retrieve and delete information) • Active interaction with other systems and with the environment • Adequate representation of space and time at their different scales • Knowledge-based operation: rules; self-improvement
Five levels of functional information processing in ECOS • 1. 'Genetic' level: each neuron in the system has a set of parameters (genes) that are subject to adaptation through both learning and evolution • 2. Neuronal level • 3. Ensembles of neurons: evolving neural network modules (NNM) • 4. The whole ECOS • 5. Populations of ECOS and their development over generations. Different learning algorithms apply at each level, capturing a different aspect of the whole process.
ECOS are based on unsupervised or supervised clustering and on local, knowledge-based modelling. [Figure: an evolving clustering process using ECM, with consecutive examples x1 to x9 presented in a 2D space (Kasabov and Song, 2002).]
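A minimal sketch of this one-pass evolving clustering idea, assuming Euclidean distance and a single distance threshold dthr that controls cluster creation (the parameter name and the simplified centre/radius update are illustrative, not the exact ECM algorithm):

```python
import numpy as np

def ecm(samples, dthr=0.2):
    """One-pass evolving clustering, a simplified sketch of ECM
    (Kasabov and Song, 2002)."""
    centres, radii = [], []
    for x in samples:
        if not centres:
            centres.append(x.copy()); radii.append(0.0)
            continue
        d = [np.linalg.norm(x - c) for c in centres]
        j = int(np.argmin(d))
        if d[j] <= radii[j]:
            continue                                  # x lies inside an existing cluster
        s = [di + ri for di, ri in zip(d, radii)]     # distance + radius per cluster
        k = int(np.argmin(s))
        if s[k] > 2 * dthr:
            centres.append(x.copy()); radii.append(0.0)   # evolve a new cluster
        else:
            new_r = s[k] / 2.0                            # enlarge cluster k
            v = centres[k] - x
            centres[k] = x + v / (np.linalg.norm(v) + 1e-12) * new_r
            radii[k] = new_r
    return np.array(centres), np.array(radii)

# nine consecutive 2-D examples, echoing the x1..x9 illustration above
X = np.random.default_rng(0).random((9, 2))
centres, radii = ecm(X, dthr=0.3)
print(len(centres), "clusters evolved")
```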
Evolving Fuzzy Neural Networks (EFuNNs) [Architecture: inputs (not fixed; fuzzified or not) → rule (case) node layer (growing and shrinking) → outputs (not fixed; defuzzified)] • Learning is based on clustering in the input space and a function estimation for each cluster • Prototype rules represent the clusters and the functions associated with them • Different types of rules, e.g. Zadeh-Mamdani or Takagi-Sugeno • The system grows and shrinks in a continuous way • Feed-forward and feedback connections (not shown) • Fuzzy concepts may be used • Not limited in the number and types of inputs, outputs, nodes, connections • On-line/off-line training • IEEE Trans. SMC, 2001, N. Kasabov • ECF (evolving classifier function): a partial case of EFuNN with no output MFs
Local, incremental, cluster-based learning in EFuNN • Incremental (possibly on-line) supervised clustering • First layer of connections: W1(rj(t+1)) = W1(rj(t)) + lj · D(W1(rj(t)), xf) • Second layer: W2(rj(t+1)) = W2(rj(t)) + lj · (A2 - yf) · A1(rj(t)), where: - rj is the j-th rule (hidden) node; - D is the distance (fuzzy or normalised Euclidean); - A2 = f2(W2 · A1) is the activation vector of the fuzzy output neurons when x is presented; - A1(rj(t)) = f1(D(W1(rj(t)), xf)) is the activation of the rule node rj(t); a simple linear function can be used for f1 and f2, e.g. A1(rj(t)) = 1 - D(W1(rj(t)), xf); - lj is the current learning rate of the rule node rj, calculated for example as lj = 1/Nex(rj), where Nex(rj) is the number of examples associated with rule node rj. • Example: classification of gene expression data.
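A hedged sketch of one such update for the winning rule node rj, assuming D(·) is taken as the normalised difference between the fuzzified input and the first-layer weights, and f1, f2 are the simple linear functions mentioned above; the sign of the second-layer term follows the slide's formula:

```python
import numpy as np

def efunn_update(W1_j, W2_j, x_f, y_f, n_ex_j):
    """One incremental EFuNN update for rule node r_j (a sketch)."""
    l_j = 1.0 / n_ex_j                         # learning rate lj = 1/Nex(rj)
    dist = np.mean(np.abs(W1_j - x_f))         # normalised distance D(W1, xf)
    A1_j = 1.0 - dist                          # linear f1: A1 = 1 - D
    W1_j = W1_j + l_j * (x_f - W1_j)           # first layer: move toward xf
    A2 = W2_j * A1_j                           # linear f2: A2 = W2 . A1
    W2_j = W2_j + l_j * (A2 - y_f) * A1_j      # second layer, as on the slide
    return W1_j, W2_j, A1_j

# one rule node, three fuzzy inputs, two fuzzy outputs (illustrative values)
W1, W2 = np.array([0.2, 0.5, 0.3]), np.array([0.7, 0.1])
W1, W2, a1 = efunn_update(W1, W2, np.array([0.3, 0.6, 0.2]),
                          np.array([1.0, 0.0]), n_ex_j=4)
```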
Dynamic Evolving Neuro-Fuzzy System (DENFIS): modelling, prediction and knowledge discovery from dynamic time series. Publication: Kasabov, N. and Song, Q., DENFIS: Dynamic Evolving Neural-Fuzzy Inference System and its Application for Time-Series Prediction, IEEE Transactions on Fuzzy Systems, April 2002
Local, incremental learning of cluster-based fuzzy rules in DENFIS • Input vector: x = [x1, x2, …, xq] • Result of inference: y = Σi=1..m [ωi · fi(x1, x2, …, xq)] / Σi=1..m ωi • A partial case uses linear regression functions: y = β0 + β1 x1 + β2 x2 + … + βq xq • Fuzzy rules: IF x is in cluster Cj THEN yj = fj(x) • Incremental learning of the function coefficients through least-square error
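The inference formula above fits in a few lines of code. A minimal sketch, assuming Gaussian rule activations around evolved cluster centres (DENFIS derives the ωi from its own clusters; the names and the sigma parameter are illustrative) and linear consequent functions fi:

```python
import numpy as np

def denfis_infer(x, centres, coeffs, sigma=0.5):
    """y = sum_i(w_i * f_i(x)) / sum_i(w_i), with linear f_i."""
    w = np.exp(-np.sum((centres - x) ** 2, axis=1) / (2 * sigma ** 2))
    f = coeffs[:, 0] + coeffs[:, 1:] @ x      # f_i(x) = b0 + b1 x1 + ... + bq xq
    return float(np.sum(w * f) / np.sum(w))

# two fuzzy rules over a 2-D input space
centres = np.array([[0.2, 0.3], [0.8, 0.7]])
coeffs = np.array([[0.1, 1.0, -0.5],          # b0, b1, b2 of rule 1
                   [0.4, -0.2, 0.9]])         # b0, b1, b2 of rule 2
print(denfis_infer(np.array([0.5, 0.5]), centres, coeffs))
```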
Learning and Inference in DENFIS
Evolving Multiple ECOS Models Through Evolutionary Computation. Evolutionary computation terminology: • Gene • Chromosome • Population • Crossover • Mutation • Fitness function • Selection
Genetic Algorithms
GA and ECOS • Many individual ECOS are evolved simultaneously on the same data through a GA method • A chromosome represents each individual ECOS's parameters • Individuals are evaluated and the best one is selected for further development • Mutation introduces variation between generations
Feature and Parameter Optimisation of Local ECOS Models Through GA • Nature optimises its 'parameters' through evolution • Replication of individual ECOS systems and selection of: • the best one, or • the best m, averaged (see the sketch below)
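A minimal GA sketch of this scheme, in which a chromosome holds one ECOS's parameter vector; the fitness function and the parameter bounds are placeholders for a real ECOS evaluation (e.g. cross-validation accuracy):

```python
import numpy as np

rng = np.random.default_rng(1)

def evolve(fitness, bounds, pop=20, gens=30, mut=0.1):
    """Selection, uniform crossover and Gaussian mutation over
    a population of parameter chromosomes (a sketch)."""
    lo, hi = bounds[:, 0], bounds[:, 1]
    P = rng.uniform(lo, hi, size=(pop, len(lo)))        # random population
    for _ in range(gens):
        scores = np.array([fitness(c) for c in P])
        elite = P[np.argsort(scores)[-pop // 2:]]       # keep the best half
        pairs = rng.integers(0, len(elite), (pop, 2))
        mask = rng.random((pop, len(lo))) < 0.5         # uniform crossover
        P = np.where(mask, elite[pairs[:, 0]], elite[pairs[:, 1]])
        P += rng.normal(0.0, mut, P.shape) * (hi - lo)  # mutation
        P = np.clip(P, lo, hi)
    return P[np.argmax([fitness(c) for c in P])]

# toy fitness standing in for evaluating an ECOS with these parameters
best = evolve(lambda c: -np.sum((c - 0.3) ** 2), bounds=np.array([[0.0, 1.0]] * 3))
```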
A framework of evolving connectionist machines (ICONIP, 1998). [Diagram: inputs pass through a presentation and representation part, a feature selection part and a decision part built of neural network modules (NNM), then a higher-level decision module and an action module producing results; the environment provides critique for adaptation, and a self-analysis/rule-extraction component feeds new inputs back into the system.]
ECOS have the following characteristics • Evolving structure • Rule node creation based on similarity between data examples • Pruning of nodes • Regular node aggregation • Supervised and unsupervised learning • Local learning • On-line learning • Life-long learning • 'Sleep' learning • Fuzzy rule extraction • Time and space relationships are preserved and can be traced back
Machine learning algorithms for ECOS
Inductive (Local) versus Transductive ('Personalised') Modelling • A transductive model is created on a subset of data neighbouring each new input vector. The new vector is situated at the centre of such a subset (two of them, x1 and x2, are illustrated), surrounded by a fixed number of nearest samples selected from the training data D and generated from an existing model M (Vapnik) • The principle: 'What is good for my neighbours will be good for me' • GA parameter optimisation: poster 1065, IJCNN'05, Mon 7-11pm (N. Mohan, NK) [Figure legend: ● a new data vector; ○ a sample from D; ∆ a sample from M]
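A hedged sketch of the transductive principle: a model is built only from the personal neighbourhood of each new vector. The distance-weighted average below is an assumption standing in for whatever local model (e.g. a small ECOS) is fitted on the neighbourhood:

```python
import numpy as np

def personalised_predict(x_new, X_train, y_train, k=10):
    """Predict for x_new from a local model over its k nearest neighbours."""
    d = np.linalg.norm(X_train - x_new, axis=1)
    idx = np.argsort(d)[:k]                   # the personal neighbourhood
    w = 1.0 / (d[idx] + 1e-12)                # closer samples weigh more
    return float(np.sum(w * y_train[idx]) / np.sum(w))

# toy training data D; each new vector gets its own local model
X = np.random.rand(100, 5)
y = X @ np.array([1.0, -0.5, 0.2, 0.0, 0.7])
print(personalised_predict(np.random.rand(5), X, y, k=10))
```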
Incremental Classifier and Feature Selection Learning • When new data are added incrementally, new features may become important, e.g. different frequencies, different Principal Components • Incremental PCA (Hall and Martin, 1998) • Incremental, chunk-wise PCA and ECOS modification • Examples: • S. Ozawa, S. L. Toh, S. Abe, S. Pang and N. Kasabov, Incremental Learning of Feature Space and Classifier for Online Face Recognition, Neural Networks, August 2005 • S. Pang, S. Ozawa and N. Kasabov, Incremental Linear Discriminant Analysis for Classification of Data Streams, IEEE Trans. SMC-B, vol. 35, no. 4, 2005
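A short illustration of chunk-wise incremental PCA, with scikit-learn's IncrementalPCA standing in for the Hall and Martin / chunk-PCA algorithms cited above (the data shapes are illustrative):

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

ipca = IncrementalPCA(n_components=5)
stream = (np.random.rand(50, 20) for _ in range(10))   # chunks of new data
for chunk in stream:
    ipca.partial_fit(chunk)                            # update the eigenspace per chunk
reduced = ipca.transform(np.random.rand(4, 20))        # project new samples
```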
3. ECOS for Bioinformatics • DNA/RNA sequence analysis • Gene expression profiling for diagnosis and prognosis • microRNA structure analysis and prediction • Protein structure analysis and prediction • Gene regulatory network modelling and discovery
Gene Expression Data Analysis and Disease Profiling • DNA analysis: large databases; data constantly being added and modified; different sources of information • Marker and drug discovery: • Gastric cancer • Bladder cancer • CRC (colorectal cancer) • www.peblnz.com
A specialised gene expression profiling SIFTWARE based on ECOS: www.peblnz.com, www.kedri.info
Case study on gene expression data modelling and profiling: the DLBCL Lymphoma data set (M. Shipp et al., Nature Medicine, vol. 8, no. 1, January 2002, 68-74)
Rule extraction from a trained ECOS on the DLBCL Lymphoma data set (M. Shipp et al., Nature Medicine, vol. 8, no. 1, January 2002, 68-74)
Rule 1: IF X1 is (2: 0.84), X2 is (2: 0.81), X3 is (1: 0.91), X4 is (1: 0.91), X5 is (3: 0.91), X6 is (1: 0.89), X7 is (), X8 is (1: 0.91), X9 is (1: 0.91), X10 is (3: 0.87), X11 is (1: 0.86) THEN Class is [1] - Fatal
Rule 2: IF X1 is (2: 0.60), X2 is (1: 0.73), X3 is (1: 0.91), X4 is (3: 0.91), X5 is (1: 0.64), X6 is (), X7 is (2: 0.65), X8 is (2: 0.90), X9 is (1: 0.91), X10 is (1: 0.62), X11 is (1: 0.91) THEN Class is [2] - Cured
where X1, …, X11 are known genes and the membership labels are 1 = small, 2 = medium, 3 = large
Comparative Analysis of Global, Local and Personalised Modelling on the Lymphoma Gene Expression Case Study
Gene regulatory network modelling and discovery • Case study: leukemia cell line U937 (experiments done at the NCI, NIH, Frederick, USA, Dr Dimitrov's lab) • Two different clones of the same cell line, treated with retinoic acid • 12,680 genes, expressed over time • 4 time points for the MINUS clone (the cells died) and • 6 time points for the PLUS clone (cancer)
Cluster analysis of time-course gene expression data reduces the number of variables in the GRN • Genes that share similar functions usually show similar gene expression profiles and cluster together • Different clustering techniques: • exact clusters vs fuzzy clusters • pre-defined number of clusters or evolving • batch vs on-line • using different similarity or correlation measures
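A brief sketch of this clustering step, using correlation as the similarity measure (one of the options listed above); the hierarchical method and the cut-off threshold are illustrative choices:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

genes = np.random.rand(100, 6)              # 100 genes x 6 time points
d = pdist(genes, metric='correlation')      # 1 - Pearson correlation
labels = fcluster(linkage(d, method='average'), t=0.5, criterion='distance')
print(len(set(labels)), "gene clusters")    # clusters become the GRN variables
```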
The Goal is to Discover Gene Regulatory Networks and Gene State Transitions
Gene networks and reverse engineering • GNs describe the regulatory interactions between genes • DNA transcription, RNA translation and protein folding and binding are all part of the process of gene regulation • Here we use only RNA gene expression data • Reverse engineering: from gene expression data to a GN • It is assumed that gene expression data reflect the underlying genetic regulatory network • If genes are co-expressed over time, either one regulates the other, or both are regulated by the same other genes • Problems: • What is the time unit? • Appropriate data and a validation procedure are needed • Data are usually insufficient, so special methods are required • Correct interpretation of the models may generate new biological knowledge
Evolving fuzzy neural networks for GN modelling (Kasabov and Dimitrov, ICONIP, 2002): G(t) → EFuNN → G(t+dt) • On-line, incremental learning of a GN • Adding new inputs/outputs (new genes) • The rule nodes capture clusters of input genes that are related to the output genes • Rules can be extracted that explain the relationship between G(t) and G(t+dt), e.g.: IF g13(t) is High (0.87) and g23(t) is Low (0.9) THEN g87(t+dt) is High (0.6) and g103(t+dt) is Low • Varying the threshold reveals stronger or weaker patterns of relationship
The extracted rules are turned into state transition graphs. [Figure: state transition graphs for the PLUS cell line and the MINUS cell line.]
Using DENFIS (Dynamic Evolving Neuro-Fuzzy Inference System) for GRN modelling (IEEE Trans. FS, April 2002) • DENFIS vs EFuNN • G(t) → gj(t+dt) • Dynamic partitioning of the input space • Takagi-Sugeno fuzzy rules, e.g.: IF X1 is (0.63 0.70 0.76) and X2 is (0.71 0.77 0.84) and X3 is (0.71 0.77 0.84) and X4 is (0.59 0.66 0.72) THEN Y = 1.84 - 1.26 X1 - 1.22 X2 + 0.58 X3 - 0.03 X4
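As a worked example, the rule above can be evaluated directly, reading each triple as a triangular membership function (left, centre, right); at the membership centres the rule fires fully and its linear consequent gives Y:

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function defined by (left, centre, right)."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

mfs = [(0.63, 0.70, 0.76), (0.71, 0.77, 0.84),
       (0.71, 0.77, 0.84), (0.59, 0.66, 0.72)]
x = np.array([0.70, 0.77, 0.77, 0.66])      # input at the MF centres
firing = np.prod([tri_mf(xi, *m) for xi, m in zip(x, mfs)])
y = 1.84 - 1.26 * x[0] - 1.22 * x[1] + 0.58 * x[2] - 0.03 * x[3]
print(firing, y)                            # firing = 1.0; y ≈ 0.45
```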
Using a GRN model to predict the expression of genes at a future time
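A hedged sketch of that idea: any one-step model G(t) → G(t+dt) can be iterated to predict several steps ahead. The least-squares linear model below stands in for a trained ECOS, and the data shapes are illustrative:

```python
import numpy as np

def forecast(model, g0, steps):
    """Iterate a one-step GRN model to predict future gene expression."""
    g, path = g0.copy(), [g0.copy()]
    for _ in range(steps):
        g = model(g)
        path.append(g.copy())
    return np.array(path)

G = np.random.rand(6, 4)                            # 6 time points x 4 genes
A, *_ = np.linalg.lstsq(G[:-1], G[1:], rcond=None)  # fit G(t) -> G(t+dt)
path = forecast(lambda g: g @ A, G[-1], steps=3)    # 3 future time steps
```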
4. ECOS for Neuroinformatics • Why ECOS for brain study? • Modelling brain states of individuals or groups of individuals from EEG, fMRI and other information • Example: epileptic seizure of a patient; data from 8 EEG channels are shown
EEG data and modelling perception • Standard EEG electrode systems • In the experiment here, four classes of brain perception states are used, with 37 single trials each, including the following stimuli: • Class 1 - Auditory stimulus • Class 2 - Visual stimulus • Class 3 - Mixed auditory and visual stimuli • Class 4 - No stimulus • (With van Leeuwen, RIKEN BSI, Tokyo)
ECF for building individual cognitive models
Table 2. Correctly recognised data samples by an ECF model trained on 80% and tested on 20% of a single person's data (person A), using 65 variables: 64 electrodes plus time.

Stimulus   A      V      AV     No     Accuracy
A          81.2   1.3    0.1    0.2    98%
V          1.1    82.4   2.9    1.8    93.4%
AV         0.6    3.3    75     1.4    93.4%
No         0.4    1.5    1.3    80.5   96.2%
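The per-class accuracies in the last column follow from the confusion counts; a few lines verify this (rows are the true stimulus, columns the recognised one):

```python
import numpy as np

C = np.array([[81.2, 1.3, 0.1, 0.2],
              [1.1, 82.4, 2.9, 1.8],
              [0.6, 3.3, 75.0, 1.4],
              [0.4, 1.5, 1.3, 80.5]])
per_class = 100 * C.diagonal() / C.sum(axis=1)
print(per_class.round(1))   # [98.1, 93.4, 93.4, 96.2], matching the table
```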
5. Medical decision support systems • Large amounts of data are available in clinical practice • The need for intelligent decision support systems: a market demand • Web-based learning and decision support systems • Palmtops can be used to download and run an updated decision support system • Examples: cardiovascular risk analysis; trauma data analysis and prognosis; sarcoma prognostic systems • N. Kasabov, R. Walton, et al., AI in Medicine, submitted, 2005
The case study on GFR prediction for renal medical decision support: a real data set from a medical institution is used here for experimental analysis. The data set has 447 samples, collected at hospitals in New Zealand and Australia. Each record includes six input variables (age, gender, serum creatinine, serum albumin, race and blood urea nitrogen concentrations) and one output, the glomerular filtration rate (GFR). All experimental results reported here are averaged over 10 cross-validation runs with the same model and parameters. In each run, 70% of the data set is randomly selected as training data and the remaining 30% as testing data.
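A sketch of that evaluation protocol, with a linear model and random data standing in for GFR-ECOS and the 447-sample data set (both are placeholders):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error

X, y = np.random.rand(447, 6), np.random.rand(447)   # stand-in for the GFR data
rmse, mae = [], []
for seed in range(10):                               # 10 random 70/30 splits
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=seed)
    pred = LinearRegression().fit(Xtr, ytr).predict(Xte)
    rmse.append(np.sqrt(mean_squared_error(yte, pred)))
    mae.append(mean_absolute_error(yte, pred))
print(np.mean(rmse), np.mean(mae))                   # averaged test errors
```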
Adaptive Renal Function Evaluation System: GFR-ECOS (Song, Ma, Marshall, Kasabov, Kidney International, 2005)
Comparative Analysis of Global, Local and Personalised Modelling on the Case Study of GFR Decision Support

Model    Neurons/rules   Testing RMSE   Testing MAE   Weights of input variables (Age, Sex, SCr, Surea, Race, Salb)
MDRD     --              7.74           5.88          1, 1, 1, 1, 1, 1
MLP      12              8.44           5.75          1, 1, 1, 1, 1, 1
ANFIS    36              7.49           5.48          1, 1, 1, 1, 1, 1
DENFIS   27              7.29           5.29          1, 1, 1, 1, 1, 1
TNFI     6.8 (average)   7.31           5.30          1, 1, 1, 1, 1, 1
TWNFI    6.8 (average)   7.11           5.16          0.89, 0.71, 1, 0.92, 0.31, 0.56
A GFR personalised model of a patient obtained with the use of TWNFI

Input variables: Age 58.9 | Sex Female | SCr 0.28 | Surea 28.4 | Race White | Salb 38
Weights of input variables (TWNFI): 0.91, 0.73, 1, 0.82, 0.52, 0.46
Results: GFR (desired) 18.0 | MDRD 14.9 | TWRBF 16.6
6. Computational Neurogenetic Modelling (CNGM)
What is going on in a neuron?
Computational Neurogenetic Modelling: CNGM as a spiking neural network (SNN) combined with a gene regulatory network (GRN)
Gene Regulatory Network in a neuronal cell • Genes: c-jun, mGluR3, GABRA, GABRB, AMPAR, NMDAR, NaC, KC, ClC, Jerky, BDNF, FGF-2, IGF-I, GALR1, NOS, S100beta • 10 genes related to neuronal parameters, plus some specific genes • Linear vs non-linear gene interaction • Initial gene expression values: random in [-0.1, 0.1] • Initial weight matrix: random in [-1, 1], with constraints • [Diagram: network mapping inputs to outputs through a sigmoid function bounded in [-1, 1]]
A CNGM of a spiking neuron (IJCNN 2005)