Agenda: Autoencoders, Dataset, Model
Structure learning with deep neuronal networks
6th Network Modeling Workshop, 6/6/2013
Patrick Michl
Autoencoders (figure: Dataset and Model panels, scatter plots over axes x1 and x2)
Real-world data is usually high dimensional …
… which makes structural analysis and modeling complicated!
Autoencoders (figure: Dataset and Model panels with a PCA projection, axes x1 and x2)
Dimensionality reduction techniques like PCA …
… cannot preserve complex structures!
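To illustrate why a linear projection falls short, here is a minimal PCA sketch in numpy. The ring-shaped toy data and all variable names are assumptions for illustration, not the presenter's dataset: the projection onto the leading principal component is purely linear, so curved structure collapses onto a line.

import numpy as np

# Toy 2-D data on a noisy ring (illustrative only).
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.05 * rng.standard_normal((200, 2))

# PCA: center, eigendecompose the covariance, project onto the top component.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
w = eigvecs[:, -1]                         # first principal component
code = Xc @ w                              # 1-D linear code
X_reconstructed = np.outer(code, w) + X.mean(axis=0)   # linear reconstruction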
Autoencoders (figure: Dataset and Model panels, axes x1 and x2)
Therefore the analysis of unknown structures …
… needs more flexible, nonlinear techniques!
Autoencoder (diagram: input data X → network → output data X′)
• Artificial neural network
• Units: perceptrons (sigmoid output between 0 and 1) and Gaussian units (real-valued output in R)
Autoencoders are artificial neural networks …
Autoencoder (input data X → output data X′)
• Artificial neural network
• Multiple hidden layers: perceptrons in the hidden layers, Gaussian units in the visible layers
… with multiple hidden layers. Such networks are called deep networks.
Definition (deep network): Deep networks are artificial neural networks with multiple hidden layers.
Autoencoder
• Deep network
• Symmetric topology
Autoencoders have a symmetric topology … with an odd number of hidden layers.
Autoencoder
• Deep network
• Symmetric topology
• Information bottleneck
The small layer in the center works like an information bottleneck … that creates a low dimensional code for each sample in the input data.
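A minimal numpy sketch of this idea, with layer sizes, weights and function names chosen purely for illustration (not taken from the talk): a symmetric stack of sigmoid layers whose central layer yields the low dimensional code, with a linear output layer to match the Gaussian visible units.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Symmetric toy topology 8 -> 4 -> 2 -> 4 -> 8; the 2-unit central layer
# is the information bottleneck.
sizes = [8, 4, 2, 4, 8]
rng = np.random.default_rng(0)
weights = [0.1 * rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    a, activations = x, []
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = a @ W + b
        a = z if i == len(weights) - 1 else sigmoid(z)  # linear (Gaussian) output layer
        activations.append(a)
    return a, activations[1]   # reconstruction X' and the 2-dimensional code Y

x = rng.standard_normal(8)
x_prime, code = forward(x)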
Autoencoder
• Deep network
• Symmetric topology
• Information bottleneck
• Encoder (upper stack)
• Decoder (lower stack)
The upper stack does the encoding … and the lower stack does the decoding.
Definition (autoencoder): Autoencoders are deep networks with a symmetric topology and an odd number of hidden layers, containing an encoder, a low dimensional representation and a decoder.
Autoencoder
Problem: dimensionality of the data
Idea: train the autoencoder to minimize the distance between input X and output X′
• Encode X to a low dimensional code Y
• Decode the low dimensional code Y to output X′
• Output X′ is low dimensional
Autoencoders can be used to reduce the dimension of data … if we can train them!
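Written as a formula, the training idea from this slide reads as follows (assuming a squared-error distance, which the slide does not specify explicitly):

\min_{\theta}\; E(\theta) \;=\; \sum_{s} \big\lVert x_s - x'_s \big\rVert^2,
\qquad x'_s = \operatorname{dec}_\theta(y_s), \quad y_s = \operatorname{enc}_\theta(x_s)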
Autoencoder Training: Backpropagation
In feedforward ANNs, backpropagation is a good approach:
• The distance (error) between the current output X′ and the wanted output Y is computed. This gives an error function (example: a linear neural unit with two inputs, worked out below).
• By calculating the gradient of the error function we get a vector that points in a direction which decreases the error.
• We update the parameters to decrease the error.
• We repeat this.
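The slide's example of a linear neural unit with two inputs can be reconstructed as follows (the symbols w_1, w_2, b, target t and learning rate \eta are my notation, not the slide's):

y = w_1 x_1 + w_2 x_2 + b, \qquad E = \tfrac{1}{2}(y - t)^2

\frac{\partial E}{\partial w_i} = (y - t)\, x_i, \qquad \frac{\partial E}{\partial b} = y - t

w_i \leftarrow w_i - \eta\, \frac{\partial E}{\partial w_i}, \qquad b \leftarrow b - \eta\, \frac{\partial E}{\partial b}

The negative gradient is exactly the vector that points in a direction which decreases the error.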
Autoencoder Training: Backpropagation
Problem: deep networks. … the problem is the multiple hidden layers!
• Very slow training: backpropagation is known to be slow far away from the output layer …
• Maybe a bad solution: … and it can converge to poor local minima.
• Idea: initialize the parameters close to a good solution.
• Pretraining: therefore the training of autoencoders has a pretraining phase …
• Restricted Boltzmann Machines: … which uses Restricted Boltzmann Machines (RBMs).
Autoencoder Training: Restricted Boltzmann Machines
• RBMs are Markov random fields.
Markov random field: every unit influences every neighbor; the coupling is undirected.
Motivation (Ising model): a set of magnetic dipoles (spins) is arranged in a graph (lattice) where neighbors are coupled with a given strength.
Restricted Boltzmann Machine (diagram: visible units v1…v4, hidden units h1…h3)
• RBMs are Markov random fields
• Bipartite topology: visible units (v) and hidden units (h)
• The local energy is used to calculate the probabilities of the unit values
• Training: contrastive divergence (Gibbs sampling)
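For a binary RBM the standard energy and the resulting conditional probabilities are as follows (the notation a_i, b_j, w_{ij} is the usual one, not taken from the slide):

E(v, h) = -\sum_i a_i v_i \;-\; \sum_j b_j h_j \;-\; \sum_{i,j} v_i\, w_{ij}\, h_j

p(v, h) = \frac{e^{-E(v,h)}}{Z}, \qquad
p(h_j = 1 \mid v) = \sigma\Big(b_j + \sum_i v_i w_{ij}\Big), \qquad
p(v_i = 1 \mid h) = \sigma\Big(a_i + \sum_j w_{ij} h_j\Big)

Here \sigma is the logistic sigmoid and Z the partition function; the bipartite topology is what makes the two conditionals factorize and Gibbs sampling cheap.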
Autoencoder Training: Restricted Boltzmann Machine, Gibbs sampling
Training by contrastive divergence alternates Gibbs sampling steps: sample the hidden units given the visible units, then resample the visible units given the hidden units.
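A minimal sketch of one contrastive divergence update (CD-1) for a binary RBM, assuming numpy and illustrative shapes and learning rate; this is not the presenter's code:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# W: (n_visible, n_hidden), a: visible biases, b: hidden biases, v0: data batch.
def cd1_step(W, a, b, v0, rng, lr=0.1):
    # up: sample hidden units given the data
    p_h0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # down: reconstruct the visible units (one Gibbs step)
    p_v1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    # up again: hidden probabilities for the reconstruction
    p_h1 = sigmoid(v1 @ W + b)
    # contrastive divergence gradient: data statistics minus model statistics
    n = v0.shape[0]
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / n
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (p_h0 - p_h1).mean(axis=0)
    return W, a, b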
Autoencoder Training: Top layer RBM (diagram: visible units v1…v4, hidden units h1…h5)
The top layer RBM transforms real-valued data into binary codes.
Therefore the visible units are modeled with Gaussians to encode the data …
… and many hidden units with sigmoids to encode the dependencies.
Local energy: the objective function is the sum of the local energies.
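A commonly used form of this Gaussian-Bernoulli local energy is shown below; this is a hedged reconstruction, since the slide only shows the label "Local Energy", and the variances \sigma_i are often fixed to 1 in practice:

E(v, h) = \sum_i \frac{(v_i - a_i)^2}{2\sigma_i^2} \;-\; \sum_j b_j h_j \;-\; \sum_{i,j} \frac{v_i}{\sigma_i}\, w_{ij}\, h_j

The training objective sums this local energy over units and training samples, as stated on the slide.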
Autoencoder Training: Reduction layers (diagram: visible units v1…v4, hidden units h1…h3)
The next RBM layer maps the dependency encoding … from the upper layer … to a smaller number of sigmoids …
… which can be trained faster than the top layer. Again, the local energy provides the training objective.
Autoencoder Training: Unrolling
The symmetric topology allows us to skip further training: the pretrained RBM stack is unrolled into the encoder and a mirrored decoder.
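A sketch of what unrolling could look like in code (function and variable names are hypothetical): the encoder takes the pretrained RBM weights as they are, and the decoder reuses the same weights transposed, in reverse order, so the symmetric autoencoder starts from the pretrained solution.

# rbm_weights: [W_top, W_reduction_1, ...], one weight matrix per pretrained RBM,
# with the matching hidden and visible biases.
def unroll(rbm_weights, rbm_hidden_biases, rbm_visible_biases):
    encoder = [(W, b_h) for W, b_h in zip(rbm_weights, rbm_hidden_biases)]
    decoder = [(W.T, b_v) for W, b_v in zip(reversed(rbm_weights),
                                            reversed(rbm_visible_biases))]
    return encoder + decoder   # initial layers of the symmetric autoencoder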
Autoencoder Training: Summary
• Pretraining: top RBM (GRBM), reduction RBMs, unrolling
• Finetuning: backpropagation
After pretraining, backpropagation usually finds good solutions.