The Neural Network Objects
RhoNNO: A ROOT package for general use

Marcel Kunze
Institut für Experimentalphysik 1, Ruhr-Universität Bochum
3rd ROOT Workshop
Why?
• Neural Network Objects ("NNO") have been around and tested in various applications for several years
• ROOT is very appealing with respect to object persistence and interactivity
• NNO has been overhauled to re-use ROOT functionality ("RhoNNO")
• Integration of other ROOT applications (the Neural Network Kernel of J.P. Ernenwein)
• RhoNNO complies with ROOT's coding standards
Neural Network Models
• Implementation of the most popular supervised and unsupervised network models
Growing Neural Gas (GNG)
• Example: adaptation to a multi-dimensional PDF (1D, 2D, 3D)
• Fractal growth process: density and topology conservation
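A minimal sketch of how such an adaptation run could look through the common network interface; the TGNG constructor arguments below are assumptions for illustration, only Train() is taken from the VNeuralNet interface shown later:

// Sketch: let a growing neural gas adapt to samples drawn from a 2-D PDF.
// The TGNG constructor here is hypothetical; consult the RhoNNO headers
// for the real signature.
#include "RhoNNO/TGNG.h"
#include "TRandom.h"

void GNGExample() {
    TGNG net(2 /*inputs*/, 100 /*max cells*/, "GNG.net"); // hypothetical ctor
    NNO_INTYPE in[2];
    for (Long_t n = 0; n < 100000; n++) {
        gRandom->Rannor(in[0], in[1]); // draw a sample from a 2-D Gaussian PDF
        net.Train(in);                 // adapt the cells towards the sample
    }
}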
Architecture
• RhoNNO class hierarchy [class diagram figure]
Interactive Control
• The plotter highlights training progress:
- Output functions
- Error functions
[Figure: network output for positive ("pro") and negative ("con") samples]
Management of Data Sets
• TDataServe: a mini database to support management of input/output vector relations (a hedged usage sketch follows below)
- Add vector pairs to data sets
- Partition data sets for training/testing
- Shuffle data sets
- Serve vector pairs
- Save and load data sets using ROOT persistence
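A hedged usage sketch; the constructor shape and the method names below are guesses modeled on the feature list above, not copied from TDataServe.h:

// Hypothetical sketch: method names mirror the bullets, not the real header.
#include "RhoNNO/TDataServe.h"

TDataServe* MakeServer() {
    // 7 inputs, 1 output, matching the PID example later in this talk
    TDataServe *server = new TDataServe("pid", "PID training data", 7, 1);
    Float_t in[7]  = {0.8f, 1.1f, 3.0f, 0.9f, 0.7f, 540.f, 0.2f}; // illustrative
    Float_t out[1] = {1.0f};        // 1 = positive ("pro") sample
    server->AddVecPair(in, out);    // hypothetical: add a vector pair
    server->Partition(10000);       // hypothetical: reserve test vectors
    server->Shuffle();              // hypothetical: shuffle the data set
    return server;
}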
VNeuralNet Interface
• Abstract base class of all network models

// Abstract interface for all networks
virtual void AllocNet() = 0;
virtual void InitNet() = 0;
virtual void WriteText() = 0;
virtual void WriteBinary() = 0;
virtual void ReadText() = 0;
virtual void ReadBinary() = 0;
virtual Double_t* Recall(NNO_INTYPE* in, NNO_OUTTYPE* out=0) = 0;
virtual Double_t Train(NNO_INTYPE* in, NNO_OUTTYPE* out=0) = 0;

// Training and testing
Double_t TrainEpoch(TDataServe *server, Int_t nEpoch=1);
Double_t TestEpoch(TDataServe *server);
void BalanceSamples(Bool_t yesNo = kTRUE);
virtual void SetMomentumTerm(Double_t f);
virtual void SetFlatSpotElimination(Double_t f);
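A minimal sketch of the intended call pattern, using only the methods shown above; the header path is assumed by analogy to the includes elsewhere in this talk, and the construction of the concrete network is deliberately left out:

// Sketch of a training cycle against the interface above. How the concrete
// network (e.g. TXMLP) is constructed is not shown; see the RhoNNO headers.
#include <cstdio>
#include "RhoNNO/VNeuralNet.h"
#include "RhoNNO/TDataServe.h"

void TrainCycle(VNeuralNet &net, TDataServe *server) {
    net.BalanceSamples();       // same statistics for pro and con samples
    net.SetMomentumTerm(0.2);   // as in the steering file example later
    for (Int_t epoch = 1; epoch <= 200; epoch++) {
        Double_t trainError = net.TrainEpoch(server); // one pass over training set
        Double_t testError  = net.TestEpoch(server);  // error on the test partition
        printf("epoch %d: train %g, test %g\n", epoch, trainError, testError);
    }
}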
NetworkTrainer
• A RhoNNO sample application to train and test neural networks
- Assemble training and test data sets out of ROOT trees, based on TFormula
- Define network architecture and transfer functions
- Define and execute a training schedule
- Persist networks
- Generate C++ code to perform network recall
Example: PID Tagging
• Identify charged tracks using dE/dx, the Cherenkov angle, etc.
[Figure: Cherenkov angle and dE/dx vs. momentum P [GeV/c], with bands for e, μ, π, K, p]
The ROOT Training File
• An arbitrary standard tree provides the training and test sample (a sketch of booking such a tree follows below):
- Measurement and shape variables
- MC truth: 1 = electron, 2 = muon, 3 = pion, 4 = kaon, 5 = proton
- Likelihood values
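A hedged sketch of booking such a tree with plain ROOT; the branch names are taken from the steering file on the next slide, the comments and values are illustrative:

// Sketch: book a PID training tree with the branches the steering file uses.
#include "TFile.h"
#include "TTree.h"

void MakeTrainingTree() {
    TFile file("PidTuple1.root", "RECREATE");
    TTree tree("PidTuple", "PID training sample");
    Float_t mom, theta, svt, emc, drc, dch, ifr, pid;
    tree.Branch("mom",   &mom,   "mom/F");   // track momentum
    tree.Branch("theta", &theta, "theta/F"); // polar angle information
    tree.Branch("svt",   &svt,   "svt/F");   // vertex tracker variable
    tree.Branch("emc",   &emc,   "emc/F");   // calorimeter variable
    tree.Branch("drc",   &drc,   "drc/F");   // Cherenkov detector variable
    tree.Branch("dch",   &dch,   "dch/F");   // drift chamber variable
    tree.Branch("ifr",   &ifr,   "ifr/F");   // muon system variable
    tree.Branch("pid",   &pid,   "pid/F");   // MC truth: 1=e ... 5=proton
    // fill mom..pid per track and call tree.Fill(); finally:
    tree.Write();
}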
Steering File

# Example: Training of PIDSelectors with NNO

#define the network topology and training schedule
xmlp 7 15 10 1              # MLP with 2 hidden layers
transfer TR_FERMI           # Transfer function
momentum 0.2                # Momentum term
balance true                # Assure same statistics for pro and con samples
plots true                  # Show updating error plots on training progress
test 10000                  # Number of test vector pairs to reserve
start 1                     # First training epoch
stop 200                    # Last training epoch

#define the data source, take two input files
datapath ../Data            # Directory to look up data files
networkpath ../Networks     # Directory to persist network files
file PidTuple1.root         # First file to get input from
file PidTuple2.root         # Second … (ad infinitum)

#set up the input layer (use branch names)
tree PidTuple               # This is the tree to look up data
cut mom>0.5&&dch>0&&dch<10000   # Preselection of samples
input mom:acos(theta):svt:emc:drc:dch:ifr   # Input layer
autoscale true              # Apply a scale to assure inputs are O(1)

#set up the output layer (use branch names)
#Particles pid = {electron=1,muon,pion,kaon,proton}
output abs(pid)==3          # Perform training for pions
Generation of Recall Code

// TXMLP network trained with NNO NetworkTrainer at Fri Apr 27
// Input parameters mom:acos(theta):svt:emc:drc:dch:ifr
// Output parameters abs(pid)==3
// Training files:
// ../Data/PidTuple1.root
// ../Data/PidTuple2.root

#include "RhoNNO/TXMLP.h"

Double_t* Recall(Double_t *invec)
{
    static TXMLP net("TXMLP.net");
    Float_t x[7];
    x[0] = 0.76594 * invec[0]; // mom
    x[1] = 2.21056 * invec[1]; // acos(theta)
    x[2] = 0.20365 * invec[2]; // svt
    x[3] = 2.2859  * invec[3]; // emc
    x[4] = 1.75435 * invec[4]; // drc
    x[5] = 0.00165 * invec[5]; // dch
    x[6] = 0.85728 * invec[6]; // ifr
    return net.Recall(x);
}
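The generated function can be dropped into any application; a minimal usage sketch with illustrative input values and a hypothetical decision cut:

// Hedged usage sketch for the generated Recall() above.
Double_t invec[7] = {1.2, 0.8, 3.0, 0.9, 0.7, 540.0, 0.2}; // illustrative values
Double_t *out = Recall(invec);     // one output node: pion hypothesis
Bool_t isPion = (out[0] > 0.5);    // hypothetical selection cut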
Execute the Example
• NetworkTrainer <steering file> <first epoch> <last epoch>
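For the PID training above this could read as follows (the steering file name is hypothetical, the epoch range matches the start/stop settings in the steering file):

NetworkTrainer pid.nno 1 200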
Summary
• The re-use of ROOT functionality improves NNO considerably
• TFormula works well to pre-process samples
• Training can be run either from CINT/C++ or from an ASCII steering file
• A RhoNNO GUI and/or wizard is still missing
• RhoNNO comes as part of the Rho package, but can be used independently: installing the shared lib plus the headers is sufficient
• The NetworkTrainer application runs standalone
• Documentation: http://www.ep1.ruhr-uni-bochum.de/~marcel/RhoNNO.html