Introduction to the HARP Software (title-slide figure: reconstructed drift chamber triplets, MC data)
Contents • Account setup: Environment variables, aliases, scripts and directories • Packages • Domains Diagram • Getting and compiling a Package • jobOptions • GAUDI • Stores • Message Service • Histogram Service • Reconstruction • NdcTriplet • Example – Making an Ntuple from the NdcTriplets • The Algorithm source, jobOptions, compiling and running • Homework HARP SoftwareMalcolm Ellis
Account Setup • In your .cshrc (or equivalent) it is necessary to define some environment variables and source the standard HARP scripts: • setenv HARP /afs/cern.ch/exp/harp • setenv HARP1 $HARP/user1 • setenv HARP2 $HARP/user2 • setenv PACKAGES $HARP2/mellis/Packages • source /afs/cern.ch/exp/harp/Software/SCRIPTS/CMT.csh • setenv CMTCONFIG i386_linux22 • setenv CMTPATH ${CMTPATH}:${HARP_DEV} • A directory is needed to store the different packages that you will use. This can be created in $HARP1 or $HARP2: • e.g. $HARP2/mellis/Packages • A link to it from the home directory, called "mycmt", is needed: • cd • ln -s $HARP2/mellis/Packages mycmt
Log On heplnx2 25: ssh -l mellis lxplus.cern.ch mellis@lxplus.cern.ch's password: Last login: Mon Apr 23 16:40:32 2001 from 130.246.41.64 ******************************************************************************* * * * The LXPLUS Public Login Unix Service * * * * If you need help please contact HelpDesk@cern.ch tel 78888 * * * * http://cern.ch/ComputingRules : govern the use of CERN computing facilities * * * ******************************************************************************* IMPORTANT :: Consolidating efforts on LINUX LXPLUS has doubled its capacity, and the other PLUSes have been reduced to a minimal service. Noteably RSPLUS has stopped providing login facilities to the physics groups, and HPPLUS and DXPLUS only provide services to the LEP community. For more details please see the cern.computing news group or the latest CNL. ******************************************************************************* ------------- HARP Software Configuration (csh) ----------------------- HARP Software version v5 - CMT version v1r6 $CMTPATH is set to $HOME/mycmt:$HARP_DEV:$HARP_REL:$GAUDI_REL, packages will be searched in $CMTPATH and that order. ----------------------------------------------------------------------- [lxplus036] ~> HARP SoftwareMalcolm Ellis
Packages • Packages correspond to Folders in the Domains Diagram (next page) • The Domains Diagram can tell you which packages need to be updated if you change another. • Packages you are likely to encounter are: • ObjectCnv Decodes raw data from binary files • ObjyHarpEvent Accesses data from the database • HarpEvent Store of raw and reconstructed objects, etc. • HarpDD Harp Detector Description • HarpUI The event display • DetRep The detector representation from GEANT • DetResponse The detector response for simulation • Reconstruction The reconstruction
Domains Diagram (figure: the diagram of package domains and their dependencies is not reproduced in this text version)
Getting a Package • Packages are accessed using the "getpack" command, which handles CVS and CMT for us. • Packages are tagged in CVS as a major release (e.g. v5), a delta release (v5r1) or ongoing developments (a particular tag, or "head"). • The getpack command should be executed from the packages directory (that is, wherever mycmt is pointing). • To get the v5 release of Reconstruction: • getpack Reconstruction v5 • To get the v5r1 release of HarpEvent: • getpack HarpEvent v5r1 • To get the latest version of DetRep: • getpack DetRep v5r1 head • Unless you are developing specific Reconstruction code, you will use either a major or delta release. • It is not necessary to get every package that is used: if you are using, say, tests v5r1, then CMT will find the v5r1 versions of Reconstruction, ObjectCnv, etc. in the official location.
Getting a Package (II) • You can find out what the official versions are currently: • lvdev lists the current development versions • lvrel lists the current release versions • lvmycmt lists the versions that you have [lxplus036] ~> getpack Examples v5 checkout Examples version v5 tag cvs checkout: Updating v5 cvs checkout: Updating v5/Applications cvs checkout: Updating v5/Applications/BeamCounters U v5/Applications/BeamCounters/ObjectCnvDAQ.config U v5/Applications/BeamCounters/OptimizeBeamTofAlg.cpp ... cvs checkout: Updating v5/doc U v5/doc/release.notes cvs checkout: Updating v5/mgr U v5/mgr/requirements Creating setup scripts. Creating cleanup scripts. ================================================== type the following command: source $PWD/Examples/v5/mgr/setup.csh =================================================== HARP SoftwareMalcolm Ellis
Compiling a Package • Change to the mgr directory: • cd tests/v5/mgr/ • You can check which versions it will use: • cmt show uses • If all is well, compile: • gmake • If the package builds executables, it is necessary to nominate the particular executable you want, otherwise all of them will be compiled. • For example, in tests there is a series of Reconstruction tests, which can be compiled with: • gmake Reconstruction_UnitTest • Details of the applications are found in the "requirements" file in the mgr directory. • Do not believe a message that says everything is OK. When compiling a library, check that the lib<libraryname>.a and lib<libraryname>.so files have been produced in the i386_linux22 directory!
Applications • Quite different from the normal FORTRAN/C way of doing things. • The executable that is compiled contains only a small amount of code, needed to start up GAUDI and read in a jobOptions file. • The jobOptions file specifies a set of libraries and algorithms. • The libraries are loaded at run time and searched for the listed algorithms. • Assuming all the algorithms are found, they are executed in the order specified in the jobOptions file. • GAUDI handles the event loop, and thus an Algorithm must conform to a specific design (discussed later; a bare-bones sketch follows below). • An algorithm can have user-defined parameters (e.g. input file name, output ntuple name, number of events to process, chi2 cuts, etc.); these can be set in the jobOptions file. • Algorithms should use the GAUDI output stream so that verbosity can be controlled with the jobOptions file.
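To make the "specific design" concrete before the full worked example later in these slides, here is a bare-bones sketch of the shape every GAUDI Algorithm takes: a constructor of the fixed form plus the three mandatory member functions that GAUDI calls for you. The class name MySkeletonAlg is purely illustrative; a Factory declaration (shown later with NtupleTripletsAlg) is also needed so the library can be searched for the algorithm at run time.

    #include "GaudiKernel/Algorithm.h"
    #include <string>

    class MySkeletonAlg : public Algorithm {          // illustrative name only
    public:
      MySkeletonAlg( const std::string& name, ISvcLocator* pSvcLocator )
        : Algorithm( name, pSvcLocator ) {}
      StatusCode initialize();   // called once, before the event loop
      StatusCode execute();      // called once per event by GAUDI
      StatusCode finalize();     // called once, after the event loop
    };

    StatusCode MySkeletonAlg::initialize() { return StatusCode::SUCCESS; }
    StatusCode MySkeletonAlg::execute()    { return StatusCode::SUCCESS; }
    StatusCode MySkeletonAlg::finalize()   { return StatusCode::SUCCESS; }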
jobOptions file • The default name is jobOptions.txt, but the file can be named anything • Typical names used in Reconstruction tests are jobOptions.raw and jobOptions.mc • Example: • The header block is pure comments; C++ style is used, so anything after // is ignored. • Include the default options: • Number of events to process: //############################################################## // // Job options file of the // Reconstruction - CreateTpcClusters Unit Test // from Monte Carlo // //======================================================= #include "../../../Common/UnixStandardJob.txt" //-------------------------------------------------------------- // Event related parameters //-------------------------------------------------------------- // Number of events to be processed (default is 10) ApplicationMgr.EvtMax = 1;
jobOptions continued... • Declare any services that are needed: • Declare the libraries that are loaded at runtime (DLLs): • Declare the algorithms in the order they will be executed: //-------------------------------------------------------------- // Private application configuration options //-------------------------------------------------------------- ApplicationMgr.ExtSvc += { "HarpDDSvc", // Should be in this order "DetRepSvc" }; ApplicationMgr.DLLs += { "Reconstruction" }; ApplicationMgr.TopAlg += { "TpcClusterAlg", "TpcTrackFindAlg", "FitTpcTracksAlg", "CreateSummariesAlg" };
jobOptions continued... • Some services may require their own options: • You can set the verbosity for the whole program and then modify it for individual algorithms: • When producing histograms or ntuples: HarpDDSvc.DetDataFiles = { "$HARP_FILES/$HARP_SOFT_VERS/Geometry/materials.geom", "$HARP_FILES/$HARP_SOFT_VERS/Geometry/rotations.geom", "$HARP_FILES/$HARP_SOFT_VERS/Geometry/Harp.geom" }; //-------------------------------------------------------------- // Output level 1=VERBOSE, 2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL //-------------------------------------------------------------- MessageSvc.OutputLevel = 6; TpcTrackFindAlg.OutputLevel = 2; CreateSummariesAlg.OutputLevel = 3; //-------------------------------------------------------------- // Histogram output file //-------------------------------------------------------------- HistogramPersistencySvc.OutputFile = "histo.hbook"; NTupleSvc.Output = { "FILE1 DATAFILE='tuple.hbook' OPT='NEW' TYP='HBOOK'" }; HARP SoftwareMalcolm Ellis
More jobOptions • Private options for the Algorithms: • Especially when using data, there will be other options, to select a run range, beam conditions, etc. • Finally, more comments: //-------------------------------------------------------------- // Algorithms private options //-------------------------------------------------------------- TpcTrackFindAlg.pointsForCalculation = 3; TpcTrackFindAlg.minimumNrPoints = 3; TpcTrackFindAlg.radius = 1000.0; TpcTrackFindAlg.openingAngle = 6.0; //============================================================== // // End of job options file of the // Reconstruction - CreateTpcClusters Unit Test // from Raw data // //############################################################## HARP SoftwareMalcolm Ellis
Running an Application • The compiled executable lives in a directory called i386_linux22. • The program should be run from the directory that contains the jobOptions file. • It is necessary to first source a setup script. In principle there is one for sh (and similar shells) and a different one for csh (and similar); however, the csh one doesn't work, so I change to sh to run: • sh • source ../../../mgr/setup.sh • ../../../i386_linux22/Reconstruction_UnitTest.exe jobOptions.raw • If you run sh, zsh, etc., it is necessary to source this script only once per session; sourcing it multiple times will cause problems! • A missing library will produce an error like: • System::loadDynamicLib>libReconstruction.so: cannot open shared object file: No such file or directory • A missing Algorithm (for example a typo in the jobOptions file) looks like: • System::loadDynamicLib>libTypoAlg.so: cannot open shared object file: No such file or directory
GAUDI • GAUDI provides the framework for the HARP software. • It handles many aspects of the execution of an application, such as: • Processing the jobOptions file • Loading the libraries at run time • Providing the event loop control • Passing user-defined options to Algorithms when they are instantiated • Providing a message service, which allows the user to control the verbosity of the Algorithms being executed • Providing a histogramming service, which includes ntuples • Providing memory stores, which are used to hold instances of various classes for an event, or for an entire run • GAUDI is not the only supporting library; others include: • GEANT4, used for the detector description in the Event Display and Reconstruction, and of course for the Simulation • CLHEP, a library of general classes useful for high energy physics. One example is the definition of units, so quantities can be expressed as 5*mm or 5.2*GeV, etc. (see the sketch below) • Objectivity, which is, of course, the database
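As a quick illustration of the CLHEP units just mentioned, the fragment below is a minimal sketch (variable names invented for the example). Values are stored in CLHEP's internal unit system; multiplying by a unit on input and dividing by a unit on output keeps the bookkeeping consistent. Depending on the CLHEP version in use, the unit symbols may need a CLHEP:: namespace prefix.

    #include "CLHEP/Units/SystemOfUnits.h"   // defines mm, cm, MeV, GeV, ...
    #include <iostream>

    int main() {
      double wirePitch    = 5.0*mm;    // illustrative quantities only
      double beamMomentum = 5.2*GeV;
      // Divide by the desired unit to express a value in that unit:
      std::cout << "pitch = " << wirePitch/cm     << " cm, "
                << "p = "     << beamMomentum/MeV << " MeV" << std::endl;
      return 0;
    }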
Stores • There are two stores used in the HARP environment, the Event Store and the Detector Store: • The Event Store is used to hold objects which are only relevant for a single event, for example, NdcHits, MwpcTracks, ParticleSummaries, etc… A vector of pointers to instances of a given class can be sent to the store and named with a string. The convention in the HARP software is that this string is defined in a .h file as part of a namespace and so any errors are detected at compile time, rather than run time. A corresponding system is used to retrieve a pointer to the vector. • The Detector store is used to hold objects which are to persist for a given processing run. Typical examples are calibration and alignment classes, defined in HarpDD/CalibAlign such as NdcWire, which holds information about the geometrical and electronic specifications of a particular wire in the NOMAD drift chambers. HARP SoftwareMalcolm Ellis
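To make the Event Store conventions above concrete, here is a minimal sketch of retrieving and registering collections from inside an Algorithm. The retrieval mirrors the real NtupleTripletsAlg example shown later (and needs the Reconstruction includes shown there); the algorithm name SomeAlg, the key "MyNewKey" and the type MyObjectVector are hypothetical placeholders.

    #include "GaudiKernel/SmartDataPtr.h"

    StatusCode SomeAlg::execute() {
      MsgStream log( msgSvc(), name() );

      // Retrieve a vector that an earlier Algorithm registered in the Event Store,
      // using the string key defined in the event model namespace:
      SmartDataPtr<NdcTripletVector> triplets( eventSvc(),
                                               EventModel::PrivateRec::NdcTriplets );
      if( !triplets ) {
        log << MSG::WARNING << "No NdcTriplets in the Event Store" << endreq;
      }

      // Registering your own collection would look something like this
      // (key and type are hypothetical):
      //   MyObjectVector* vec = new MyObjectVector();
      //   eventSvc()->registerObject( "MyNewKey", vec );

      return StatusCode::SUCCESS;
    }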
Message Service • The message service should be used instead of the usual cout. • The service can be accessed through a declaration like: • MsgStream log( msgSvc(), name() ); • This declares a stream "log" which can be used instead of cout. It differs from cout in that the first thing you pass is a verbosity level: • log << MSG::INFO << "This is for information" << endreq; • The different levels are: • MSG::VERBOSE • MSG::DEBUG • MSG::INFO • MSG::WARNING • MSG::ERROR • MSG::FATAL • A short sketch of how these interact with the OutputLevel setting follows below.
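The sketch below shows how the levels interact with the OutputLevel jobOptions setting: messages below the configured level are suppressed, the rest are printed. The algorithm name SomeAlg is illustrative.

    // In SomeAlg::execute(), with "SomeAlg.OutputLevel = 3;" (INFO) in the jobOptions:
    MsgStream log( msgSvc(), name() );
    log << MSG::DEBUG   << "per-hit detail"                     << endreq;  // suppressed (2 < 3)
    log << MSG::INFO    << "event processed"                    << endreq;  // printed
    log << MSG::WARNING << "missing calibration, using default" << endreq;  // printed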
Histogram Service • The Histogram Service provides a reasonably easy way of producing histograms and ntuples from within an algorithm. • In either case, one or more private member variables of the algorithm must be included in the class definition. These variables are then used to declare the histogram or ntuple in the initialize() function and to fill it in the execute() function. • Ntuples are written at the end of each event, in the execute() function, while histograms are written once, at the end of processing, in the finalize() function. • The name of the output hbook file is defined in the jobOptions. • There are a number of examples of the use of histograms and ntuples in the Examples, Applications and tests packages (a small histogram sketch follows below; the ntuple case is worked through in detail later).
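Because the worked example later in these slides uses only ntuples, here is a hedged sketch of the histogram case. The member name, directory path, histogram ID and binning are illustrative, and the exact book() signature should be checked against the Examples package for the GAUDI version in use.

    // In the class definition (illustrative private member):
    //   IHistogram1D* m_hTripX;

    StatusCode SomeAlg::initialize() {
      // Book a 1D histogram: directory path, ID, title, number of bins, low edge, high edge
      m_hTripX = histoSvc()->book( "/stat/ndc", 10, "Triplet x position",
                                   100, -1000.0, 1000.0 );
      return StatusCode::SUCCESS;
    }

    StatusCode SomeAlg::execute() {
      double x = 0.0;          // would come from a reconstructed object
      m_hTripX->fill( x );     // accumulate; the file is written once at the end of the job
      return StatusCode::SUCCESS;
    }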
Reconstruction • Contains the Algorithms to reconstruct the raw data, and the definitions of the classes that are produced by these Algorithms. • Each subdetector is reconstructed individually, into hits, clusters, tracks, etc. • "HarpParticle"s are made by matching tracks, clusters, etc. from a single particle. • Summary Objects (which are defined in HarpEvent/Rec and therefore visible to the database and event display) are created from the reconstructed objects. These objects form the basis for a future DST. • Currently 4 detectors are reconstructed: • Beam MWPCs, produces hits, 3D points and tracks • TPC, produces clusters and basic tracks • NDC, produces hits, duplets, triplets, tracks and summary objects • CAL, produces clusters in the ECAL, HCAL, Muon ID and Cosmic Wall. • HarpParticles are seeded from NdcTracks, and CAL clusters are added if possible. ParticleSummaries are made from the HarpParticles.
NdcTriplet • NdcTriplet is an example of a class representing a reconstructed object: class NdcTriplet : public NdcPoint { public: // Constructor NdcTriplet( NdcHit*, NdcHit*, NdcHit* ); // Retrieve pointer to class definition structure virtual const CLID& clID() { return NdcTriplet::classID(); }; static const CLID& classID() { return CLID_NdcTriplet; }; // Determine the best residual of this triplet double Residual() const; // Return the position of the triplet Hep3Vector Position() const; // Return the uncertainty in the triplet position // RMatrix Resolution(); // Compare the position of the triplet with an extrapolated point bool CheckPosition( Hep3Vector ) const; // The Hits NdcHit* UHit() const; NdcHit* XHit() const;
NdcTriplet (II) NdcHit* VHit() const; // The sign of the hit in the U plane int USign() const; // The sign of the hit in the X plane int XSign() const; // The sign of the hit in the V plane int VSign() const; // The chamber this triplet is in NdcChamber* Chamber() const; // Has this triplet been used in a track? bool IsUsed() const; // Designate this triplet as being used void SetUsed(); // Keep this triplet void Keep() const; // Check that all the hits are free to be used in this triplet bool CheckHits() const; // and so on… HARP SoftwareMalcolm Ellis
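As a brief illustration of how these accessors might be used when scanning triplets (the cut value is invented for the example; the real selection lives in the Reconstruction package):

    // Assumes a NdcTripletVector* called triplets has been retrieved from the Event Store.
    for( NdcTripletVector::const_iterator it = triplets->begin();
         it != triplets->end(); ++it ) {
      NdcTriplet* trip = *it;
      if( trip->IsUsed() ) continue;             // skip triplets already assigned to a track
      Hep3Vector pos = trip->Position();         // position of the triplet
      double res     = trip->Residual();         // best residual of the triplet
      if( res < 1.0*mm && trip->CheckHits() ) {  // illustrative cut only
        trip->SetUsed();                         // mark the triplet as used
      }
    }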
Example – Ntuple from NdcTriplets • In this example, we will write an Algorithm to create an ntuple holding the positions of the NdcTriplets. • We will then write a jobOptions file that allows us to use the Reconstruction algorithms to create the Triplets and then call the new Algorithm. • NtupleTripletsAlg • NtupleTripletsAlg.h – defines the class and declares the private member variables which will be used to create the Ntuple. • NtupleTripletsAlg.cpp – definition of the member functions: the initialize function is executed once, before the event loop starts, and sets up the ntuple. The execute function is run once per event and will be used to read in the triplets and fill the ntuple. The finalize function is executed at the very end and in this case does very little.
NtupleTripletsAlg.h // -*- C++ -*- #ifndef EXAMPLE_NTUPLETRIPLETSALG_H #define EXAMPLE_NTUPLETRIPLETSALG_H 1 //! NtupleTripletsAlg // author: M.Ellis // // ! Include needed files from FW // these allow the Algorithm to be run // and give us the ability to make // histograms and ntuples #include "GaudiKernel/Algorithm.h" #include "GaudiKernel/Property.h" #include "GaudiKernel/MsgStream.h" #include "GaudiKernel/IHistogramSvc.h" #include "GaudiKernel/IAxis.h" #include "GaudiKernel/IHistogram1D.h" #include "GaudiKernel/IHistogram2D.h" #include "GaudiKernel/NTuple.h" // streams and such #include <string> #include <iostream> #include <fstream> #include <strstream> #include <vector> //! ClassName: NtupleTripletsAlg class NtupleTripletsAlg : public Algorithm { public: // !Constructor of this form must be provided NtupleTripletsAlg( const std::string& name, ISvcLocator* pSvcLocator ); // !Three mandatory member functions of any algorithm StatusCode initialize(); StatusCode execute(); StatusCode finalize(); private: int m_initialized ; double m_scale; // NdcTriplet NTuple (1) NTuple::Item<long> m_num_trip; NTuple::Array<float> m_trip_x; NTuple::Array<float> m_trip_y; NTuple::Array<float> m_trip_z; }; #endif // EXAMPLE_NTUPLETRIPLETSALG_H HARP SoftwareMalcolm Ellis
NtupleTripletsAlg.cpp • There is the usual set of #includes due to GAUDI, etc. • In addition, you need: • #include "NtupleTripletsAlg.h" • #include "Reconstruction/TopLevel/PrivateRecEventModel.h" • #include "Reconstruction/Ndc/NdcTriplet.h" • It is then necessary to declare a Factory: • static const AlgFactory<NtupleTripletsAlg> Factory; • const IAlgFactory& NtupleTripletsAlgFactory = Factory; • Next, a constructor: • Now, we can write the member functions… NtupleTripletsAlg::NtupleTripletsAlg(const std::string& name, ISvcLocator* ploc) : Algorithm(name, ploc) { //------------------------------------------------------------------------------ m_initialized = false; }
NtupleTripletsAlg.cpp (II) • initialize() will set up the ntuple: //------------------------------------------------------------------------------ StatusCode NtupleTripletsAlg::initialize() { //------------------------------------------------------------------------------ // avoid calling initialize more than once if( m_initialized ) return StatusCode::SUCCESS; MsgStream log(msgSvc(), name()); // Declare Algorithm Properties declareProperty( "Scale", m_scale = 1.0 ); // Produce NTuple NTuple::Directory *dir=0; NTupleFilePtr file1(ntupleSvc(),"/NTUPLES/FILE1"); if(file1) { if(!(dir=ntupleSvc()->createDirectory("/NTUPLES/FILE1/NDC"))) log << MSG::ERROR << "Cannot create directory /NTUPLES/FILE1/NDC" << endreq; } else log << MSG::ERROR << "Cannot open NTUPLE file" << endreq;
NtupleTripletsAlg.cpp (III) NTuplePtr nt(ntupleSvc(),"/NTUPLES/FILE1/NDC/1"); if(!nt) { nt = ntupleSvc()->book( dir, 1, CLID_ColumnWiseTuple, "NdcTriplets" ); if( nt ) { nt->addItem( "numtrip", m_num_trip, 0, 1000 ); nt->addItem( "x", m_num_trip, m_trip_x ); nt->addItem( "y", m_num_trip, m_trip_y ); nt->addItem( "z", m_num_trip, m_trip_z ); } else { log << MSG::FATAL << " Unable to create NTUPLE" << endreq; return StatusCode::FAILURE; } } m_initialized = true; return StatusCode::SUCCESS; } HARP SoftwareMalcolm Ellis
NtupleTripletsAlg.cpp (IV) • execute() has a number of tasks to perform; first, it must retrieve the vector of NdcTriplets: • Then we can loop over the Triplets and fill the Ntuple… //------------------------------------------------------------------------------ StatusCode NtupleTripletsAlg::execute() { //------------------------------------------------------------------------------ MsgStream log( msgSvc(), name() ); log << MSG::INFO << " NtupleTripletsAlg executing...." << endreq; // Get Triplet vector SmartDataPtr<NdcTripletVector> ReadNdcTriplets( eventSvc(), EventModel::PrivateRec::NdcTriplets ); if( !ReadNdcTriplets ) { log << MSG::WARNING << " Unable to retrieve NdcTriplets " << endreq; m_num_trip = 0; } else {
NtupleTripletsAlg.cpp (V) // Loop over all Triplets NdcTripletVector::const_iterator titer; int i = 0; // counter for the ntuple index for( titer = ReadNdcTriplets->begin(); titer != ReadNdcTriplets->end(); ++titer ) { Hep3Vector pos; // Get the position of this triplet pos = (*titer)->Position(); // Fill the ntuple elements m_trip_x[i] = pos.x(); m_trip_y[i] = pos.y(); m_trip_z[i] = pos.z(); ++i; } m_num_trip = i; }
NtupleTripletsAlg.cpp (VI) • Now we can write the Ntuple for this event: • The finalize() method does very little: // Write the NdcTriplet NTUPLE ntupleSvc()->writeRecord("/NTUPLES/FILE1/NDC/1"); return StatusCode::SUCCESS; } //----------------------------------------------------------------------------- StatusCode NtupleTripletsAlg::finalize() { //----------------------------------------------------------------------------- MsgStream log(msgSvc(), name()); log << MSG::INFO << "finalizing...." << endreq; m_initialized = false; return StatusCode::SUCCESS; }
How to include the Algorithm • There are two ways of doing this: • Include the Algorithm in a test; in principle this would be the most logical thing to do, but as our Algorithm directly accesses private classes from Reconstruction, it will not be allowed to compile in tests. • So, we shall include it in the Reconstruction library. • Step 1, get the current version of Reconstruction: • cd ~/mycmt • getpack Reconstruction v5r1 • Step 2, find the directory where the Ndc code lives: • cd Reconstruction/v5/src/Reconstruction/Ndc/ • Step 3, now either create the NtupleTripletsAlg.* files from scratch or copy them from my directory ~mellis/public/HarpUK/ • Step 4, get the current version of tests: • cd ~/mycmt/ • getpack tests v5r1 • Now, find the NDC UnitTest: • cd tests/v5/UnitTests/Reconstruction/CreateNdcTriplets/ • This is where the jobOptions file will go.
Compiling the Algorithm • It is now time to compile the Reconstruction library: • cd ~/mycmt/Reconstruction/v5/mgr/ • gmake • Next, we want to compile the Reconstruction test: • cd ~/mycmt/tests/v5/mgr/ • Note that if we just typed gmake here, it would compile all of the tests, which would take some time, so instead we nominate a particular test that is listed in the requirements file: • gmake Reconstruction_UnitTest • Assuming all has gone well, you should now have a library with your algorithm in it, and a test program that can be used with a new jobOptions file…
New jobOptions file #include "../../../Common/UnixStandardJob.txt" //-------------------------------------------------------------- // Event related parameters //-------------------------------------------------------------- // Number of events to be processed (default is 10) ApplicationMgr.EvtMax = 500; //-------------------------------------------------------------- // Private application configuration options //-------------------------------------------------------------- ApplicationMgr.ExtSvc += { "HarpDDSvc", // Should be in this order "DetRepSvc" }; ApplicationMgr.DLLs += { "Reconstruction", "ObjectCnv" }; ApplicationMgr.TopAlg += { "LoadNomadCAAlg", "LoadRawFromDAQAlg", "CreateNdcHitsAlg", "CreateNdcHitsFromEventHitsAlg", "CreateNdcTripletsAlg", "NtupleTripletsAlg" }; HarpDDSvc.DetDataFiles = { "$HARP_FILES/$HARP_SOFT_VERS/Geometry/materials.geom", "$HARP_FILES/$HARP_SOFT_VERS/Geometry/rotations.geom", "$HARP_FILES/$HARP_SOFT_VERS/Geometry/Harp.geom", "$HARP_FILES/$HARP_SOFT_VERS/Geometry/Ndc.geom" };
New jobOptions file (II) //-------------------------------------------------------------- // Output level 1=VERBOSE, 2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL //-------------------------------------------------------------- MessageSvc.OutputLevel = 6; //-------------------------------------------------------------- // Histogram output file //-------------------------------------------------------------- HistogramPersistencySvc.OutputFile = "histo.hbook"; NTupleSvc.Output = { "FILE1 DATAFILE='tuple.hbook' OPT='NEW' TYP='HBOOK'" }; //-------------------------------------------------------------- // Algorithms private options //-------------------------------------------------------------- // NtupleTripletsAlg.Scale = 0.5; LoadNomadCAAlg.MonteCarlo = false; LoadRawFromDAQAlg.RawDataSource = "$HARP_FILES/$HARP_SOFT_VERS/Raw/Evb0_2020.dat"; CreateNdcHitsAlg.ConfigFile = "$HARP_FILES/$HARP_SOFT_VERS/Electronics/ObjectCnvDAQ_CR.config"; HARP SoftwareMalcolm Ellis
Running the Example • Either type in the jobOptions.example file, or copy it from my HarpUK directory. • Change to sh (or similar), source the setup.sh and run the executable: • sh • source ../../../mgr/setup.sh • ../../../i386_linux22/Reconstruction_UnitTest jobOptions.example • At the moment, it will produce very little output, as the jobOptions file includes the line “MessageSvc.OutputLevel = 6;” • You can make the new Algorithm more verbose by adding a line: • NtupleTripletsAlg.OutputLevel = 3; • Now, depending on the verbosity chosen, output produced through the message service in the new algorithm can be seen. HARP SoftwareMalcolm Ellis
Homework • We have defined a user option "Scale" for NtupleTripletsAlg; however, it hasn't been used so far. • Change one or more variables in the Ntuple so that the value depends on Scale (my idea was to scale the Triplet position before storing it in the Ntuple, hence the name; a sketch of one possible approach follows below). • Uncomment the line in the jobOptions file where the Scale is defined and check that it has the desired effect. • Remember that Scale is just a label for the jobOptions; the actual value will be stored in the variable m_scale that is defined in the .h file. • An example of using the event display is the SystemTest "FullChain"; it is necessary to compile the FullChain executable and then run it with the appropriate jobOptions file. • It is possible to run the Event Display from the UK over the network, but beware: it is slow! • Once the display comes up, wait until the mouse symbol changes and then click anywhere on the black window; this should result in HARP being drawn. • You can now read and reconstruct an event with the "Next" button.
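Returning to the Scale option of the homework: for reference, here is a minimal sketch of one possible way it could enter the filling loop in NtupleTripletsAlg::execute(). It is only one answer to the exercise, not the required one.

    // Inside the triplet loop of execute(); m_scale is bound to the
    // jobOptions label "Scale" by the declareProperty() call in initialize().
    m_trip_x[i] = m_scale * pos.x();
    m_trip_y[i] = m_scale * pos.y();
    m_trip_z[i] = m_scale * pos.z();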