LCD Computing Setup
Server Specs and Setup • Desktop Cluster Organization • Physics Software Distribution • Plans and Schemes
Jeremy McCormick, Sergey Uzunyan, Guilherme Lima, et al.
Server Hardware Specs • Dual-CPU Athlon: 2.133 GHz • 2 GB RAM • Disks: 1 master, 1 secondary, 4 in a RAID array • ~1 TB total storage
Server Configuration • Red Hat 9 • kernel 2.4.20-13.9smp • hostname k2 • IP 131.156.85.141 • accessible via SSH
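For instance, logging in from a workstation (using the account name jeremy from the remote-job example later in these slides):

  # by hostname (requires k2 in DNS or /etc/hosts)
  ssh jeremy@k2
  # or directly by IP
  ssh jeremy@131.156.85.141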
Server Filesystem
Filesystem            Size   Mount     Description
/dev/hda1             13G    /         root fs
/dev/hda3             19G    /home     home dirs
/dev/hdb2             185G   /k2bkp    backup disk
/dev/lcd/k2dist       9.9G   /k2dist   physics distribution
/dev/lcd/k2work       145G   /k2work   project work dirs
/dev/raid0/lcd_data   734G   /k2data   project data
NFS Structure
Mount               Description
/k2dist             physics software distribution
/k2bkp              backup
/home/$USERNAME     user home directory
/k2work/$USERNAME   project work directory
/k2data/$USERNAME   project data
/sdisk/$MACHINE     workstation shared disks
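A desktop node might pick these up with /etc/fstab entries along these lines (a sketch only; the mount options are assumptions, and the slides do not show the actual export configuration on k2 -- /k2dist is mounted read-only here on the assumption that the shared distribution should not be writable from workstations):

  # /etc/fstab on a desktop node (sketch)
  k2:/k2dist   /k2dist   nfs   ro,hard,intr   0 0
  k2:/home     /home     nfs   rw,hard,intr   0 0
  k2:/k2work   /k2work   nfs   rw,hard,intr   0 0
  k2:/k2data   /k2data   nfs   rw,hard,intr   0 0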
Physics Software Distribution • simulation, analysis & event generation • shared binaries, libraries, scripts, includes, etc. • usable from any node in the desktop cluster • common environment & setup scripts (a quick sanity check is sketched below) • application builds possible in /k2work/$USERNAME • /k2dist isolates physics apps from the Linux system filesystem
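A quick sanity check from any node, using paths from these slides (root is one of the /k2dist/apps packages listed below):

  # load the common environment, then confirm shared binaries resolve
  . /k2dist/bin/prjenv.sh
  which root            # expected to resolve via a symlink under /k2dist/bin
  echo $LD_LIBRARY_PATH # should include the /k2dist library area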
/k2dist directory structure
Directory       Description          Contains
apps            applications         directory tree with packages, libs, binaries
bin             executables          shell scripts, binaries, symlinks to binaries
config          configuration info   currently contains the node list
doc             documentation        application docs (pdf, html, ps, etc.)
include         source includes      symlinks to include dirs
install_files   installation files   install packages (tar.gz) in dirs
lib             libraries            symlinks to libraries
/k2dist/apps
Analysis           jas, root
Event Generation   pandora-pythia, peg
Libraries          aida, boost, cernlib, clhep, freehep, g4phys, geant4, lcio, Mesa, pegs4, xml4c
Simulation         lcdg4, mokka, tbeam
Utilities          david, dawn
Useful /k2dist/bin Scripts
• prjenv.sh: project environment; add ". /k2dist/bin/prjenv.sh" to ~/.bash_profile (csh users: "source /k2dist/bin/prjenv.sh" in ~/.cshrc)
• appenv.sh: application environment; sets Java vars, PATH, LD_LIBRARY_PATH, etc.; included by prjenv.sh
• g4_5_2_p01env.sh: Geant4 environment vars; included by prjenv.sh
• nodes.sh: prints node names, IPs, hosts; for example, to ping each node once:
  for n in `nodes.sh hn`; do
    ping -c 1 $n
  done
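The slides list these scripts but not their bodies; a minimal sketch of what prjenv.sh plausibly does, given the /k2dist layout above (the variable names are assumptions):

  # /k2dist/bin/prjenv.sh (sketch; actual contents not shown in the slides)
  export K2DIST=/k2dist                                # assumed convenience variable
  export PATH=$K2DIST/bin:$PATH                        # shared binaries and scripts
  export LD_LIBRARY_PATH=$K2DIST/lib:$LD_LIBRARY_PATH  # shared library symlinks
  . $K2DIST/bin/appenv.sh          # application env (Java vars, etc.)
  . $K2DIST/bin/g4_5_2_p01env.sh   # Geant4 env vars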
Simple Remote Job

test_job.sh:
  #!/bin/bash
  # set project environment
  . /k2dist/bin/prjenv.sh
  # run job
  nohup testbeam -m /k2work/jeremy/run10.mac -o /k2data/jeremy/tb-test.txt &> /k2work/jeremy/tb-test.log &

start the job on node rio:
  [jeremy@lepton-physics jeremy]$ ssh rio /k2work/jeremy/test_job.sh

get its pid:
  [jeremy@lepton-physics jeremy]$ ssh rio pgrep testbeam
  22742

kill the job:
  [jeremy@lepton-physics jeremy]$ ssh rio kill 22742
k2 + Desktop Cluster • new user setup • common development platform • “standard” NICADD physics apps • no more desktop installs • ample storage area • shared datasets • centralized authentication • distributed computing
Plans and Schemes • no more development on nicadd & individual desktops • setup & packaging scripts • migrate SIO-Server • batch computing (fbs, pbs; a submission sketch follows) • physics software packages • framework applications • add more cluster nodes • world hegemony (or at least the Western hemisphere)
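For the batch-computing item above, submission under PBS might look like this sketch (queue setup and any resource flags are assumptions; the paths reuse the remote-job example):

  # tb-batch.sh: PBS version of the test_job.sh example (sketch)
  #PBS -N tb-test
  #PBS -j oe
  #PBS -o /k2work/jeremy/tb-test.log
  . /k2dist/bin/prjenv.sh
  testbeam -m /k2work/jeremy/run10.mac -o /k2data/jeremy/tb-test.txt

Submit with "qsub tb-batch.sh"; the batch system then picks the node, so no explicit ssh or nohup is needed.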