Brunel Grid Activities Report Peter van Santen Distributed and Grid Computing Group peter.van.santen@brunel.ac.uk
Background
Dept. of Electronic and Computer Engineering
• PP group (detectors, CMS, BaBar, etc.) 7+
• DC group (parallel systems, cluster computing, etc.) 4+
• 'Grid computing interest group' formed June 2001 (common interest, ad hoc)
• Grid effort so far unfunded
• Applied for SRIF funding as part of the London eScience Consortium (£0.75m)
• BITlab to support collaborative projects across a range of disciplines: 48 (64) dual-processor node cluster, 4 TB storage, 1 Gbit internal network and 1 Gbit external link
BITLab: Brunel Information Technology Laboratory
Components shown in the lab diagram:
• 1 Gbit central switch
• Evans & Sutherland 3D image generator
• Elumens vision station
• 1 TB data store
• Video wall server and video wall
• 48-64 node (2× XEON per node) cluster on a 1 Gb copper network switch
• Graphics workstations
• 1 Gbit fibre to the Computer Centre (currently 100 Mb; 1 Gb/2.5 Gb?)
Resources
Staff
• 5+ academic (part-time effort)
• 1 RF: CMS
• 2 RAs: DataTAG, BaBar
• 2 postgrads: DataGrid, cluster
Equipment (present) to support the testbed
• 2× Athlon 700 MHz, VIA chipset, RH6.2 (kernel 2.2.19)
• 1× dual PIII 800 MHz, 440GX, RH6.2
• 24× PIII 800 MHz, 815 chipset, 370SSA, RH6.2 (currently used for teaching)
• ?× P4 1.8 GHz, P4SBA, 845 chipset, RH7.1
• 1× dual P4 (2 Xeon/die) 1.8 GHz, 0.5 GB, P4CD6+, 860 chipset, RH7.1, kernel 2.4.2 (engineering release)
Planning
Testbed support
• Testbed compliance (across the range of hardware): May/July (Supermicro/Intel support)
• Testbed active: Dec (2001) → Feb
• Cluster prototyping: May → June (testbed)
• Cluster operational: September (testbed)
Skill transfer (Linux, Globus, testbed, etc.): 3 (pt) → 3 (pt) + 4 (ft), May/June
Progress
• Sept 2001: Globus on RH6.2 (kernel 2.2.19), VIA and 815 machines
• Developed infrastructure to support the testbed
• Agreement for QoS at 100 Mb/s to the external network
• Agreement for a subnet outside the University Computer Centre firewall (reachability sketch below)
• Successful measurements on i860 dual Prestonia (Jackson) (see Richard Hughes-Jones, DataGrid WP7)
• 2 machines (Athlon 800 MHz, 0.5 GB) housed with the main switch; testbed software being installed
• Main effort to date unfunded; 2.5 funded posts
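The firewall and subnet agreements above were a precondition for installing the testbed software. Below is a minimal reachability sketch along those lines, assuming placeholder hostnames and the standard Globus service ports (2119 for the GRAM gatekeeper, 2811 for GridFTP); it is not the testbed installation procedure itself.

#!/usr/bin/env python
"""Quick reachability check for Globus service ports on testbed hosts.

Hostnames below are placeholders, not the real Brunel machine names.
Port 2119 is the standard Globus GRAM gatekeeper port; 2811 is GridFTP.
"""
import socket

HOSTS = ["testbed01.example.ac.uk", "testbed02.example.ac.uk"]  # placeholders
PORTS = {"GRAM gatekeeper": 2119, "GridFTP": 2811}

def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect((host, port))
        return True
    except (socket.error, socket.timeout):
        return False
    finally:
        s.close()

if __name__ == "__main__":
    for host in HOSTS:
        for name, port in PORTS.items():
            status = "open" if port_open(host, port) else "blocked/closed"
            print("%-30s %-16s %5d  %s" % (host, name, port, status))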
Status: Grid Computing Group
• 4 academic staff active
• 5 research staff active
• 2 machines testbed-ready
• 24 machines compliant (available outside teaching)
• Higher-performance boards being evaluated
• End Feb: RHJ and PvS, 2-day evaluation of the Plumas chipset (performance/compliance; throughput sketch below)
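The board/chipset evaluation centres on network and memory throughput. The real measurements use dedicated tools (see Richard Hughes-Jones, DataGrid WP7); the sketch below is only a crude illustrative TCP throughput test between two nodes, with the port number and transfer size chosen arbitrarily.

#!/usr/bin/env python
"""Crude TCP throughput test between two testbed nodes.

Illustrative sketch only; the measurements reported to DataGrid WP7
were made with dedicated tools, not this script.
Run with 'server' on one node, then 'client <server-host>' on the other.
"""
import socket
import sys
import time

PORT = 5001                  # arbitrary test port, not a Globus service port
BLOCK = 64 * 1024            # 64 KB send/receive buffer
TOTAL = 256 * 1024 * 1024    # transfer 256 MB per test

def server():
    """Accept one connection and discard everything sent to it."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    s.listen(1)
    conn, addr = s.accept()
    received = 0
    while True:
        data = conn.recv(BLOCK)
        if not data:
            break
        received += len(data)
    print("received %d bytes from %s" % (received, addr[0]))
    conn.close()

def client(host):
    """Send TOTAL bytes and report the achieved throughput in Mbit/s."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((host, PORT))
    payload = b"x" * BLOCK
    start = time.time()
    sent = 0
    while sent < TOTAL:
        s.sendall(payload)
        sent += BLOCK
    s.close()
    elapsed = time.time() - start
    print("%.1f Mbit/s (%d bytes in %.2f s)" % (sent * 8 / elapsed / 1e6, sent, elapsed))

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])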
Summary
• Overcame infrastructure problems with connectivity
• Steep learning curve for new staff
• August start-up difficult, mainly due to documentation
• Installation problems during Aug/Sept due to chipsets, etc.
• Lack of known, publicised standards
• In house mainly RH 7.1, Mandrake 8.0 (Scyld PVM cluster)
• RH6.2 compliance resolved (in most cases)
• Funding and funded posts
• Strategy: HP (high-performance) motherboards (cluster, etc.) in parallel with the testbed
Support: DataGrid, DataTAG, CMS, BaBar, etc.
Collaborative effort to improve technology and information transfer