LHCb(UK) Computing/Grid: RAL Perspective
Glenn Patrick (g.n.patrick@rl.ac.uk), 08.06.00

Central UK Computing (what is hoped for)
JIF bid - Prototype UK national computing centre (Tier 1) for all 4 LHC experiments - outcome known in ~November.

Integrated Resources        2001   2002   2003
Processors (PC99-450MHz)     830   1670   3100
Disk (TB)                     25     50    125
Tape (TB)                     67    130    330
What exists now? RAL-CSF: Main LHCb Platform
• Currently, 160*P450 equivalent processors.
• Hope to expand to ~300*P450 in September.
• Linux Red Hat 6.1 being phased in on all machines (HPs being shut down) to give compatibility with CERN (e.g. lxplus).
• PBS (Portable Batch System), not NQS - a sample job script is sketched below.
• 1TB+ of robotic tape space for LHCb.
• 500GB+ of disk space for LHCb (need to request).
• Globus toolkit v1.1.1 installed on front-end (with testbed service on another machine).
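To give a flavour of PBS, a minimal job script might look like the following. This is a sketch only: the queue name, resource limit and executable are illustrative, not the actual RAL-CSF configuration.

    #!/bin/sh
    #PBS -q lhcb                # queue name (hypothetical)
    #PBS -l cput=12:00:00       # CPU-time limit for the job
    #PBS -N sicbmc-test         # job name

    # Run from the directory the job was submitted from.
    cd $PBS_O_WORKDIR

    # Illustrative payload: run the SICBMC Monte Carlo executable.
    ./sicbmc < sicbmc.input > sicbmc.log

Such a script would be submitted with "qsub script.sh" and monitored with "qstat".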
[Diagram: RAL Particle Physics Unix Services - HP and Linux batch farms, HP/Linux/SUN interactive front-ends, NIS userids, AFS, /home and scratch on a disk farm (n TB), all on a 100 Megabit switched network (FDDI), with access to the DataStore tape robot.]
LHCb Software
• LHCb software stored in 4GB AFS project space: /afs/rl.ac.uk/lhcb
• Updated just after midnight every night.
• CMT/CVS installed (although no remote updating to the CERN repository) - a checkout/build sketch follows below.
• Crude LHCb environment at the moment, but managed to process events through SICBMC with little knowledge of LHCb software.
• Available for LHCb to exploit for detector, physics & Grid(?) studies.
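For illustration, fetching and building a package with CVS/CMT might look like this. A sketch only: the repository path, package name and version directory are hypothetical.

    # Check out a package from the local CVS repository (path hypothetical).
    cvs -d /afs/rl.ac.uk/lhcb/cvs checkout Sicb

    # Configure and build it with CMT (version directory hypothetical).
    cd Sicb/v1r0/cmt
    cmt config
    source setup.csh      # sets up the run-time environment
    gmake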
MC Production: RAL NT Farm
• 18*450MHz PII + 9*200MHz Pentium Pro.
• LHCb front-end in addition to dual-CPU front-end.
• Production capacity 100k-200k events/week.
• 500k bb events processed so far and stored in RAL DataStore (roughly 3-5 weeks of running at the quoted rate).
• Events now transferred over the network to CERN using the RAL VTP protocol instead of DLTs.
• Thanks to Dave Salmon, Eric van H & Chris Brew.
• Latest production code being installed (DS).
[Diagram: RAL NT Farm - new front-end and extra batch nodes. A 100Mb/s switch connects the farm to the LAN & WAN; components include the Front End/PDC, batch nodes (new systems added to the original set), a file server (4 + 4 GB), BDCs, an 18GB DAT drive and peripherals.]
Grid Developments
There is now a "CLRC Team" for the particle physics Grid, plus several work groups (GNP represents LHCb, with CAJB also a member).
• Important that this is beneficial for LHCb.
• EU (DataGrid) application to distribute 10^7 events & 3TB using MAP/RAL/... does not start production until 2002.
• Need to start now and acquire practical experience and expertise to decide the way forward.
Grid Developments II
Meetings:
• 14th June (RAL): Small technical group to discuss short-term LHCb aims, testbeds, etc. (CERN, RAL, Liverpool, Glasgow, ...)
• 21st June (RAL): Globus Toolkit User Tutorial.
• 22nd June (RAL): Globus Toolkit Developer Tutorial.
  Tutorials open to all, register at http://www.globus.org/news/uk-registration.html
• 23rd June (RAL): Globus "strategy" meeting (invitation/nomination).
Basic Globus usage is sketched below.
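Ahead of the tutorials, the basic usage pattern of the Globus toolkit is roughly as follows. A sketch only, assuming a valid Grid certificate is already installed; the gatekeeper host name is hypothetical, and exact command availability depends on the toolkit version.

    # Create a short-lived proxy credential from your Grid certificate.
    grid-proxy-init

    # Run a simple job on a remote GRAM gatekeeper, streaming the
    # output back (host name hypothetical).
    globusrun -o -r gatekeeper.rl.ac.uk "&(executable=/bin/hostname)"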
Which Grid Topology for LHCb(UK)? Flexibility important.
[Diagram: Tier 0 at CERN; Tier 1 centres at RAL, INFN, IN2P3, etc.; Tier 2 centres such as Liverpool, Glasgow, Edinburgh; then department level and desktop users.]
Grid Issues
• Starting to be asked for estimates of LHCb resources (central storage, etc.) and Grid requirements for applications and testbeds.
• Useful to have an LHCb(UK) forum for discussion & feedback to define a model for all UK institutes, not just RAL.
• Any documentation (including this talk) on computing/software/Grid at http://hepwww.rl.ac.uk/lhcb/computing/comphome.html