UK Tier 1 Centre
Glenn Patrick
LHCb Software Week, 28 April 2006
UK Tier 1
Rutherford Appleton Laboratory
[Site view: RAL is also home to Diamond, ISIS and the Particle Physics Department.]
UK Tier 1 Exploitation
[Charts: CPU use by experiment for 2004 and 2005 – dominated by LHCb, ATLAS and BaBar.]
Largest GridPP users by VO for 2005: BaBar, ATLAS, BIOMED, CMS, ZEUS, DZERO, LHCb.
• LHCb Tier 1 = 505,921 KSI2K hours
• LHCb Tier 2 = 983,050 KSI2K hours
• Now 23 approved VOs.
NB: Excludes data from Cambridge – for Condor support in APEL, see Dave Kant's talk.
UK Tier 1 CPU
• Current capacity 830 KSI2K. Some units are now 4 years old.
• An extra 266 KSI2K was delivered on 10 March and should be available at the start of May, after a 4-week load test.
• Twin dual-core Opteron 270s: 1 GB RAM/core, 250 GB HDD.
• Total CPU capacity = 1,096 KSI2K (see the sketch below).
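The total on this slide is just the existing capacity plus the newly delivered units; a minimal bookkeeping sketch in Python (figures from the slide, variable names illustrative):

```python
# Capacity bookkeeping for the UK Tier 1 CPU farm (figures from the slides).
current_ksi2k = 830      # capacity in KSI2K before the March 2006 delivery
delivered_ksi2k = 266    # twin dual-core Opteron 270 units delivered 10 March

total_ksi2k = current_ksi2k + delivered_ksi2k
print(f"Total CPU capacity: {total_ksi2k:,} KSI2K")  # -> 1,096 KSI2K
```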
UK Tier 1 Disk
• Current capacity 177 TB.
• An extra 168 TB and 21 servers were delivered on 10 March.
• Some teething problems; a fix has been generated and tested. Installation should hopefully resume this week, with a target of end May.
• Total disk storage ~308 TB after retirements (177 + 168 = 345 TB, implying roughly 37 TB of older capacity retired).
New SL8500 Tape Robot
• New STK SL8500 tape robot, replacing the STK Powderhorn 9310 robot (single arm, 9940 drives).
• Funded by CCLRC; entered service 28 March 2006.
• 6,000 slots; upgrade to 10,000 slots later in the year, giving a capacity of 5 PB.
• 8 mini-robots mounting tapes – faster and more resilient.
• T10000 tape drives (CASTOR only, not ADS at the moment).
• Tape capacity: 318 TB (Feb) → 336 TB → 446 TB (extra T10K media).
CASTOR2 Deployment
• Mar–Apr: Testing – functionality, interoperability and database stressing.
• May–Sep: Specify and deploy hardware for the production database.
• May: Internal throughput testing with Tier 1 disk servers.
• Jun: CERN Service Challenge throughput testing.
• Jul–Sep: Create full production infrastructure; full deployment on Tier 1.
• Sep–Nov: Specify and deploy second-phase production hardware to provide the full required capacity.
• Apr 07: Startup of LHC using CASTOR at RAL.
dCache remains available until early 2007.
Tier 1 Services
LHCb VO Box available since January 2006:
• lcgvo0339.gridpp.rl.ac.uk
• Configuration/service certificate by Raja et al.
LHCb DC06-02 requires a database service supporting COOL and 3D in October. New hardware ordered:
• 4 servers – dual AMD Opteron 250 (for ATLAS and LHCb)
• 3.5 TB storage array shared between both.
Ref: Database workshop, RAL, 23 March.
UK CPU Allocations
[Chart: CPU allocations by experiment. CPU is overallocated overall, so requests are scaled by a factor of 0.81; some experiments remain underallocated.]
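The 0.81 factor reads as the usual proportional rescaling applied when the sum of requests exceeds capacity. A minimal sketch of that calculation; only the mechanism and the 0.81 figure come from the slide, the per-experiment requests below are purely hypothetical:

```python
# Proportional rescaling of CPU requests when the farm is overallocated.
# Scale factor = capacity / total requests; 0.81 is the figure quoted
# on the slide. The per-experiment requests here are illustrative only.
capacity_ksi2k = 1096                       # total farm capacity (KSI2K)
requests = {"LHCb": 270, "ATLAS": 500, "CMS": 300, "BaBar": 283}  # hypothetical

total_request = sum(requests.values())
scale = capacity_ksi2k / total_request      # ~0.81 for these numbers
allocations = {vo: round(req * scale) for vo, req in requests.items()}

print(f"scale factor = {scale:.2f}")
print(allocations)
```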
T1 Experiment Shares
[Chart: Tier 1 experiment shares for LHCb, CMS, BaBar and ATLAS.]
UK Disk Allocations
• 32.2 TB = 2.2 TB stripped DST (DC06) + 30 TB additional signal production.
• Deployment of new disk? If this fails, we are encouraged by GridPP to use Tier 2 disk!
UK Tape Allocations
[Tables of tape allocations by experiment – totals across all experiments: 372 TB and 623 TB.]
UK Tier 1 Status
Total available (April 2006):
• CPU = 830 KSI2K (500 dual-CPU nodes)
• Disk = 177 TB (60 servers)
• Tape = 318 TB
LHCb(UK) Tier 1 (April 2006):
• CPU = 150 KSI2K
• Disk = 10 TB
• Tape = 20 TB
Total available (later in 2006):
• CPU = 1,096 KSI2K
• Disk = 308 TB
• Tape = 446 TB
LHCb Tier 1 2006 (1/6 share):
• CPU = 222 KSI2K
• Disk = 122 TB
• Tape = 103 TB
The Future!
• GridPP1 – Prototype Grid. £17M, complete. September 2001 – August 2004.
• GridPP2 – Production Grid. £16M, ongoing. September 2004 – August 2007.
• GridPP3 – "Exploitation" Grid? 2007 – 2011. PPARC has just issued the call for a bid covering this period, to be submitted by 13 July. LHCb input is required now.