User Board or User Bored? Glenn Patrick GridPP19, 29 August 2007
Grids need users – WLCG, EGEE, OSG, NGS, GridPP all depend on the USER.
Users need a Board? Not quite – more of a User Forum.
Board needs a Chair
2001-2003: LHCb
2003-2004: BaBar
2004-2005: Other Experiments
2005-2006: CMS
2006- : LHCb
Tier 1 Fairshares – last 12 months: Requested vs Reality (shares for LHCb, CMS, BaBar and ATLAS shown as charts).
Tier 1 CPU – 2007 Profile ("Napes Needle"): also an underlying profile for the LHC ramp-up, 2007 through 2011 and SuperLHC (chart).
2007 at the Tier 1
CPU: Largely met demand through the first half of the year. Resources start to become over-allocated in September, when ATLAS and CMS activities are expected to peak. An increasing shortfall of CPU across Q4 is predicted, largely due to the LHC experiments ramping up in preparation for 2008.
DISK: After the difficulties of 2006, the disk situation through the first 6 months of 2007 has been good, with sufficient headroom to provide experiments with the requested capacity as well as extra resources to assist with the testing of Castor and the migration from dCache. There is still headroom through Q3, but Q4 will be challenging and experiments will probably have to wait for the new disk deployment in January 2008.
TAPE: As always with tape (and the vagaries of Castor repacking, etc.), it is difficult to be certain of the physical headroom. Estimates of the unwanted tape storage of the disk1tape0 storage class have had to be included for Castor (hopefully solved when v2.1.4 is deployed). Some allowance has also been made for the migration from dCache to Castor.
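To make the over-allocation point concrete, here is a minimal sketch of the kind of quarter-by-quarter headroom check behind these statements. All numbers are hypothetical placeholders, not the actual 2007 Tier 1 figures.

```python
# Minimal sketch of a quarterly headroom check (all numbers hypothetical,
# not the actual 2007 Tier 1 allocations).

capacity = {"Q1": 1000, "Q2": 1000, "Q3": 1200, "Q4": 1200}  # installed CPU, arbitrary units

# Hypothetical per-experiment requests, ramping up towards 2008 data taking.
requests = {
    "Q1": {"ATLAS": 250, "CMS": 250, "LHCb": 150, "BaBar": 200},
    "Q2": {"ATLAS": 300, "CMS": 280, "LHCb": 170, "BaBar": 200},
    "Q3": {"ATLAS": 400, "CMS": 380, "LHCb": 200, "BaBar": 180},
    "Q4": {"ATLAS": 550, "CMS": 520, "LHCb": 230, "BaBar": 160},
}

for quarter, cap in capacity.items():
    requested = sum(requests[quarter].values())
    headroom = cap - requested
    status = "OK" if headroom >= 0 else "OVER-ALLOCATED"
    print(f"{quarter}: capacity={cap}, requested={requested}, headroom={headroom} ({status})")
```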
LHC Schedule – End/Start in Sight! We are here: 8 months to May 2008 (general schedule, baseline rev. 4.0). Remaining sector activities include: inner triplet repairs & interconnections, interconnection of the continuous cryostat, global pressure test & consolidation, flushing, leak tests of the last sub-sectors, warm-up, cool-down and powering tests.
2008 Planning Underway (charts showing growth factors of x2.7, x3.3 and x2.8 over 2007).
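As a back-of-the-envelope illustration only: assuming the quoted factors are year-on-year growth in requested capacity and that they map onto CPU, disk and tape (the slide does not say which resource each factor refers to, so that mapping and the 2007 baselines below are assumptions), the scaling looks like:

```python
# Back-of-the-envelope 2008 scaling sketch. The 2007 baselines are hypothetical
# placeholders; only the growth factors come from the slide, and which resource
# each factor belongs to is an assumption made here for illustration.

baseline_2007 = {"CPU": 1.0, "Disk": 1.0, "Tape": 1.0}   # normalised 2007 capacity
growth_factor = {"CPU": 2.7, "Disk": 3.3, "Tape": 2.8}   # factors quoted on the slide

for resource, base in baseline_2007.items():
    print(f"{resource}: 2007 = {base:.1f}, 2008 request ~ {base * growth_factor[resource]:.1f}")
```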
dCache – Castor2 Migration
The migration to Castor continues to be a challenge! At the UB meeting on 20 June it was agreed that six months' notice would be given for dCache termination. Experiments have to fund storage costs past March 2008 for the ADS/vtp tape service.
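For orientation, a hedged sketch of the bookkeeping such a migration needs. Nothing here is the actual RAL tooling; the copy command is left as a parameter precisely because the real transfer mechanism (SRM copy, FTS channel, experiment framework) is site-specific.

```python
import subprocess

# Hypothetical migration driver: walk a list of files known to live on dCache,
# copy each one to the Castor endpoint, and record successes so the migration
# can be resumed after failures. The copy command is a placeholder, not RAL's tool.

def migrate(files, copy_cmd_template, done_log="migrated.log"):
    migrated = set()
    try:
        with open(done_log) as f:
            migrated = {line.strip() for line in f}
    except FileNotFoundError:
        pass  # first run, nothing migrated yet

    with open(done_log, "a") as log:
        for src, dst in files:
            if src in migrated:
                continue  # already copied in a previous run
            cmd = copy_cmd_template.format(src=src, dst=dst)
            result = subprocess.run(cmd, shell=True)
            if result.returncode == 0:
                log.write(src + "\n")
            else:
                print(f"FAILED: {src}")

# Example (hypothetical endpoints and command):
# migrate([("srm://dcache.example/pnfs/f1", "srm://castor.example/castor/f1")],
#         "my-srm-copy {src} {dst}")
```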
Castor Progress at RAL
• Separate instances for the LHC experiments:
  • ATLAS instance – version 2.1.3 in production.
  • CMS instance – version 2.1.3 in production.
  • LHCb instance – version 2.1.3 in testing.
• Issues
  • Strategy meetings and weekly experiment technical meetings have helped a lot with progress. Current issues:
    • Tape migration rates.
    • Monitoring at RAL.
    • SRM development (v2.2 timescale).
    • disk1tape0 capability.
    • Repack.
• But the upcoming data challenges of CMS (CSA07) and ATLAS (M4, FDR), plus LHCb and ALICE, will be the real test of Castor.
LHC Computing Model 2001 – "The LHC Computing Centre" (les.robertson@cern.ch): diagram of CERN at the centre, national Tier 1s (UK, USA, France, Italy, Germany, …), regional Tier 2 groups of labs and universities, Tier 3 physics departments, and desktops.
Grid Only Tier 1
• After several discussions, non-Grid access to the Tier 1 is scheduled to finish at the end of 2007, apart from a few exceptions.
• Use cases for the exceptions are being identified:
  • Debugging production jobs.
  • Maintaining the experiment environment.
  • Start-up of new experiments.
  • etc.
• Limited User Interface service. Important to retain functionality and flexibility (see the sketch below).
• Implications for the RAL AFS service (cell). This is bad news for BaBar, who rely on AFS for software distribution.
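For orientation, a minimal sketch of what "Grid-only" access looks like from a User Interface node: work is wrapped in a JDL description and submitted, rather than logging in to the batch farm directly. The JDL content below is a generic gLite-era example, not an RAL-specific recipe, and the exact submission command depends on the UI version installed.

```python
# Minimal sketch: write a generic gLite-era JDL description for a Grid job.
# The job itself (/bin/hostname) is only an illustration; submission would use
# the UI's job-submit command (e.g. glite-wms-job-submit -a job.jdl on UIs of
# that era, or an experiment-specific wrapper) - check your UI version.

jdl = """\
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
"""

with open("job.jdl", "w") as f:
    f.write(jdl)

print("Wrote job.jdl; submit it with your UI's job-submit command.")
```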
SL4 and 64 Bit Migration
Migration to SL4 has been discussed several times. A new SL4 CE with 20% of the batch capacity was commissioned at the Tier 1 during the first week of August. Generally not an issue for the experiments, but there have been some teething problems (e.g. LHCb and CMS).
Experiment attitudes towards true 64 bit applications (as opposed to 32 bit applications running in compatibility mode) were surveyed:
• ATLAS – not important at the moment.
• LHCb – can test, but what about middleware?
• BaBar/MINOS – no immediate interest.
• MICE – plan to move to 64 bit computing as soon as the underlying computing resources become available.
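As a small illustration of the "32 bit in compatibility mode" distinction, a quick runtime check (plain Python, no site-specific assumptions) is:

```python
import platform
import sys

# Quick check of what an application actually runs as: a 32-bit build running
# in compatibility mode reports a 32-bit binary even on 64-bit hardware.

print("Machine architecture :", platform.machine())          # hardware, e.g. x86_64
print("Interpreter build    :", platform.architecture()[0])  # '32bit' or '64bit' binary
print("64-bit pointers      :", sys.maxsize > 2**32)
```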
Real LHC Data now on the Horizon
Already here in the case of ATLAS cosmics! Need to be prepared for increasing luminosity (10³² cm⁻²s⁻¹ → 10³³ → 10³⁴) and for surprises (good and bad): variable backgrounds, changing beam energies. Important to handle issues as they arise (and, if possible, anticipate them). More direct communication is needed in addition to the quarterly UB meetings. (CMS figure)
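To illustrate why these luminosity steps matter for computing, a standard back-of-the-envelope interaction-rate estimate (the cross-section value is the usual approximate figure, not taken from the slide):

```latex
% Interaction rate R = L * sigma_inel, with sigma_inel ~ 100 mb = 10^{-25} cm^2
R \;=\; \mathcal{L}\,\sigma_{\text{inel}}
  \;\approx\; 10^{33}\,\text{cm}^{-2}\text{s}^{-1} \times 10^{-25}\,\text{cm}^{2}
  \;=\; 10^{8}\ \text{interactions/s},
\qquad
R \;\approx\; 10^{9}\ \text{s}^{-1}\ \text{at}\ \mathcal{L}=10^{34}\,\text{cm}^{-2}\text{s}^{-1}.
```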
User Board Future
Interaction between the UB and the experiments needs to evolve as the experiments themselves evolve. Suggestions on this are welcome.
The End (and The Start): GridPP2 → GridPP2+ → GridPP3