GridPP2: Application Requirements & Developments Nick Brook University of Bristol • ALICE • Hardware Projections • Applications Programme Oxford eSc – 1st July’03
Hardware • LHC experimental numbers based on ongoing re-assessment exercise • Computing Technical Design Reports due in 2005 • Expts will be using the LCG “system” • Hardware and chosen middleware • Software tools – e.g. POOL for persistency • Numbers include Tier-2 needs • Non-LHC experiments also gave an estimated forward look • Based on MC production & analysis • Expts are expecting a single, integrated Tier-1 centre
Hardware • Expts are expecting a single, integrated Tier-1 centre • Short term, LHC expts expect some form of centralised planning via the LCG project • Project Execution Board • Grid Deployment Board • GridPP participation in LCG bodies • GridPP will continue with annual h/w review • CPU vs Disk
Ongoing Activities • Example: LHCb Data Challenge – >40M events – 170 years on a 1 GHz PC • ~1/3 of events produced in the UK
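The Data Challenge figures above can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, using only the numbers quoted on the slide (>40M events, ~170 years on a 1 GHz PC, ~1/3 in the UK); the derived per-event time is illustrative, not an official LHCb figure:

```python
# Back-of-envelope check of the LHCb Data Challenge figures.
# Inputs are taken from the slide; the per-event cost is derived, not quoted.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

events = 40e6      # total simulated events (lower bound from the slide)
cpu_years = 170    # equivalent running time on a single 1 GHz PC

seconds_per_event = cpu_years * SECONDS_PER_YEAR / events
uk_events = events / 3   # ~1/3 of events produced in the UK

print(f"~{seconds_per_event:.0f} s per event on a 1 GHz PC")
print(f"~{uk_events / 1e6:.0f}M events produced in the UK")
```

So each event costs a little over two minutes of 1 GHz CPU time, which is why the production had to be spread across Grid resources rather than a single farm.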
Ongoing Activities • Current usage of the Tier-1/A centre dominated by BaBar – 60% of CPU, 90% of disk
Networking • Bandwidth dominated by replication in the analysis stage of data processing • Use of tools, such as OptorSim, to understand networking • LHC expts need to understand computing & analysis models • Early estimates suggest a factor of 5 increase • Current problems with MC production & bulk transfer • Unrelated to SuperJANET • Often attributable to links into the MAN
CPU estimates • CPU resource reqts are equivalent to 14k 2.4GHz dual processors running continuously • LHC expts: ~65% of the need in 2004, >80% in 2007
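The "14k 2.4GHz dual processors" unit above can be translated into aggregate capacity. A minimal sketch; the conversion to processor-years and CPU-hours is an illustrative assumption on my part, not an official GridPP figure:

```python
# Rough translation of the quoted CPU requirement into aggregate capacity.
# "14k dual processors running continuously" is from the slide; everything
# derived below is illustrative arithmetic only.

boxes = 14_000       # dual-processor 2.4 GHz machines
cpus_per_box = 2
hours_per_year = 365.25 * 24

processor_years = boxes * cpus_per_box        # 2.4 GHz processor-years per calendar year
cpu_hours = processor_years * hours_per_year  # aggregate CPU-hours per year

print(f"{processor_years} x 2.4 GHz processor-years per year")
print(f"~{cpu_hours / 1e6:.0f}M CPU-hours per year")
```

At ~65% of this in 2004 rising to >80% in 2007, the LHC experiments dominate the CPU budget throughout the GridPP2 period.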
Disk Requirements • 60% of disk reqts in 2004 for LHC expts • 70% of disk storage in 2007 for LHC expts • Non-LHC expts still taking data – need disk for finishing analyses
Tape Requirements • Tape usage completely dominated by LHC usage – 90% • Large level of uncertainty • 2007: ATLAS (850TB) vs CMS (1150TB)
Needs in 2004
Application Development • Building on current collaborative activity • GANGA: ATLAS & LHCb • SAM: CDF & DØ • BaBar: adoption of EDG s/w • Prototyping • Production environment – Monte Carlo production activity • Analysis environment • Grid technologies becoming more widely accepted across the HEP community • “old” experiments – UKDMC, ZEUS, … • “new” activities – LCFI, MICE, …
Application Development • Similar pattern of needs emerges from all experiments (not too surprisingly!) • Storage & location of data • Replication issues • Monte Carlo production tools • Seen as an obvious area for “efficiency” savings • Analysis interfaces • Intelligent bookkeeping of a user’s analysis activities • Persistency solutions • Composite objects spread across several storage systems
New Experiments
LHC experiments
Non-LHC expts
Application Call • Essential to continue developing the application interface through GridPP2 • Expand activity to allow currently non-GridPP-supported expts to participate • Benefit from LCG developments • Call for application posts – January’04 • Response by April • Reviewed à la GridPP (“Williams” committee) • Expt activity in UK • Science output • Track record in Grid activity