LCG Applications Area
Torre Wenaus, BNL/CERN
LCG Applications Area Manager
http://cern.ch/lcg/peb/applications
DOE/NSF Review of US LHC Physics and Computing Projects, January 14, 2003
The LHC Computing Grid Project Structure
[Organization diagram] Project Overview Board; Software and Computing Committee (SC2); Project Leader; Project Execution Board (PEB); RTAGs; Grid projects; project work packages (WPs). The SC2 provides requirements, the work plan, and monitoring.
LCG Areas of Work
Physics Applications Software
• Application Software Infrastructure – libraries, tools
• Object persistency, data management tools
• Common Frameworks – Simulation, Analysis, ...
• Adaptation of Physics Applications to Grid environment
• Grid tools, Portals
Grid Deployment
• Data Challenges
• Grid Operations
• Network Planning
• Regional Centre Coordination
• Security & access policy
Fabric (Computing System)
• Physics Data Management
• Fabric Management
• Physics Data Storage
• LAN Management
• Wide-area Networking
• Security
• Internet Services
Grid Technology
• Grid middleware
• Standard application services layer
• Inter-project coherence/compatibility
Applications Area Organization
[Organization diagram] Apps Area Leader and Architects Forum provide overall management, coordination and architecture; Project Leaders head the projects and Work Package Leaders head the work packages (WPs). Direct technical collaboration between experiment participants, IT, EP, ROOT, and LCG personnel.
Focus on Experiment Need
• Project structured and managed to ensure a focus on real experiment needs
• SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves
• Architects Forum to involve experiment architects in day-to-day project management and execution
• Openness of information flow and decision making
• Direct participation of experiment developers in the projects
• Tight, iterative feedback loop to gather user feedback from frequent releases
• Early deployment and evaluation of LCG software in experiment contexts
• Success defined by experiment adoption and production deployment
Applications Area Projects
• Software Process and Infrastructure (SPI) (operating – A. Aimar)
  – Librarian, QA, testing, developer tools, documentation, training, ...
• Persistency Framework (POOL) (operating – D. Duellmann)
  – POOL hybrid ROOT/relational data store
• Mathematical Libraries (operating – F. James)
  – Math and statistics libraries; GSL etc. as a NAG C replacement
  – A group in India will work on this (workplan in development)
• Core Tools and Services (SEAL) (operating – P. Mato)
  – Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
• Physics Interfaces (PI) (launched – V. Innocente)
  – Interfaces and tools by which physicists directly use the software: interactive (distributed) analysis, visualization, grid portals
• Simulation (launch planning in progress)
  – Geant4, FLUKA, simulation framework, geometry model, ...
• Generator Services (launch as part of simulation)
  – Generator librarian, support, tool development
Bold on the slide: recent developments (last 3 months)
Project Relationships
[Diagram] The LCG Applications Area projects – Persistency (POOL), Physicist Interface (PI), Math Libraries, Software Process & Infrastructure (SPI), Core Libraries & Services (SEAL), ... – serve the LHC experiments and connect to other LCG projects in the other areas.
Candidate RTAG Timeline from March
[Timeline chart] Blue: RTAG/activity launched; light blue: imminent.
LCG Applications Area Timeline Highlights
[Timeline chart, 2002–2005 by quarter] LCG launch week; POOL V0.1 internal release; architectural blueprint complete; hybrid event store available for general users; distributed production using grid services; distributed end-user interactive analysis; full Persistency Framework; First Global Grid Service (LCG-1) available; LCG-1 reliability and performance targets; LCG TDR; LCG "50% prototype" (LCG-3).
Architecture Blueprint
• RTAG established in June; after 14 meetings and much email, a 36-page final report
• Accepted by SC2 on October 11: http://lcgapp.cern.ch/project/blueprint/
Report contents:
• Executive summary
• Response of the RTAG to the mandate
• Blueprint scope
• Requirements
• Use of ROOT
• Blueprint architecture design precepts
  – High-level architectural issues, approaches
• Blueprint architectural elements
  – Specific architectural elements, suggested patterns, examples
• Domain decomposition
• Schedule and resources
• Recommendations
Component Model
• Granularity driven by component replacement criteria; development team organization; dependency minimization
• Communication via public interfaces
• Plug-ins (sketched below)
  – Logical module encapsulating a service that can be loaded, activated and unloaded at run time
• APIs targeted not only to end-users but to embedding frameworks and internal plug-ins
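To make the plug-in idea concrete, here is a minimal C++ sketch (modern C++, purely illustrative): a component is reached only through its public interface and is created through a toy plug-in manager, so the implementation can be replaced at run time. The names IMessageService, ConsoleMessageService and PluginManager are invented for this example and are not the SEAL API.

```cpp
// Hypothetical illustration of the blueprint's component/plug-in pattern;
// all names are invented and this is not the SEAL plug-in manager API.
#include <iostream>
#include <map>
#include <memory>
#include <string>

// Public interface through which clients use the component.
class IMessageService {
public:
    virtual ~IMessageService() = default;
    virtual void report(const std::string& msg) = 0;
};

// A concrete plug-in implementing the interface; in practice it could live
// in a shared library loaded at run time.
class ConsoleMessageService : public IMessageService {
public:
    void report(const std::string& msg) override { std::cout << msg << '\n'; }
};

// Toy plug-in manager: factories are registered by name and components are
// created on demand, so an implementation can be replaced without touching
// client code that only sees IMessageService.
class PluginManager {
public:
    using Factory = std::unique_ptr<IMessageService> (*)();
    void registerFactory(const std::string& name, Factory f) { factories_[name] = f; }
    std::unique_ptr<IMessageService> create(const std::string& name) const {
        auto it = factories_.find(name);
        return it != factories_.end() ? it->second() : nullptr;
    }
private:
    std::map<std::string, Factory> factories_;
};

int main() {
    PluginManager pm;
    pm.registerFactory("console", [] {
        return std::unique_ptr<IMessageService>(new ConsoleMessageService);
    });
    auto svc = pm.create("console");      // "load and activate" the plug-in
    if (svc) svc->report("hello from a plug-in");
}
```

In a real system the factory would be discovered in a shared library at run time rather than registered by hand in main(), but the replaceability argument is the same.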
Software Structure
[Layer diagram] Applications sit on top of the reconstruction, visualization, simulation and other frameworks; these rest on a basic framework of implementation-neutral services; below that are the foundation libraries (STL, ROOT libs, CLHEP, Boost, ...) and optional libraries (ROOT, Qt, grid middleware, ...).
Distributed Operation
• Architecture should enable but not require the use of distributed resources via the Grid
• Configuration and control of Grid-based operation via dedicated services
  – Making use of optional grid middleware services at the foundation level of the software structure
  – Insulating higher-level software from the middleware
  – Supporting replaceability
• Apart from these services, Grid-based operation should be largely transparent
• Services should gracefully adapt to 'unplugged' environments (sketched below)
  – Transition to 'local operation' modes, or fail informatively
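A hedged sketch of the 'graceful adaptation' idea: a file lookup that prefers a grid replica catalog but falls back to a local catalog, reporting why, when the middleware is unreachable. All class and function names here are hypothetical, not actual LCG, POOL or EDG interfaces.

```cpp
// Hypothetical sketch of a service that degrades gracefully when the Grid is
// unreachable; names are illustrative, not actual LCG/POOL/EDG interfaces.
#include <iostream>
#include <memory>
#include <stdexcept>
#include <string>

class IFileCatalog {
public:
    virtual ~IFileCatalog() = default;
    virtual std::string lookup(const std::string& logicalName) = 0;
};

class GridReplicaCatalog : public IFileCatalog {
public:
    std::string lookup(const std::string& lfn) override {
        // A real implementation would contact the replica catalog service here.
        throw std::runtime_error("grid middleware unreachable for " + lfn);
    }
};

class LocalXmlCatalog : public IFileCatalog {
public:
    std::string lookup(const std::string& lfn) override {
        return "file:/local/data/" + lfn;   // resolve against a local catalog
    }
};

// Try the grid-based catalog first; fall back to local operation and say why.
std::string resolve(const std::string& lfn) {
    try {
        GridReplicaCatalog grid;
        return grid.lookup(lfn);
    } catch (const std::exception& e) {
        std::cerr << "falling back to local catalog: " << e.what() << '\n';
        LocalXmlCatalog local;
        return local.lookup(lfn);
    }
}

int main() {
    std::cout << resolve("run1234.events.root") << '\n';
}
```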
Managing Objects
• Object Dictionary
  – To query a class about its internal structure (see the sketch below)
  – Essential for persistency, data browsing, etc.
  – The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI); timescale ~1 year?
  – Will contact Stroustrup, who has started an implementation
• Object Whiteboard
  – Uniform access to application-defined transient objects, including in the ROOT environment
  – What this will be (how similar to Gaudi, StoreGate?) is not yet defined
• Object definition based on C++ header files
  – Now that ATLAS as well as CMS will use this approach, it is being addressed in a common way via the LCG AA
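As a toy illustration of what an object dictionary provides (not the LCG or ROOT dictionary API), the sketch below hand-writes the kind of class description that would in practice be generated from C++ header files, and shows how a client can query a class about its data members at run time.

```cpp
// Toy illustration of what an object dictionary offers: asking a class about
// its members at run time. Purely hypothetical; not the LCG or ROOT API.
#include <iostream>
#include <string>
#include <vector>

struct MemberInfo { std::string type; std::string name; };

struct ClassInfo {
    std::string name;
    std::vector<MemberInfo> members;
};

// In the real system this description would be generated from C++ header
// files (the approach ATLAS and CMS agreed on) rather than written by hand.
ClassInfo describeTrack() {
    return { "Track", { {"double", "pt"}, {"double", "eta"}, {"int", "charge"} } };
}

int main() {
    ClassInfo c = describeTrack();
    std::cout << "class " << c.name << " has " << c.members.size() << " data members:\n";
    for (const auto& m : c.members)
        std::cout << "  " << m.type << ' ' << m.name << '\n';
    // A persistency service could walk such a description to stream any object,
    // and a browser could use it to display object contents generically.
}
```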
Dictionary: Reflection / Population / Conversion
[Diagram] Components labelled as new in POOL 0.3 or in progress.
Other Architectural Elements
• Python-based component bus (see the sketch below)
  – Plug-in integration of components providing a wide variety of functionality
  – Component interfaces to the bus derived from their C++ interfaces
• Scripting languages
  – Python and CINT (ROOT) to both be available
  – Access to objects via the object whiteboard in these environments
• Interface to the Grid
  – Must support convenient, efficient configuration of computing elements with all needed components
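One way to picture "component interfaces to the bus derived from their C++ interfaces" is the sketch below, which embeds the CPython interpreter in a C++ program and exposes a C++ function to Python scripts. The module name "detector", the function, and the returned value are all invented; the real LCG binding was expected to be generated from the object dictionary rather than hand-written like this.

```cpp
// Illustrative sketch: expose a C++ function to Python using the standard
// CPython C API. Module and function names are invented for this example.
#include <Python.h>

// A C++ function made callable from Python scripts.
static PyObject* py_magnetic_field(PyObject*, PyObject*) {
    return PyFloat_FromDouble(3.8);   // invented value, for illustration only
}

static PyMethodDef detectorMethods[] = {
    {"magnetic_field", py_magnetic_field, METH_NOARGS, "return the field value"},
    {nullptr, nullptr, 0, nullptr}
};

static PyModuleDef detectorModule = {
    PyModuleDef_HEAD_INIT, "detector", nullptr, -1, detectorMethods
};

static PyObject* initDetector() { return PyModule_Create(&detectorModule); }

int main() {
    // Register the built-in module, start the interpreter, run a small script.
    PyImport_AppendInittab("detector", initDetector);
    Py_Initialize();
    PyRun_SimpleString("import detector\nprint(detector.magnetic_field())");
    Py_Finalize();
}
```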
Domain Decomposition
[Diagram] Products mentioned are examples, not a comprehensive list. Grey: not in common project scope (also the event processing framework and TDAQ).
Use of ROOT in LCG Software
• Among the LHC experiments
  – ALICE has based its applications directly on ROOT
  – The other three base their applications on components with implementation-independent interfaces, and look for software that can be encapsulated into these components
• All experiments agree that ROOT is an important element of LHC software
  – Leverage existing software effectively and do not unnecessarily reinvent wheels
• The blueprint therefore establishes a user/provider relationship between the LCG applications area and ROOT
  – LCG AA software will make use of ROOT as an external product
  – Draws on a great ROOT strength: users are listened to very carefully!
• So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL
Blueprint RTAG Outcomes
• SC2 decided in October that:
  – The blueprint is accepted
  – The RTAG recommendations are accepted, namely to
    – Start a common project on core tools and services
    – Start a common project on physics interfaces
Applications Area Personnel Status
• 18 LCG apps hires in place and working; two more arrive in January and February
• Manpower ramp is on target (expected to reach 20–23)
  – Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US
• ~10 FTEs from IT (DB and API groups) also participating
• ~8 FTEs from experiments (CERN EP and outside CERN) also participating, mainly in POOL, SEAL, SPI
• CERN established a new software group as the EP home of the LCG applications area (EP/SFT)
  – Led by John Harvey; taking shape well; localized in Building 32
• The fraction of the experiment contribution that is US-supported (CERN or US resident) is currently ~30%
• The US fraction of the total effort is <10%
LHC Manpower Needs for Core Software
[Table] From the LHC Computing ('Hoffmann') Review; figures in FTEs; only computing professionals counted.
Personnel Resources – Required and Available
[Chart] Estimate of required effort in FTEs per quarter (Sep 2002 – Mar 2005), broken down by project: SPI, Math Libraries, Physics Interfaces, Generator Services, Simulation, Core Tools & Services, POOL. Blue: available effort.
Available FTEs today: 18 LCG, 10 CERN IT, 8 CERN EP + experiments. Future estimate: 20–23 LCG, 13 IT, 28 EP + experiments.
U.S. Leadership
• Direct leadership and financial contribution: T. Wenaus as AA manager
  – In addition to contributions via ATLAS and CMS
  – A 0.75 FTE job requiring CERN residence
  – Salary support from the BNL base program (is this fair?)
  – CERN residency and US travel costs borne by CERN
• Together with the strong U.S. presence in CMS and ATLAS computing leadership, this role gives the U.S. a strong voice in the LCG applications area
  – Not a dominating influence, of course; e.g. at this point all the applications area project leaders are Europeans
• Presence at CERN is very important, like it or not
  – Its importance is increased by the utterly deplorable state of the CERN infrastructure for both audio and video conferencing
  – The U.S. should put up the money to fix this if no one else will; it is in our own vital interest
Apps Area Planning Materials
• Planning page linked from the applications area page
• Applications area plan spreadsheet: overall project plan
  – http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  – High-level schedule, personnel resource requirements
• Applications area plan document: overall project plan
  – http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  – Incomplete draft
• Personnel spreadsheet: currently active/foreseen apps area personnel and activities
  – http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
• WBS, milestones, assigned personnel resources
  – http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
• Follow the Applications Area planning link on the review web page
Core Libraries and Services (SEAL) Project
• Launched in October, led by Pere Mato (CERN/LHCb)
• Six-member (~3 FTE) team initially; M. Marino from ATLAS
• Scope:
  – Foundation, utility libraries
  – Basic framework services
  – Object dictionary
  – Grid-enabled services
• Many areas of immediate relevance to POOL; these are given priority
• Users of this project are software developers in the other projects and in the experiments
• Establishing the initial plan, reviewing existing libraries and services
  – The process for adopting third-party code will be addressed in this project
• Initial workplan will be presented to SC2 on January 10
• Milestone 2003/3/31: SEAL V1 essentials in alpha
SEAL Work Packages
• Foundation and utility libraries
  – Boost, CLHEP, ..., plus complementary in-house development
• Component model and plug-in manager
  – The core expression in code of the component architecture described in the blueprint; mainly in-house development
• LCG object dictionary
  – Already an active project in POOL; being moved to SEAL (wider scope than persistency). Will include filling the dictionary from C++ header files
• Basic framework services
  – Object whiteboard, message reporting, component configuration, 'event' management
• Scripting services
• Grid services: common interface to middleware
• Education and documentation
  – Assisting experiments with integration
Physicist Interface (PI) Project
• Led by Vincenzo Innocente (CERN/CMS)
• Covers the interfaces and tools by which physicists will directly use the software
• Planned scope:
  – Interactive environment: the physicist's desktop
  – Analysis tools
  – Visualization
  – Distributed analysis, grid portals
    – This area is currently very poorly defined and understood
• Currently surveying the experiments on their needs and interests
• Initially in more of an 'RTAG mode' than project mode, to flesh out plans and try to clarify the grid area
• Will present initial plans (and possibly an analysis RTAG proposal) to SC2 on January 29
Software Process and Infrastructure (SPI)
Components available:
• Code documentation, browsing: Doxygen, LXR, ViewCVS
• Testing framework: CppUnit, Oval
• Memory leak detection: Valgrind
• Automatic builds: probably the ATLAS system
• Coding and design guidelines: RuleChecker
• CVS organization
• Configuration/release management: SCRAM
• Software documentation templates
http://spi.cern.ch
A minimal CppUnit example is sketched below.
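For flavour, a minimal CppUnit test in the style SPI supports might look like the following; SimpleCounter is an invented class used only to have something to test, and build details (linking against the CppUnit library) are omitted.

```cpp
// Minimal CppUnit example; SimpleCounter is invented purely for illustration.
#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/extensions/TestFactoryRegistry.h>
#include <cppunit/ui/text/TestRunner.h>

class SimpleCounter {
public:
    void increment() { ++count_; }
    int value() const { return count_; }
private:
    int count_ = 0;
};

class SimpleCounterTest : public CppUnit::TestFixture {
    CPPUNIT_TEST_SUITE(SimpleCounterTest);
    CPPUNIT_TEST(testIncrement);
    CPPUNIT_TEST_SUITE_END();
public:
    void testIncrement() {
        SimpleCounter c;
        c.increment();
        c.increment();
        CPPUNIT_ASSERT_EQUAL(2, c.value());
    }
};

CPPUNIT_TEST_SUITE_REGISTRATION(SimpleCounterTest);

int main() {
    // Run every registered suite and report the results on the console.
    CppUnit::TextUi::TestRunner runner;
    runner.addTest(CppUnit::TestFactoryRegistry::getRegistry().makeTest());
    return runner.run() ? 0 : 1;
}
```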
SPI Services
• CVS repositories
  – One repository per project
  – Standard repository structure and #include conventions being finalized this week
  – Will eventually move to the IT CVS service when it is proven
• AFS delivery area, Software Library
  – /afs/cern.ch/sw/lcg
  – Installations of LCG-developed and external software
  – Installation kits for offsite installation
  – LCG Software Library 'toolsmith' started in December
• Build servers
  – Machines with various Linux and Solaris configurations available for use
• Project portal (similar to SourceForge): http://lcgappdev.cern.ch
  – Very nice new system using Savannah (savannah.gnu.org)
  – Used by CMS as well as LCG; ATLAS will probably be using it soon
  – Bug tracking, project news, FAQ, mailing lists, download area, CVS access, ...
POOL
• Pool of persistent objects for LHC, currently in prototype
  – Targeted at event data but not excluding other data
• Hybrid technology approach
  – Object-level data storage using a file-based object store (ROOT)
  – RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
• Leverages existing ROOT I/O technology and adds value
  – Transparent cross-file and cross-technology object navigation (see the sketch below)
  – RDBMS integration
  – Integration with Grid technology (e.g. the EDG/Globus replica catalog)
  – Network- and grid-decoupled working modes
• Follows and exemplifies the LCG blueprint approach
  – Components with well-defined responsibilities
  – Communicating via public component interfaces
  – Implementation-technology neutral
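To make the navigation idea concrete, the sketch below imagines writing an object through a persistency service and later resolving a technology-neutral token back to the object. It is loosely modelled on the POOL design, but the names (Token, ToyDataService, etc.) are illustrative and not the POOL 0.3 API.

```cpp
// Illustrative-only sketch of the hybrid-store idea: an object is written to a
// storage backend and later retrieved via a technology-neutral token. Loosely
// modelled on POOL concepts; none of these names are the actual POOL API.
#include <iostream>
#include <map>
#include <string>

struct Event {                  // an application data object
    int run;
    int number;
};

struct Token {                  // identifies an object independently of storage technology
    std::string database;       // e.g. a ROOT file registered in a file catalog
    std::string container;      // e.g. a tree or key inside that file
    int entry;
};

class ToyDataService {          // stands in for the persistency service
public:
    Token write(const std::string& db, const std::string& cont, const Event& e) {
        Token t{db, cont, static_cast<int>(store_.size())};
        store_[key(t)] = e;     // a real store would stream the object via the dictionary
        return t;
    }
    Event read(const Token& t) const { return store_.at(key(t)); }
private:
    static std::string key(const Token& t) {
        return t.database + "/" + t.container + "#" + std::to_string(t.entry);
    }
    std::map<std::string, Event> store_;
};

int main() {
    ToyDataService svc;
    Token ref = svc.write("events.root", "Events", Event{1234, 42});
    // ...later, possibly in another job, the token is resolved back to the object:
    Event e = svc.read(ref);
    std::cout << "run " << e.run << " event " << e.number << '\n';
}
```

In POOL itself the "database" and "container" would correspond to ROOT files and containers registered in the file catalog, and tokens would be stored wherever object references are needed, such as in collections.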
POOL Release Schedule
• End September - V0.1 (released October 2)
  – All core components for navigation exist and interoperate
  – Assumes ROOT objects (TObject) on read and write
• End October - V0.2 (released November 15)
  – First collection implementation
• End November - V0.3 (released December 18)
  – First public release
  – EDG/Globus FileCatalog integrated
  – Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  – Event metadata annotation and query
• June 2003 - Production release
Simulation Project
• Mandated by SC2 to initiate the simulation project following the RTAG
• Project being organized now
• Expected to cover:
  – Generic simulation framework
    – Multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average tracking, utilities
    – ALICE virtual MC as the starting point if it meets requirements
  – Geant4 development and integration
  – FLUKA (development and) integration
  – Physics validation
  – Simulation test and benchmark suite
  – Fast (shower) parameterisation
  – Generator services
Comment on the Grid Technology Area (GTA)
Quote from a slide of Les Robertson: "LCG expects to obtain Grid Technology from projects funded by national and regional e-science initiatives -- and from industry, concentrating ourselves on deploying a global grid service."
All true, but there is a real role for the GTA in LCG beyond deployment:
• Ensuring that the needed middleware is, or will be, there: tested, selected and of production grade
• A (re)organization is in progress to create an active GTA along these lines
• This matters for the Applications Area: AA distributed software will be robust and usable only if the grid middleware it uses is robust and usable
Concluding Remarks
• Essentially the full expected AA scope is covered by the anticipated activities of the projects now defined
• Manpower is in quite good shape
• Buy-in by the experiments, apart from ALICE, is good
  – Substantial direct participation in leadership, development, prompt testing and evaluation, and RTAGs
  – U.S. CMS is well represented because of its strong presence in computing management and in CERN-based personnel
  – U.S. ATLAS representation will improve with D. Quarrie's relocation to CERN as Software Leader; further increases in CERN presence are being sought
• Groups remote from CERN are contributing, but it isn't always easy
  – We have pushed to lower the barriers, but still it isn't easy
• The new CERN EP/SFT group is taking shape well as a CERN hub for applications area activities
• POOL and SPI are delivering, and the other projects are ramping up
  – The first persistency prototype was released in 2002, as targeted in March 2002