
CERN, evolution of resources


Presentation Transcript


  1. CERN, evolution of resources: >5500 visitors, 2750 staff, >1200/year turnover (3.99); excellent education. CERN-paid personnel (2750 staff, 800 fellows, paid associates) and unpaid visiting scientists, the Users. [Charts: age distribution, expenditure at 2000 prices, contributions] H. F. Hoffmann, CERN-DG/DI

  2. [Aerial views of the CERN site near the Geneva airport, in 1954 and in 2000, showing the PS and the LHC experiment sites ATLAS, CMS, ALICE and LHCb] H. F. Hoffmann, CERN-DG/DI

  3. Evolution of CERN Particle Accelerators • PS, 1959, ~1 GeV (cm) • ISR, 1971, ~9 GeV (cm) • SPS, 1976, ~4.5 GeV (cm) • SppbarS, 1981, ~90 GeV (cm) • LEP, 1989, 80-209(?) GeV (cm) • LHC, 2005, ~2'000 GeV (cm) ("cm": centre of mass of the partons) [photo: superconducting cavity] H. F. Hoffmann, CERN-DG/DI

  4. "Facilities" 81 cm Bubble Chamber PS 1962-68 Constructed in France SFM facility (magnet, vacuum chamber, SFMDetector) at the ISR, external detectors as specific triggers by external laboratories with help of CERN Big European Bubble Chamber PS and SPS 1974-1985 External muon identifier MWPC, 1968, SFMD: 300K wires 1974 DAQ, Trigger,large data rates H. F. Hoffmann, CERN-DG/DI

  5. Experimental Apparatus [BEBC photo] ν-beam, ~30 tracks, semi-automatic scanning, very sophisticated tracking and analysis codes --> computer literacy; resolution ~100 µm, <1 event/sec, 20 m³ of superheated liquid hydrogen, 1.5 Tesla s.c. magnet H. F. Hoffmann, CERN-DG/DI

  6. "Counter" Experiments M&G Fidecaro, 1964, spark chambers, --> N* experiment at ISR, 1972 Charm Search at ISR, 1975 ATLAS at LHC, 2005-2020 150*106 sensors; UA1 at SppbarS 1981-1989 ALEPH at LEP 1989-2000 H. F. Hoffmann, CERN-DG/DI

  7. H. F. Hoffmann, CERN-DG/DI

  8. Experimental Apparatus, continued • Pixel detector, 50x100 µm^2/pixel, 140 million channels • LHC collisions: 10^9 events/s • Complexity of data: ~250 SPECint95·s/event • 1/10^13 selectivity • General-purpose detector: multiple, simultaneous detection modes; OO programming, millions of lines of code • Basic building block of the full pixel readout chip: 8 pixels / 12 000 transistors in 400 by 425 µm^2 H. F. Hoffmann, CERN-DG/DI
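The orders of magnitude on this slide can be combined into a rough sketch of the offline problem. In the sketch below, the recording rate (~100 Hz) and the yearly running time (~10^7 s) are assumptions taken from the later on-line system and data grid slides, and reading the 1/10^13 selectivity as "rarest events among all collisions" is one possible interpretation, not a statement from the slide:

```python
# Back-of-the-envelope sketch; slide-8 figures plus two assumed inputs.
collision_rate_hz = 1e9          # LHC collisions per second (slide 8)
cpu_per_event = 250.0            # ~250 SPECint95*s to process one event (slide 8)
selectivity = 1e-13              # ~1 interesting event in 10^13 (slide 8)

recorded_rate_hz = 100.0         # assumed: Level-3 output rate (slide 21)
seconds_per_year = 1e7           # assumed: effective running time per year

# Sustained CPU capacity needed just to reconstruct what is recorded
cpu_needed = recorded_rate_hz * cpu_per_event
print(f"reconstruction alone needs ~{cpu_needed / 1e3:.0f}k SPECint95")

# How many of the rarest events survive the 1/10^13 selectivity in a year
golden_per_year = collision_rate_hz * selectivity * seconds_per_year
print(f"~{golden_per_year:.0f} selected 'golden' events per year")
```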

  9. ATLAS Collaboration • 1800 physicists, 150 institutes, 35 countries • R&D, proposal, design, reviews, approval: 1988-1996 • Construction, installation: 1996-2005; operation: 2005-2020 • Material cost 300 MEuro, CERN part: 20% H. F. Hoffmann, CERN-DG/DI

  10. CERN's Network in the World • 267 institutes in Europe, 4603 users • 208 institutes elsewhere, 1632 users [World map; some points = several institutes] H. F. Hoffmann, CERN-DG/DI

  11. Virtual Room Videoconferencing System • 3267 machines registered • 1994 people in 52 countries • 182 institutes • http://vrvs.cern.ch/ • Bandwidth >256 kbps --> >10 frames/sec H. F. Hoffmann, CERN-DG/DI

  12. Some typical features of such collaborations • Open, global collaboration of critical mass, able to deal with all problems posed, together with CERN and collaborating institutes • "Lean, bottom-up" self-organisation; success based on experienced collaborators, eager young people, common goals and competition • MoU: best intentions, but not legally binding • Free choice of collaborating institutes to participate, or not • Clear common long-term mission, clear objectives • Free exchange of ideas, technologies, R&D results • Often the best people in the field of interest • External peer reviews; elaborate internal reviews and QA • Good record of achievements in terms of delivery to specs, schedules, budgets H. F. Hoffmann, CERN-DG/DI

  13. Basic Organisation of any Physics / Scientific Experiment • Organigram • Hierarchy - Heterarchy [organisation diagram] H. F. Hoffmann, CERN-DG/DI

  14. ATLAS ORGANIZATION [Organisation chart] Plenary Meeting; Collaboration Board (Chair: J. D. Dowell, Deputy: M. Cavalli-Sforza); Resource Review Board; CB Chair Advisory Group; Spokesperson (P. Jenni, Deputy: T. Akesson); Resource Co-ordinator (P. Schmid); Technical Co-ordinator (M. Nessi); Executive Board: project leaders of the Inner Detector, Magnet, Trigger, Tile Calorimeter, Software & Computing, LAr Calorimeter, Muon Spectrometer, DAQ, Electronics, and Physics & Detector Simulation (M. A. Parker, M. Tyndel, L. Rossi, H. Ten Kate, P. Le Du, N. Ellis, M. Nessi, A. Zaitsev, J. Knobloch, D. Fournier, D. Lissauer, C. Fabjan, H. Oberlack, G. Ciapetti, L. Mapelli, D. Froidevaux, H. Williams) H. F. Hoffmann, CERN-DG/DI

  15. H. F. Hoffmann, CERN-DG/DI

  16. CMS COLLABORATION RRB CMS-D 98-31
Memorandum of Understanding for Collaboration in the Construction of the CMS Detector between the EUROPEAN ORGANISATION FOR NUCLEAR RESEARCH and an Institution/Funding Agency of the CMS Collaboration
Preamble
(a) A group of Institutes from CERN Member and non-Member States, and CERN, has agreed to collaborate to form the CMS Collaboration (Annex 1). This Collaboration has proposed to CERN an experiment to study particle interactions at the highest possible energies and luminosities to be reached with the Large Hadron Collider (LHC). These Institutes have secured the support of their Funding Agencies to enable them to participate in the CMS Collaboration.
(b) Agreement to this Collaboration is effected through identical Memoranda of Understanding (hereafter referred to as MoU) between each Funding Agency or Institute, as appropriate, in the Collaboration and CERN, as the Host Laboratory. These MoUs define the Collaboration and its objectives, and the rights and obligations of the collaborating Institutes.
(c) On the basis of a Technical Proposal submitted in December 1994 (CERN/LHCC/94-38) and a detailed review of the scientific merits, the technological feasibility and estimates of the needed resources, the LHC Committee (LHCC) recommended approval of the experiment to the CERN Research Board, subject to a set of milestones to be met by the experiment in its initial phase (CERN/LHCC 95-76).
(d) Based on the recommendation by the LHCC and in agreement with the list of milestones, the Research Board recommended to the Director General of CERN to approve the project, together with plans, including milestones, leading to the sub-detector Technical Design Reports.
H. F. Hoffmann, CERN-DG/DI

  17. (e) The Director General accepted the Research Board recommendation and approved the project to build the detector for the CMS experiment within a cost ceiling not exceeding 475 MCHF (in 1995 prices).
(f) Before proceeding to the final construction phase, each sub-detector (cf. Article 4.1) will be subjected to a technical, financial and manpower review (CERN/DG/RB 95-234) by the LHCC based on the Technical Design Reports. This process will be completed during 1997 and 1998 for most of the sub-systems.
(g) A Resources Review Board (RRB) has been constituted which comprises the representatives of all CMS Funding Agencies and the managements of CERN and the CMS Collaboration. It is chaired by the CERN Director of Research. The role of the RRB includes:
· reaching agreement on the Memorandum of Understanding
· monitoring the Common Projects and the use of the Common Funds
· monitoring the general financial and manpower support
· reaching agreement on a maintenance and operation procedure and monitoring its functioning
· endorsing the annual construction and maintenance and operation budgets of the detector.
The management of the Collaboration reports regularly to the RRB on technical, managerial, financial and administrative matters, and on the composition of the Collaboration.
(h) Interim MoUs become obsolete.
(i) This MoU is not legally binding, but the Institutes and Funding Agencies recognize that the success of the Collaboration depends on all its members adhering to its provisions. Any default will be dealt with, in the first instance, by the Collaboration and if necessary then by the RRB.
H. F. Hoffmann, CERN-DG/DI

  18. ATLAS Organisation (as example) • Principles: • Democracy • Separation of policy-making and executive powers • Minimal formal organisation • Limited terms of office • Plenary Meeting: • Forum of the all-hands discussions, • all major decisions concerning physics objectives and results, hardware and software design, organisational matters must be discussed there • Collaboration Board: • Policy- and decision-making body with typical tasks: • Decisions on global detector design • Policy matters wrt official bodies • Financial and human resources • Elections • Organisation and membership H. F. Hoffmann, CERN-DG/DI

  19. Project Organisation (example ATLAS) • ATLAS project=organised sum of sub-projects and parts, conceived, designed and fabricated to a variety of habits, standards, cultures around the world • Engineering organisation -- "top cultural layer", common language and common "rules of the game" to permit "engineering communication" throughout the project • Project breakdown structure (product, assembly breakdowns) • Work packages and WP-descriptions • Schedules, milestones, reporting for project follow up • Quality assurance • Reviews at the various project stages like design, construction, assembly • Configuration management • Integration, mechanical, services, "environmental", accelerator • Safety (CERN-LHCC/99-01; ATLAS TDR 13, "Tech. Co-ordination"; 31-01-1999, 598 pages) H. F. Hoffmann, CERN-DG/DI

  20. From Particles to Petabytes: Challenges in High Throughput Computing. Example: the (Data-)Grid (EU project, NSF project, ...), a global particle physics project to make world-wide LHC computing possible H. F. Hoffmann, CERN-DG/DI

  21. On-line System • Multi-level trigger • Filter out less interesting events • Reduce data volume • 24 x 7 operation. Collision rate 40 MHz (~1000 TB/sec equivalent) --> Level 1, special hardware: 75 kHz (75 GB/sec), fully digitised --> Level 2, embedded processors: 5 kHz (5 GB/sec) --> Level 3, farm of commodity CPUs: 100 Hz (100 MB/sec) --> data recording & offline analysis (for comparison: digital telephone, 1-2 kB/sec) H. F. Hoffmann, CERN-DG/DI
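Read as a pipeline, each trigger level keeps only a small fraction of what the previous one delivers. A minimal sketch of that arithmetic, using the rates and bandwidths quoted on the slide (these are the slide's round numbers, not measured values):

```python
# Trigger chain reduction factors, from the rates/bandwidths on the slide.
stages = [
    ("collision rate",            40e6, 1000e12),  # 40 MHz, ~1000 TB/s equivalent
    ("level 1, special hardware", 75e3,    75e9),  # 75 kHz, 75 GB/s, fully digitised
    ("level 2, embedded CPUs",     5e3,     5e9),  # 5 kHz, 5 GB/s
    ("level 3, commodity farm",    1e2,   100e6),  # 100 Hz, 100 MB/s to storage
]

for (name, rate, bandwidth), (_, prev_rate, _) in zip(stages[1:], stages[:-1]):
    rejection = prev_rate / rate
    print(f"{name}: keeps 1 event in {rejection:,.0f}, "
          f"outputs {bandwidth / 1e9:g} GB/s")

overall = stages[0][1] / stages[-1][1]
print(f"overall: 1 crossing in {overall:,.0f} reaches offline analysis")
```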

  22. How Much Data is Involved? [Scatter plot: Level-1 trigger rate (Hz, 10^2-10^6) versus event size (bytes, 10^4-10^7) for LEP, UA1, H1, ZEUS, NA49, ALICE, CDF, CDF II, KLOE, HERA-B, LHCb, ATLAS and CMS. The LHC experiments combine a high Level-1 rate (up to ~1 MHz, compared on the slide to 1 billion people surfing the Web), high channel counts, high bandwidth (~500 Gbit/s) and a petabyte-scale data archive.] H. F. Hoffmann, CERN-DG/DI

  23. Complex Queries = More CPU Per Byte. Evolution of Computing Capacity? [Chart: planned CERN computing capacity in thousands of SPECint95 per year, 1997-2005, on a scale of 0-1'000, broken down into LHC, COMPASS and others; annotation: CERN 1999: 3.5K SI95, 900 CPUs.] H. F. Hoffmann, CERN-DG/DI

  24. CERN Computer Center Today… --> Commodity H. F. Hoffmann, CERN-DG/DI

  25. CERN Computer Center Today… • No longer aligned with supercomputing philosophies • Requires many small, independent problem solutions • "High Throughput Computing": processing "click-like" interactions in parallel • A marriage of supercomputer storage systems with supermarket commodity CPUs • Disk access layer (hw+sw) sandwiched in between • "Middleware" on the network layer important • Therefore, our natural affinity has shifted from supercomputers towards ISPs, e-commerce and data marketers H. F. Hoffmann, CERN-DG/DI
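A toy sketch of the "high throughput" pattern this slide describes: many small, mutually independent work items spread over commodity CPU cores, rather than one tightly coupled supercomputer job. The event contents and the per-event work function below are purely illustrative placeholders:

```python
from multiprocessing import Pool

def process_event(event_id):
    """Placeholder for one independent reconstruction/analysis task.

    Each event is self-contained, so workers need no communication with
    each other; this independence is what makes the problem 'high
    throughput' rather than 'high performance' in the supercomputing sense.
    """
    value = sum((event_id + i) ** 0.5 for i in range(1_000))  # stand-in for real work
    return event_id, value

if __name__ == "__main__":
    events = range(10_000)           # independent, "click-like" work items
    with Pool() as farm:             # one worker process per commodity CPU core
        results = farm.map(process_event, events)
    print(f"processed {len(results)} independent events")
```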

  26. Grids: the Next Generation Web • Web: uniform access to HTML documents (http://) • Grid: flexible, high-performance access to all significant resources: computers, data stores, software catalogs, sensor nets, colleagues, web sites • On-demand creation of powerful virtual computing and data systems H. F. Hoffmann, CERN-DG/DI

  27. LHC Vision: Data Grid Hierarchy. One bunch crossing per 25 ns; 100 triggers per second; each event is ~1 MByte in size. • Online system: ~PByte/sec at the experiment, ~100 MBytes/sec to the offline farm • Tier 0+1: experiment offline farm, CERN Computer Centre, >20 TIPS, HPSS mass storage • Tier 1 (~0.6-2.5 Gbits/sec links): regional centres with HPSS, e.g. FNAL, Italy, UK, Germany • Tier 2 centres (~2.5 Gbits/sec) • Tier 3 (~622 Mbits/sec): institutes, ~0.25 TIPS each; each institute has ~10 physicists working on one or more analysis "channels" • Tier 4 (100-1000 Mbits/sec): physics data caches and workstations H. F. Hoffmann, CERN-DG/DI
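The figures in this hierarchy fix the basic data volumes and transfer times. A hedged sketch of that arithmetic, in which the yearly running time (~10^7 s) is an assumption not stated on the slide:

```python
# Tier-hierarchy arithmetic from the figures on the slide.
event_size_bytes = 1e6        # ~1 MByte per event
trigger_rate_hz = 100         # 100 triggers per second recorded
tier1_link_bit_s = 2.5e9      # ~2.5 Gbit/s Tier-0 -> Tier-1 link
seconds_per_year = 1e7        # assumed effective running time per year

raw_rate = event_size_bytes * trigger_rate_hz            # ~100 MB/s, as on the slide
raw_per_year = raw_rate * seconds_per_year
print(f"raw data written per year ~ {raw_per_year / 1e15:.1f} PB")

# Time to re-export one day of raw data to a single Tier-1 centre
one_day_bytes = raw_rate * 86_400
transfer_hours = one_day_bytes * 8 / tier1_link_bit_s / 3600
print(f"one day of raw data over a 2.5 Gbit/s link ~ {transfer_hours:.0f} hours")
```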

  28. The Grid Middleware Services Concept • Standard services that • Provide uniform, high-level access to a wide range of resources (including networks) • Address interdomain issues: security, policy • Permit application-level management and monitoring of end-to-end performance • Broadly deployed, like Internet Protocols • Enabler of application-specific tools as well as applications themselves H. F. Hoffmann, CERN-DG/DI

  29. GEANT, necessary infrastructure • Minimum bandwidth of 2.5 Gbps between core nodes; the possibility of starting with some 10 Gbps (STM-64/OC-768c) circuits is not excluded • Connections to other world regions in principle via core nodes only; together, these will form a European Distributed Access (EDA) "point", conceptually similar to the STAR TAP H. F. Hoffmann, CERN-DG/DI

  30. The Web, a historical case study • Invented at CERN in 1989 as an application layer on top of the internet infrastructure • Development started in Europe (small) and in the US (big: >50 computer scientists initially for MOSAIC) • 80% of the most visited sites: US; <10%: Europe [chart: web site servers] H. F. Hoffmann, CERN-DG/DI

  31. Questions and Answers
Q1: Flexibility of institutional structure to allow researchers to change field. A1: Beginning discussion with astro- and space-physics; technology transfer --> funding for interdisciplinary activities? However, very clear mission, "mono-culture".
Q2: Exchange of knowledge across disciplines and institutions. A2: More fellowships in technological and interdisciplinary fields with specific funding: domain competence and add-on competence (KI, DataGrid).
Q3: Main obstacles to international co-operation. A3: Funding agencies, scientists and politicians still think "national". Employment conditions, spouses, schools --> keep national employment in Europe "in exchange", plus adjustment allowance, help for spouses to find appropriate work, international schools.
Q4: Information revolution. A4: Promote e-science, grids, . . .
H. F. Hoffmann, CERN-DG/DI

  32. International Collaboration • Objective: top scientific excellence in your particular field • Add resources ("resources": talent; particular knowledge, experience, methods; specific scientific apparatus, technologies; funds) • Create a complete, competitive, technological infrastructure in your particular field beyond local, regional, national means • Create a network of competence to solve detailed problems quickly and to prepare new means • Base your collaboration to a large degree on universities (talent) • Reach and sustain excellence by attracting the best people • "Nobody is perfect" - but a team can be H. F. Hoffmann, CERN-DG/DI
