
National Research Grid Initiative (NAREGI)

This presentation introduces Japan's National Research Grid Initiative (NAREGI) and the grid projects surrounding it. It describes the Information Technology Based Laboratory (ITBL), a virtual laboratory for large-scale computational science and engineering simulations that runs over Super SINET and provides a problem-solving environment, distributed supercomputing, collaborative engineering, and high-speed handling of massive data, and then presents NAREGI's goals, organization, middleware work packages, and testbed.


Presentation Transcript


  1. National Research Grid Initiative (NAREGI). Kenichi Miura, Ph.D., Project Leader, NAREGI Project; Professor, National Institute of Informatics; Fellow, Fujitsu Laboratories Limited. December 9, 2003.

  2. Grid-Related Projects in Japan: Information Technology Based Lab (ITBL); Super-SINET (NII); VizGrid (Prof. Matsuzawa, JAIST); BioGrid (Prof. Shimojo, Osaka-U); Campus Grid (Prof. Matsuoka, Titech); National Research Grid Initiative (NAREGI); Grid Technology Research Center (Mr. Sekiguchi); Japan Virtual Observatory (JVO).

  3. Information Technology Based Laboratory (ITBL). The goal of ITBL is to support a virtual laboratory for large-scale computational science and engineering simulations by facilitating interactions among heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. ITBL provides: ・a Problem Solving Environment ・Distributed Supercomputing ・Collaborative Engineering ・High-Speed Handling of Massive Data. ITBL is one of the e-Japan National Priority Programs and runs over Super SINET and SINET (NOCs). [Figure, Example 1: Material Design Simulation, combining a DB, first-principles calculation, and an MD simulation engine.] [Figure, Example 2: Aerospace Integrated Simulation of a reusable space transportation vehicle, covering Noise (CFD-Aero Acoustics), Separation (CFD-Flight Dynamics), Flight Stability (CFD-Control), Aerodynamic Heating (CFD-Thermal Structure), Airframe-Propulsion Interference (CFD-Chemical Reaction), Flutter (CFD-Structure), and Visualization.]

  4. Computational Resources of ITBL. Applications: Laser Analysis; ADVENTURE (large-scale finite element analysis package); SPEEDI/ITBL (Numerical Environment System); Information Sharing System for Bioinformatics; TUNAMI (tsunami prediction); ITBL/TOMBO (nano simulation); Nuclear Spallation Mercury Target (high-intensity proton accelerator). Sites connected: Japan Aerospace Exploration Agency; National Research Institute for Earth Science and Disaster Prevention; Japan Atomic Energy Research Institute (Tokai Research Establishment, Naka Fusion Research Establishment, Kansai Research Establishment, Center for Promotion of Computational Science and Engineering); RIKEN; Institute of Industrial Science, University of Tokyo; Tohoku University (Institute for Materials Research, Institute of Fluid Science, Disaster Control Research Center); Kyoto University Shimazaki Laboratory; Kyushu University Matsuo Laboratory; Japan Advanced Institute of Science and Technology. Networks: Super SINET (10 Gbps), SINET (1 Gbps). Machines: SX-6i, PC cluster, SV1ex, SR8000, VPP5000, PrimePower, SX-6, pSeries 690, SR2201, Altix 3800, T3E-1200E, Origin 3200, Origin 3800, SC/ES40.

  5. SuperSINET: All-Optical Production Research Network (separate funding) ■ 10 Gbps photonic backbone ■ GbEther bridges for peer connection ■ Very low latency: Titech-Tsukuba 3-4 ms round trip ■ Operation of Photonic Cross Connect (OXC) for fiber/wavelength switching ■ 6,000+ km dark fiber, 100+ end-to-end lambdas, and 300+ Gb/s ■ Operational since January 2002. [Map: application labels include Nano-Technology for GRID Application, OC-48+ transmission for Radio Telescope, DataGRID for High-Energy Science, Computational GRID and NAREGI, Bio-Informatics, NII Operation, and NII R&D; sites include Hokkaido U., Tohoku U., NIFS, Kyoto U., NAO, Waseda U., KEK, Osaka U., Tsukuba U., U. of Tokyo, Kyushu U., Doshisha U., Tokyo Institute of Tech., Nagoya U., ISAS, Okazaki Research Institutes, and NIG.]

  6. Network Topology Map of SuperSINET

  7. National Research Grid Initiative (NAREGI) Project: Overview • A new R&D project funded by MEXT (FY2003-FY2007) • ~$17M budget in FY2003 • One of the Japanese Government's grid computing projects • Collaboration of national labs, universities, and industry in the R&D activities (IT and nano-science applications) • Acquisition of computer resources underway (FY2003). MEXT: Ministry of Education, Culture, Sports, Science and Technology.

  8. National Research Grid Initiative (NAREGI) Project: Goals. (1) To develop a grid software system (R&D in grid middleware and upper layer) as the prototype of a future grid infrastructure for scientific research in Japan. (2) To provide a testbed to prove that a high-end grid computing environment (100+ Tflop/s expected by 2007) can be practically utilized in nano-science simulations over Super SINET. (3) To participate in international collaboration (U.S., Europe, Asia-Pacific). (4) To contribute to standardization activities, e.g., GGF.

  9. Participating Organizations: National Institute of Informatics (NII) (Center for Grid Research & Development); Institute for Molecular Science (IMS) (Computational Nano-science Center); universities and national laboratories in joint R&D (AIST, Titech, Osaka-U, Kyushu-U, Kyushu Inst. Tech., Utsunomiya-U, etc.); research collaboration (ITBL Project, national supercomputing centers, etc.); participating vendors (IT and chemicals/materials); Consortium for Promotion of Grid Applications in Industry.

  10. NAREGI Research Organization and Collaboration. [Organization chart] Under MEXT, the Center for Grid Research & Development (National Institute of Informatics), headed by Project Leader K. Miura (NII), carries out grid middleware and upper-layer R&D and grid networking R&D, guided by a Grid R&D Advisory Board and a Grid R&D Program Management Committee. It conducts joint research with AIST (GTRC), universities, and research labs (Titech, Osaka-U, Kyushu-U, etc., with group leaders setting technical requirements), coordinates network research over SuperSINET, and works with the national supercomputing centers on network technology refinement, operations coordination, and deployment. The Computational Nano-science Center (Institute for Molecular Science), with Nano-science Applications Director Dr. Hirata (IMS), utilizes the computing resources for R&D of grand-challenge grid applications (ISSP, Tohoku-U, AIST, etc., and industrial partners) jointly with the Consortium for Promotion of Grid Applications in Industry; the ITBL Project (JAERI) collaborates on operations. Computer resources (acquisition in FY2003): NII ~5 Tflop/s, IMS ~10 Tflop/s.

  11. NAREGI Software Stack (top to bottom): WP6: Grid-Enabled Applications; WP3: Grid Visualization, Grid PSE, Grid Workflow; WP2: Grid Programming (GridRPC, GridMPI); WP4: Packaging; WP1: Grid Monitoring & Accounting, SuperScheduler (Globus, Condor, UNICORE, OGSA), GridVM; WP5: High-Performance & Secure Grid Networking.

  12. R&D in Grid Software and Networking Area (Work Packages). WP-1: Lower and Middle-Tier Middleware for Resource Management: Matsuoka (Titech), Kohno (ECU), Aida (Titech). WP-2: Grid Programming Middleware: Sekiguchi (AIST), Ishikawa (AIST). WP-3: User-Level Grid Tools & PSE: Miura (NII), Sato (Tsukuba-U), Kawata (Utsunomiya-U). WP-4: Packaging and Configuration Management: Miura (NII). WP-5: Networking, Security & User Management: Shimojo (Osaka-U), Oie (Kyushu Tech.). WP-6: Grid-Enabling Tools for Nanoscience Applications: Aoyagi (Kyushu-U).

  13. WP-1: Lower and Middle-Tier Middleware for Resource Management. UNICORE/Condor/Globus interoperability: adoption of the ClassAds framework. Meta-scheduler: scheduling schema; workflow engine. Auditing and accounting: attaches to multiple monitoring frameworks; user and job auditing; CIM-based node information schema; accounting based on user/job audit.
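Since WP-1 adopts Condor's ClassAds framework for interoperability, a short illustration may help. In ClassAd matchmaking, jobs and resources advertise themselves symmetrically, and a match requires each side's Requirements expression to evaluate to true against the other's attributes. A minimal sketch (all attribute names and values below are invented for illustration, not NAREGI configuration):

    // Job request ClassAd (hypothetical values)
    [
      Type         = "Job";
      Owner        = "naregi_user";
      Requirements = other.Type == "Machine" &&
                     other.Arch == "INTEL" &&
                     other.Memory >= 2048;    // MB
      Rank         = other.Mips              // prefer faster machines
    ]

    // Machine offer ClassAd (hypothetical values)
    [
      Type         = "Machine";
      Arch         = "INTEL";
      Memory       = 4096;
      Mips         = 3000;
      Requirements = other.Type == "Job"
    ]

A meta-scheduler's scheduling policy can then be expressed as Rank expressions ordering the matched candidates.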

  14. WP-1: Lower and Middle-Tier Middleware for Resource Management (continued). Self-configurable monitoring; grid self-configuration management (including packaging). GridVM (Lightweight Grid Virtual Machine): support for co-scheduling; fine-grained resource control; node (IP) virtualization; interfacing with OGSA.

  15. WP-2: Grid Programming – GridRPC/Ninf-G2 (AIST/GTRC). GridRPC: • a programming model using RPC on the Grid • high-level, tailored for scientific computing (cf. SOAP-RPC) • GridRPC API standardization by the GGF GridRPC WG. Ninf-G Version 2: • a reference implementation of the GridRPC API • implemented on top of Globus Toolkit 2.0 (3.0 experimental) • provides C and Java APIs. [Diagram: the IDL compiler processes a numerical library's IDL file to generate a remote executable, and the interface information is stored as an LDIF file in MDS. Client-server flow: 1. interface request (client retrieves interface information from MDS); 2. interface reply; 3. invoke the remote executable via GRAM (fork); 4. the remote executable connects back to the client.]
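Ninf-G's client side follows the GGF GridRPC API, so a remote numerical routine is used much like an ordinary function once a handle is bound. A minimal C sketch using the standard API calls (the configuration file name and the function name "example/vec_add" are placeholders, not actual NAREGI artifacts; most error checks are omitted):

    #include <stdio.h>
    #include "grpc.h"                 /* GridRPC client API (Ninf-G) */

    #define N 1024

    int main(int argc, char *argv[])
    {
        grpc_function_handle_t handle;
        static double a[N], b[N], c[N];

        /* Read the client configuration (server addresses, etc.). */
        if (grpc_initialize("client.conf") != GRPC_NO_ERROR)
            return 1;

        /* Bind a handle to a remote executable registered via IDL. */
        grpc_function_handle_default(&handle, "example/vec_add");

        /* Synchronous remote call; arguments are marshalled
         * according to the IDL description of vec_add. */
        grpc_call(&handle, N, a, b, c);

        grpc_function_handle_destruct(&handle);
        grpc_finalize();
        return 0;
    }

Asynchronous variants (grpc_call_async plus grpc_wait) allow task-parallel farming of many such calls, the typical GridRPC pattern in parameter surveys.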

  16. WP-2: Grid Programming – GridMPI (AIST and U-Tokyo). GridMPI: • provides users an environment to run MPI applications efficiently on the Grid • flexible and heterogeneous process invocation on each compute node • a Grid ADI and latency-aware communication topology that optimize communication over non-uniform latencies and hide the differences among lower-level communication libraries • an extremely efficient implementation based on MPI on SCore (not MPICH-PM). [Diagram: layered architecture with the MPI core, IMPI, and RIM on top of the Grid ADI and latency-aware communication topology; process invocation via RSH, GRAM, or SSH; point-to-point communication over vendor MPI, TCP/IP, PMv2, and other communication libraries.]
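Because GridMPI sits below the standard MPI API, existing MPI codes need no source changes to span sites. A minimal, standard MPI-1 program of the kind that would run unmodified when built against GridMPI rather than a vendor MPI (the run-time site layout is a deployment detail, not shown):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size;
        double local, global;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* One value per process: coarse-grained collectives like this
         * amortize the wide-area latency between clusters, which is
         * what the latency-aware topology layer optimizes for. */
        local = (double)rank;
        MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM,
                      MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum over %d processes = %f\n", size, global);

        MPI_Finalize();
        return 0;
    }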

  17. WP-3: User-Level Grid Tools & PSE. Grid workflow: workflow language definition; GUI (task-flow representation). Visualization tools: real-time volume visualization on the Grid. PSE/portals: multiphysics/coupled simulation; application pool; collaboration with the Nano-science Applications Group. [Diagram: the Problem Solving Environment comprises a PSE portal, PSE toolkit, and PSE application pool, connected to the information service, workflow engine, application server, and super-scheduler.]

  18. WP-4: Packaging and Configuration Management. Collaboration with WP-1 on management issues; selection of packagers to use (RPM, GPT?); interface with autonomous configuration management (WP-1); test procedure and harness; testing infrastructure (cf. NSF NMI packaging and testing).

  19. WP-5: Network Measurement, Management & Control for Grid Environment. Traffic measurement on SuperSINET; optimal QoS routing based on user policies and network measurements; robust TCP/IP control for Grids; Grid CA/user grid account management and deployment. [Diagram: grid applications and the super-scheduler consult a Grid Network Management Server backed by user-policy and network-information databases; measurement entities perform multi-point real-time measurement, and network control entities provide dynamic bandwidth control and QoS routing over high-speed managed networks.]

  20. WP-6: Adaptation of Nano-science Applications to Grid Environment. Analysis of typical nanoscience applications: parallel structure; granularity; resource requirements; latency tolerance. Coupled simulation (e.g., FMO & RISM), in collaboration with IMS. RISM: Reference Interaction Site Model; FMO: Fragment Molecular Orbital method. [Diagram: RISM running on an SMP supercomputer and FMO running on a (grid) cluster exchange data through mediators: the solvent distribution, the solute structure, and the in-sphere correlation.]
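The FMO/RISM coupling on this slide is an iterate-to-self-consistency pattern: RISM produces a solvent distribution for the current solute structure, FMO relaxes the solute in that solvent, and the exchange repeats until the solute structure stops changing. A toy C sketch of just that control flow (scalar stand-ins for the real data structures and stub solvers; none of this is NAREGI code):

    #include <math.h>
    #include <stdio.h>

    typedef double SoluteStructure;  /* stand-in for solute geometry       */
    typedef double SolventDist;      /* stand-in for solvent distribution  */

    /* Stub solvers: in the real coupled simulation these are the RISM
     * code on an SMP machine and the FMO code on a grid cluster,
     * exchanging data through the mediators. */
    static SolventDist rism_solve(SoluteStructure u) { return 0.5 * u + 1.0; }
    static SoluteStructure fmo_solve(SolventDist s)  { return 0.5 * s; }

    int main(void)
    {
        SoluteStructure prev, cur = 0.0;     /* initial guess */
        int iter = 0;

        do {
            SolventDist s = rism_solve(cur); /* mediator: solute -> RISM */
            prev = cur;
            cur  = fmo_solve(s);             /* mediator: solvent -> FMO */
            iter++;
        } while (fabs(cur - prev) > 1e-10);  /* self-consistency test */

        printf("converged to %g after %d iterations\n", cur, iter);
        return 0;
    }

WP-6's analysis criteria map directly onto this loop: each solver call is long-running and coarse-grained, so the wide-area latency between the two resources is paid only once per iteration.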

  21. Development Concept. Grid middleware research feeds a testbed for prototyping (UNICORE, Condor, Globus); through successive integration steps, development of the operational environment leads to the final system (OGSA-compliant).

  22. Nano-science and Technology Applications Targeted. • Participating organizations: Institute for Molecular Science; Institute for Solid State Physics; AIST; Tohoku University; Kyoto University; industry (materials, nano-scale devices); Consortium for Promotion of Grid Applications in Industry. • Research topics and groups: electronic structure; magnetic properties; functional nano-molecules (CNT, fullerene, etc.); bio-molecules and molecular electronics; simulation software integration platform; etc.

  23. The NAREGI Phase 1 Testbed ($45M, 1Q2004): ~3,000 processors, ~17 TFlops overall. [Diagram: the Center for Grid R&D at NII (Tokyo), a ~5 TFlops software testbed, and the Computational Nano-science Center at IMS (Okazaki, ~400 km away), a ~10 TFlops application testbed, connected over SuperSINET (10 Gbps MPLS) together with the AIST SuperCluster (~11 TFlops), the Titech Campus Grid, the Osaka-U BioGrid, U-Tokyo, and small test application clusters (x6).] Note: NOT a production Grid system (cf. TeraGrid).

  24. Computer System for Grid Software Infrastructure R&D (at NII) (5 Tflops, 700 GB)

  25. Computer System for Grid Software Infrastructure R&D (at IMS) (10 Tflops, 5 TB)

  26. NII Center for Grid R&D (Jimbo-cho, Tokyo): Mitsui Office Bldg., 14th floor; 700 m2 of office space (100 m2 machine room). [Map: near Akihabara, the Imperial Palace, and Tokyo Station.]

  27. Summary. We regard the Grid as one of the fundamental technologies of the IT infrastructure in the 21st century. In the NAREGI project, seamless federation of heterogeneous resources is the primary objective. Computations in nano-science/technology applications over the Grid are to be promoted, including industrial participation. International cooperation is essential.
