This presentation describes the establishment of uMeteo-K, a Korea-based meteorological research cooperation system built on grid technology: the virtual laboratory concept, parallelized numerical modeling, and virtual data servers; Access Grid (AG) operation, hardware specifications, and project achievements; the operational differences between AG 1.2 and AG 2.0; seminars, tutorials, and meetings held under the project; and a cost-benefit analysis.
17th APAN Meetings: Research Collaborations Among Institutions Employing Grid Technology January 29, 2004 Hawaii Imin International Conference Center, University of Hawaii, Honolulu, Hawaii, USA Principal Investigator: Jai-Ho Oh (Pukyong National Univ., KOREA) Co-researchers: In-Sik Kang (Seoul National Univ., KOREA), Byung-Lyol Lee (NCAM/KMA, KOREA)
Main Goals Establishment of uMeteo-K, a ubiquitous Korean meteorological research cooperation system
About uMeteo-K The concept: a virtual lab for interdisciplinary meteorological research • Cooperative meteorological research environment (Access Grid) • Parallelized numerical weather/climate prediction modeling (Computational Grid) • Virtual server for large meteorological data (Data Grid) Grid technology is essential to accomplishing all three.
uMeteo-K Achievements (Apr. 2003 - present) • Establishment of a virtual cooperative research environment using the Access Grid • Pilot operation of a numerical weather prediction system under the Computational Grid environment • Establishment of the Data Grid and application of virtual scenarios for uMeteo-K
PKNU room node AG server spec. (component list; detailed values omitted from the extracted slide): CPUs, RAM, motherboard, FDD, SCSI-FDD, ODD, audio device, video device-1, video device-2, animation capture device
PKNU QuickBridge server installed • bridge.pknu.ac.kr (210.107.209.219): uMeteo-K bridge server • Supports PIG connections for sites without a multicast network • Hosts the uMeteo-K homepage server • Registered with the uMeteo-K virtual venues service (ANL)
uMeteo-K AG instruments - Camera: EVI-D30 - Microphones: cordless & boundary mics - Echo cancellation unit
uMeteo-K new AG configuration (network diagram): APAG/ANL, SNU, KISTI, KJIST, and KAIST reach the meteorological organization AG over the multicast network, while KMA, meteorological universities, and other users connect by unicast through the PKNU (Pukyong National Univ.) QuickBridge.
Remote PPT - Shares presentation data in connection with the AG - Data sharing through a simple virtual host configuration
VNC Server & Viewer • VNC stands for Virtual Network Computing. • VNC lets you view and interact with one computer (the "server") using a simple program (the "viewer") on another computer anywhere. • The two computers don't even have to run the same operating system.
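As a minimal usage sketch (the host name is hypothetical): start a VNC server on the remote machine, then attach a viewer from any other machine:

# on the remote machine (the "server"), create display :1
vncserver :1 -geometry 1024x768 -depth 24
# on the local machine (the "viewer")
vncviewer remote.pknu.ac.kr:1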
Samples of uMeteo-K AG operation - Seminars: 4 international + 14 domestic - Collaborative work and general use: 60
Meteorology Session at the 16th APAN Meeting Marriott Hotel, Busan, Korea <Aug. 26~27, 2003>
2003 Grid Forum Korea Winter meetings at Chosun Hotel, Seoul <Dec. 1~2, 2003>
Tera Computing and Network Demonstration, KISTI <Dec. 23, 2003>
Workshop on Grid Technology for Environmental study at PKNU <Jan. 8~10, 2004>
uMeteo-K AG 2.0 tutorials uMeteo-K meeting & tutorials at PKNU, Busan, Korea <Jan. 8~10, 2004>
Costs & Benefits Analysis for uMeteo-K AG (assumed value per session: domestic expert = KRW 100,000 per person; overseas expert = KRW 1,000,000 per person; participating researcher = KRW 50,000 per person)
Differences between AG 1.2 and AG 2.0 1. Easier Venue Server configuration and usage 2. Individual Venue Servers used instead of a PIG 3. Certificate Authority complexity 4. Data sharing, shared applications, and shared services strengthened 5. GUI 6. Unified management of Access Grid utilities (Data sharing / remote control diagram: in AG 1.2, clients access each other directly through a bridge and applications are managed through separate programs; in AG 2.0, clients connect through the Venue Server and all applications are managed through one program.)
Client Services & Applications • Enable face-to-face meeting activities • What can be done: • Sharing Data • Shared Applications • Text chat • Applications: • Shared Presentation • Shared Web browser • Whiteboard • Voting Tool • Question & Answer Tool • Shared Desktop Tool • Integrate legacy single-user apps • Communicates media addresses to node service
uMeteo-K CG Testbed • uMeteo-K computational grid test-bed (two clusters, 4 nodes each) • (Node specification table omitted from the extracted slide)
uMeteo-K CG Testbed Configuration (diagram): two 4-node (single-CPU) clusters, each with its own NAS storage server, linked by a 10/100 switch hub and 10/100 Ethernet to KOREN; supporting equipment includes a UPS, a monitoring system, and an electrometer.
Globus linkage between testbed clusters (Master A / Master B) • An independent SimpleCA is installed at each master node (CA-A, CA-B) • A group of slave nodes is controlled by each master node's PBS scheduler • Use case: simulating severe weather, Typhoon Maemi
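A minimal submission sketch for this layout (host name hypothetical), assuming Globus Toolkit 2.x with the PBS job manager:

grid-proxy-init   # obtain a proxy signed by the local SimpleCA
globusrun -o -r masterA.pknu.ac.kr/jobmanager-pbs '&(executable=/bin/date)'   # test job routed through Master A's PBS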
CA information of each cluster - CA-A subject : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus/CN=proxy issuer : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus identity : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus type : full legacy globus proxy strength : 512 bits path : /tmp/x509up_u533 timeleft : 7:52:58 - CA-B subject : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05/CN=proxy issuer : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05 identity : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05 type : full legacy globus proxy strength : 512 bits path : /tmp/x509up_u535 timeleft : 7:52:43
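These fields match the output of the GT2 proxy tools; a sketch of how such a listing is produced:

grid-proxy-init   # create a proxy certificate signed by the cluster's CA
grid-proxy-info   # print subject, issuer, identity, type, strength, path, and timeleft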
MM5 (Mesoscale Model Version 5) • Non-hydrostatic NWP model developed by PSU/NCAR • KMA's operational model for short-term forecasting • Parallel NWP model for real-time short-term forecasting
Precipitation, MSLP & Wind for 24 hrs: Sep. 12, 2003 09:00 LST ~ Sep. 13, 2003 09:00 LST (panels: Precipitation; MSLP and Wind)
Parallel MM5 Benchmarks with GLOBUS • Average job waiting time (including CA): 25 sec • Required time for a 3600 sec (1 hour) model integration • Required time for an 86400 sec (1 day) model integration (benchmark charts omitted from the extracted slide)
Connecting to the KISTI testbed
- CA information
subject : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy/CN=proxy
issuer : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy
type : full
strength : 512 bits
path : /tmp/x509up_p2358.fileEaIRDc.1
timeleft : 7:28:17
- Connecting to venus.gridcenter.or.kr using gsissh
gsissh -p 2222 venus.gridcenter.or.kr
- File transfer to venus.gridcenter.or.kr using gsincftpput and gsincftpget
gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/MM5.tar
gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/CReSS.tar
Running the weather numerical models • MM5 (Mesoscale Model Version 5) and CReSS (Cloud Resolving Storm Simulator) both compiled successfully. • CReSS ran successfully, but MM5 failed to run because the testbed does not provide some required libraries and routines. - RSL file to run CReSS:
+
( &(resourceManagerContact="venus.gridcenter.or.kr")
  (count=8)
  (label="subjob 0")
  (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0) (LD_LIBRARY_PATH /usr/local/gt2/lib/))
  (directory="/home/ngrid/ngrid017/CReSS1.3m")
  (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )
( &(resourceManagerContact="jupiter.gridcenter.or.kr")
  (count=8)
  (label="subjob 8")
  (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1) (LD_LIBRARY_PATH /usr/local/gt2/lib/))
  (directory="/home/ngrid/ngrid017/CReSS1.3m")
  (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )
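A minimal sketch of submitting this multi-request RSL (assuming it is saved as cress.rsl and a valid proxy already exists):

grid-proxy-init          # authenticate before submission
globusrun -f cress.rsl   # submit both DUROC subjobs described in the RSL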
CReSS (Cloud Resolving Storm Simulator) • Researched & developed at Nagoya Univ., Japan • Non-hydrostatic, compressible dynamic model • Meso-to-cloud-scale simulation system • Includes detailed cloud microphysical processes • Simulations take a long time, so parallel-processing performance is essential
Downburst Simulation for 21000 sec. < Wind vector in y section [m s-1] >
Parallel CReSS Benchmarks with GLOBUS • Required time for the 21000 sec model integration in the uMeteoK testbed • Required time for the 21000 sec model integration in the KISTI testbed
Data Grid in Globus Tools Data Transport & Access • GASS - simple, multi-protocol file transfer tools, integrated with GRAM • GridFTP - high-performance, reliable data transfer for modern WANs Data Replication and Management • Replica Catalog - a catalog service for keeping track of replicated datasets • Replica Management - services for creating and managing replicated datasets
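A minimal GridFTP transfer sketch (host and paths hypothetical), assuming GT 2.x, a running GridFTP server, and a valid proxy:

grid-proxy-init
globus-url-copy -p 4 gsiftp://cdldata.snu.ac.kr/data/mm5_output.tar file:///home/umeteok/mm5_output.tar
# -p 4 opens four parallel TCP streams for the transfer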
Data grid for climate prediction (data-transport diagram): within the atmospheric e-science Data Grid, model output, forecast output, and input data flow among KMA, the KMA supercomputer, SNU, PKNU, and the KISTI supercomputer, while model output and observation data from NCEP and NASA are brought in over wu-ftp connections.
Hardware structure of the Data-GRID: Intel dual-CPU Linux servers at Pukyong National University, Seoul National University, and the Korea Meteorological Administration, connected over the KOREN network - pknuGB01 (PKNU), cdldata and climate (SNU), neosky15 (KMA), and cpsdata - with disk RAID capacities of 2.3 TB, 1.8 TB, 1.2 TB, and 2 x 500 GB.
Overview of the Data Grid replica workflow (globus-replica-management, globus-replica-catalog, ldapadd): 1. Add the uMeteoK-DataGrid to the LDAP server (ldapadd) 2. Add the test-catalog to the LDAP server (ldapadd) 3. Create the Test-Collection in the test-catalog (globus-replica-catalog) 4. Register locations for the test-collection: snu-kma-pknu (globus-replica-management) 5. Create, register, and list file lists for all locations (globus-replica-management) 6. List all locations where given files can be found (globus-replica-management) 7. Copy files snu->kma or snu->pknu (globus-replica-management) 8. Delete the files & locations, then the Test-Collection & test-catalog (globus-replica-management)
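A minimal sketch of steps 1-2 (the DN, manager credentials, and objectClass value are hypothetical placeholders; the real schema comes from the GT2 replica catalog setup):

umeteok-datagrid.ldif (illustrative entry only):
dn: lc=uMeteoK-DataGrid, rc=test-catalog, dc=pknu, dc=ac, dc=kr
objectClass: GlobusReplicaLogicalCollection

ldapadd -x -D "cn=Manager,dc=pknu,dc=ac,dc=kr" -W -f umeteok-datagrid.ldif   # bind as the directory manager and add the entry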
Data grid middleware: GridFTP • Reliable and restartable data transfer • Parallel and striped data transfer • Automatic negotiation of TCP buffer/window sizes Tools • grid-proxy-init (Globus Toolkit) - user certification • GSINCFTP - secure FTP using the NCFTP programs, built on Globus Toolkit GridFTP • globus-url-copy - data transfer & access
Data Grid in Globus Tools - Globus Toolkit install process • Grid software: Globus Toolkit 2.4 • CA software: SimpleCA • Verification: grid-proxy-init, globus-personal-gatekeeper, globusrun (running /bin/date) The test of the Globus software (globusrun) was successful.
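A minimal sketch of that verification sequence, assuming GT 2.4 with SimpleCA already configured (the contact string is printed by the gatekeeper at startup):

grid-proxy-init                      # create a user proxy
globus-personal-gatekeeper -start    # start a private gatekeeper; note the printed contact string
globusrun -o -r "<contact-string>" '&(executable=/bin/date)'   # run the test job
globus-personal-gatekeeper -killall  # clean up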
Data Grid in Globus Tools - GSINCFTP testing (all transfers successful) • SNU -> SNU: cdldata -> climate • SNU -> KMA: cdldata -> neosky15 • SNU -> PKNU: cdldata -> pknuGB01