CCS Overview Rene Salmon Center for Computational Science
Introduction • What is CCS? • June 2001 • Establish new collaborations • Infrastructure to exchange ideas • Interdisciplinary research • Computational science research • High end workstations • HPC Hardware and Software
Software • Visualization • Tecplot, AVS • Compilers • SGI, Absoft • Math Libraries • IMSL, BLAS • Finite Element Modeling • ABAQUS, PATRAN • Molecular Dynamics • NAMD, Gaussian, Amber, VMD • Matlab, Mathematica
Multiprocessor Machines • SGI: 4 compute nodes (32 CPUs), 700 MHz MIPS R16000, 8 GB RAM, 3.2 GB/s peak memory bandwidth, NUMAlink interconnect (1.6 GB/s each direction), 1 TB SGI storage array • Linux Cluster: 34 nodes (68 CPUs), 2.4 GHz AMD Opteron, 68 GB RAM, 12.8 GB/s memory bandwidth, Gigabit Ethernet interconnect (85 MB/s), 1 TB storage array
Multiprocessor Machines • Single OS: easier to program, OpenMP inter-process communication • Multiple OS: harder to program, MPI inter-process communication
Multiprocessor Machines • Proprietary (SGI): high cost, complex hardware, support contract, proprietary software (Irix, compilers) • Commodity (Linux cluster): low cost, commodity parts, community-driven support, open-source software (Linux, compilers)
Parallel Programming • OpenMP (Open specifications for Multiprocessing): compiler directives and library, shared memory, thread-based, process synchronization • MPI (Message Passing Interface): libraries, distributed memory, process-based, process synchronization, master/slave model
Processes and Threads • Threads share a single address space and can access one another's variables • Separate processes must use interprocess communication, which costs time & memory
OpenMP
program foobar
  …
  !$omp parallel do
  do i = 1, n
     z(i) = a*x(i) + b
  end do
  !$omp end parallel do
end program foobar
MPI
program foobar
  use mpi
  …
  call MPI_INIT(…)
  call MPI_COMM_RANK(…, myid, …)
  call MPI_COMM_SIZE(…, numprocs, …)
  data_chunk = SIZE_X/numprocs
  j = 1 + myid*data_chunk
  n = j + (data_chunk - 1)
  x_local = x(j:n)
  do i = 1, data_chunk
     z_local(i) = a*x_local(i) + b
  end do
  call MPI_GATHER(z_local, …, z, …)
  call MPI_FINALIZE(…)
end program foobar
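A sketch of how programs like the two examples above are typically built and launched on a cluster; the compiler names, file names, and CPU counts here are illustrative assumptions, not CCS-specific instructions:

```shell
# Build the OpenMP example (the flag varies by compiler; -fopenmp is GCC's).
gfortran -fopenmp foobar_omp.f90 -o foobar_omp
OMP_NUM_THREADS=4 ./foobar_omp      # shared memory: 4 threads on one node

# Build and launch the MPI example with the usual wrapper and launcher.
mpif90 foobar_mpi.f90 -o foobar_mpi
mpirun -np 8 ./foobar_mpi           # distributed memory: 8 processes across nodes
```

Note the difference in the run step: the OpenMP binary is an ordinary program whose thread count is set by an environment variable, while the MPI binary must be started through a launcher that creates one process per CPU.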
Queuing System PBSPro • Resource manager • Schedules/decides when jobs run • Allocates resources to jobs • Full featured • Supports preemption • Priorities • Supports parallel and single-CPU jobs
CCS Queuing System • Q1 (lowest priority): access for the entire Tulane community, for research purposes only • Q2: contribute intellectually to the leadership of CCS by giving (or arranging) seminars and serving on CCS committees • Q3: financial support from individual grants, covering personnel, computer/software purchases, and computer/software maintenance • Q4 (highest priority): faculty and students with CCS-funded projects
Grid Computing • Login to Server • Compile • Move or prepare data • Create and submit Job script to queue • Monitor status • Get results • Move data • Visualization
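The submission steps above center on a job script handed to the queuing system. A minimal sketch of such a PBS script follows; the job name, queue name, resource requests, and executable are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical PBS job script; directives below are read by the scheduler.
#PBS -N foobar              # job name (placeholder)
#PBS -q workq               # queue to submit to (hypothetical name)
#PBS -l nodes=4:ppn=2       # request 4 nodes with 2 CPUs each
#PBS -l walltime=01:00:00   # maximum run time of 1 hour

cd $PBS_O_WORKDIR           # start in the directory qsub was called from
mpirun -np 8 ./foobar       # launch the MPI executable on the 8 allocated CPUs
```

Submit the script with `qsub`, monitor it with `qstat`, and collect the output files once the job completes.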
Grids Nationally • National Lambda Rail (NLR) • Nationwide optical fiber infrastructure • Open Science Grid • DOE and NSF roadmap • Joins U.S. labs and universities into a single, managed grid • Goal: build a national grid infrastructure for the benefit of scientific applications
LONI: Louisiana Optical Network Initiative • In March 2004, the Louisiana Board of Regents, Tulane, and LSU secured NLR membership • The state allocated $40 million to create and maintain LONI • What is LONI? • A statewide optical network • Inter-connects universities and colleges • Takes advantage of NLR access • 40 Gbps • 1000 times faster
LONI: Louisiana Optical Network Initiative • LONI Members • Tulane University, Tulane HSC • LSU, LSU Medical Centers in Shreveport and New Orleans • Louisiana Tech University • University of Louisiana at Lafayette • Southern University • University of New Orleans
LONI: Louisiana Optical Network Initiative • Provide NLR access • High-quality, high-definition videoconferencing • High-speed access to data • Remote visualization • Remote instrumentation • High Performance Computing • Collaborative research projects and grants • Attract better research faculty • Increased potential of receiving national and international grant funding
LONI: Louisiana Optical Network Initiative • End of summer 2005 • $500,000 high-performance computer • All connected via LONI • Tulane pilot grid • SURA test bed • Experience • Grid research
Accessing Resources • Go to website: http://www.ccs.tulane.edu • Resource request form • Access local CCS and national Grid resources