Parallel Computing Using MPI • Parallel Computing • MPI • CS Lab setup • Simple MPI program • MPI data types and user-defined MPI data types
Parallel Computing • Traditional computing is sequential: only one instruction can be executed at any given moment in time. • Parallel computing uses multiple computers (or multiple processors on the same machine) to execute multiple instructions simultaneously.
Parallel Computing • Motivations • Solve a problem more quickly • Solve very large problems (scalability issue) • Save money by using a cluster of cheap computers instead of a supercomputer
Parallel Computing • Cluster Computing: multiple independent computers combined into a parallel computing unit • Hardware is readily available • Nodes of the cluster communicate by exchanging messages (message passing) • Two free message-passing packages • MPI: Message Passing Interface • PVM: Parallel Virtual Machine
MPI • Provides library functions for passing messages in a parallel environment • Fortran, C, and C++ programs are written as usual and then linked against the MPI library
MPI • Software packages that implement MPI for parallel computing • MPICH: free, portable implementation of MPI • MPICH2: an all-new implementation of MPI • LAM/MPI: another implementation of MPI
CS Lab Setup • The Lazar lab is set up with MPICH, so each Linux machine can be used as a node in a parallel computing cluster
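A hypothetical session on the lab machines might look like the following; the hostnames (lab01 through lab04) and the source file name are examples only, not the lab's actual configuration.

```shell
# List the Linux machines to use as nodes (one hostname per line):
cat > machines <<EOF
lab01
lab02
lab03
lab04
EOF

# Compile with MPICH's wrapper compiler, which adds the MPI
# headers and libraries automatically:
mpicc hello.c -o hello

# Launch 4 processes spread across the listed nodes:
mpirun -np 4 -machinefile machines ./hello
```

The machinefile is what tells MPICH which cluster nodes to start processes on; without it, all processes run on the local machine.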