Contemporary Languages in Parallel Computing
Raymond Hummel
Standard Languages
• Distributed Memory Multiprocessors
  • MPI
• Shared Memory Multiprocessors
  • OpenMP
  • pthreads
• Graphics Processing Units
  • CUDA
  • OpenCL
MPI
• Stands for: Message Passing Interface
• Pros
  • Extremely Scalable
  • Portable
  • Can Harness a Multitude of Hardware Setups
• Cons
  • Complicated Software
  • Complicated Hardware
  • Complicated Setup
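To make the message-passing model concrete, here is a minimal MPI "hello world" in C. This is a sketch only: the compiler wrapper (mpicc) and launcher (mpirun) names vary by installation, and real programs would check return codes.

```c
/* Minimal MPI sketch: each process reports its rank.
   Build with an MPI wrapper, e.g. mpicc hello.c -o hello,
   and launch with e.g. mpirun -np 4 ./hello. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);               /* start the MPI runtime */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total process count */
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();                       /* shut the runtime down */
    return 0;
}
```

Each rank is a separate OS process, possibly on a separate machine, which is what makes MPI scalable and what makes the software and hardware setup complicated.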
OpenMP
• Stands for: Open Multi-Processing
• Pros
  • Incremental Parallelization
  • Fairly Portable
  • Simple Software
• Cons
  • Limited Use-Case
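A sketch of the incremental style described above: the loop is ordinary serial C, and the single pragma (plus a compiler flag such as gcc's -fopenmp) is all that parallelizes it.

```c
/* Incremental parallelization sketch: delete the pragma and this
   is plain serial C. Build with e.g. gcc -fopenmp sum.c. */
#include <omp.h>
#include <stdio.h>

int main(void)
{
    double sum = 0.0;
    /* reduction(+:sum) gives each thread a private partial sum
       that OpenMP combines when the loop ends. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < 1000000; i++) {
        sum += i * 0.5;
    }
    printf("sum = %f (up to %d threads)\n", sum, omp_get_max_threads());
    return 0;
}
```

The limited use-case follows from the model: OpenMP only helps where threads can share one memory space, i.e. within a single machine.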
POSIX Threads
• Stands for: Portable Operating System Interface Threads
• Pros
  • Portable
  • Fine-Grained Control
• Cons
  • All-or-Nothing
  • Complicated Software
  • Limited Use-Case
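A minimal create/join sketch in C showing the explicit control pthreads gives; the worker function and thread count here are arbitrary illustrations. Unlike OpenMP's one-pragma approach, every thread's lifetime is managed by hand, which is the "all-or-nothing" cost.

```c
/* Explicit thread creation and joining. Build with e.g. gcc -pthread. */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4

static void *worker(void *arg)
{
    long id = (long)arg;              /* thread index, passed by value */
    printf("thread %ld running\n", id);
    return NULL;
}

int main(void)
{
    pthread_t threads[NTHREADS];
    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(threads[i], NULL);   /* wait for each to finish */
    return 0;
}
```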
CUDA
• Stands for: Compute Unified Device Architecture
• Pros
  • Manufacturer Support
  • Low-Level Hardware Access
• Cons
  • Limited Use-Case
  • Only Compatible with NVIDIA Hardware
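A minimal vector-add sketch in CUDA C++, compiled with nvcc. Unified memory (cudaMallocManaged) is used to keep the sketch short; explicit cudaMalloc/cudaMemcpy transfers are the more traditional route, and error checking is omitted throughout.

```cuda
#include <cstdio>

__global__ void add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  /* global thread id */
    if (i < n) c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    /* Unified memory is visible to both CPU and GPU. */
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    /* Launch a grid of 256-thread blocks covering all n elements. */
    add<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);        /* expect 3.0 */
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```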
OpenCL
• Stands for: Open Computing Language
• Pros
  • Portability
  • Heterogeneous Platform
  • Works with All Major Manufacturers
• Cons
  • Complicated Software
  • Special Tuning Required
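The host-side sketch below, even with all error checking stripped, illustrates the "Complicated Software" point. The kernel string is compiled at runtime for whichever device is found, which is the source of both the portability and the tuning burden. A sketch only, not production code; link with e.g. -lOpenCL.

```c
#include <CL/cl.h>
#include <stdio.h>

/* Kernel source, compiled at runtime for the discovered device. */
static const char *src =
    "__kernel void square(__global float *v) {"
    "    int i = get_global_id(0);"
    "    v[i] = v[i] * v[i];"
    "}";

int main(void)
{
    float data[16];
    for (int i = 0; i < 16; i++) data[i] = (float)i;

    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "square", NULL);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    clSetKernelArg(k, 0, sizeof(cl_mem), &buf);

    size_t global = 16;   /* one work-item per element */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[3] = %f\n", data[3]);  /* expect 9.0 */
    return 0;
}
```

Compare this with the CUDA sketch above: the same amount of real work takes several times the host code, in exchange for running on any vendor's hardware.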
Developing Languages
• D
• Rust
• Harlan
D
• Performance of Compiled Languages
• Memory Safety
• Expressiveness of Dynamic Languages
• Includes a Concurrency-Aware Type System
• Nearing Maturity
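A small sketch of one way D exposes parallelism, via the standard library's std.parallelism module: turning a serial foreach parallel is a one-word change. Run with e.g. rdmd squares.d.

```d
import std.parallelism;
import std.stdio;

void main()
{
    auto squares = new double[](1_000);
    // parallel() distributes iterations across a task pool;
    // each index is written by exactly one thread.
    foreach (i, ref x; parallel(squares))
        x = cast(double) i * i;
    writeln(squares[0 .. 5]);
}
```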
Rust
• Designed for the Creation of Large Client-Server Programs on the Internet
• Safety
• Memory Layout
• Concurrency
• Major Changes Still Occurring
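A sketch of the concurrency-safety point in current Rust syntax (the language is still changing, so details may shift): shared mutable state must be reached through types like Arc and Mutex, and removing either one turns a potential data race into a compile-time error.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Arc shares ownership across threads; Mutex guards the data.
    let counter = Arc::new(Mutex::new(0u32));
    let mut handles = Vec::new();

    for _ in 0..4 {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            // The lock is the only way to reach the shared value.
            *counter.lock().unwrap() += 1;
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    println!("count = {}", *counter.lock().unwrap());
}
```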
Harlan
• Experimental Language
• Based on Scheme
• Designed to Take Care of Boilerplate for GPU Programming
• Could be expanded to include automatic scheduling for both CPU and GPU, depending on available resources