Parallel Programming in .NET Kevin Luty
Agenda • History of Parallelism • Benefits of Parallel Programming and Designs • What to Consider • Defining Types of Parallelism • Design Patterns and Practices • Tools • Supporting Libraries
History of Parallelism • Hardware • 1960s-1970s – parallel hardware appears in supercomputers • Early 1980s – a supercomputer built with 64 8086/8087 microprocessors • Late 1980s – clusters of machines provide computing power • Moore’s Law • Amdahl’s Law • Most recently – growth in the number of cores rather than clock speed
Moore’s Law (chart slide)
History of Parallelism • Software • One processor meant sequential programs • Few APIs promoted or made use of parallel programming • Early 1990s – no common standards for parallel programming • By 2000 – Message Passing Interface (MPI), POSIX threads (pthreads), Open Multiprocessing (OpenMP)
Benefits of Parallel Programming • Task Parallel Library (TPL) for .NET • Takes advantage of hardware capabilities • Easy to write (e.g., PLINQ) • You can use all the cores! • Timing • Tools available for debugging • Cost effective
What to Consider • Define: • What needs to be done? • What data is being modified? • What is the current state? • Cost • Synchronous vs. asynchronous execution • The answers determine which pattern fits best
Defining Types of Parallelism Parallelism – Programming with multiple threads, where it is expected that threads will execute at the same time on multiple processors. Its goal is to increase throughput.
Defining Types of Parallelism • Data Parallelism – When there is a lot of data, and the same operations must be performed on each piece of data • Task Parallelism – There are many different operations that can run simultaneously
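A minimal sketch of the distinction using the TPL (the array contents and console messages are hypothetical placeholders):

```csharp
using System;
using System.Threading.Tasks;

class ParallelismKinds
{
    static void Main()
    {
        int[] data = new int[1000];

        // Data parallelism: the same operation applied to every element.
        Parallel.For(0, data.Length, i =>
        {
            data[i] = i * i;
        });

        // Task parallelism: different operations running at the same time.
        Parallel.Invoke(
            () => Console.WriteLine("Loading configuration..."),
            () => Console.WriteLine("Warming up cache..."),
            () => Console.WriteLine("Pinging remote service..."));
    }
}
```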
Parallel Loops • Perform the same independent operation for each element • The most common problem is unnoticed dependencies between iterations • How to spot dependencies • Shared variables • Properties of a shared object
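As an illustration of an independent per-element operation, a small sketch (the square-root work stands in for any real per-element computation); the commented-out loop shows the shared-variable anti-pattern:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class IndependentLoop
{
    static void Main()
    {
        List<double> values = Enumerable.Range(1, 100).Select(i => (double)i).ToList();
        double[] results = new double[values.Count];

        // Each iteration touches only its own slot in 'results',
        // so iterations are independent and safe to run in parallel.
        Parallel.For(0, values.Count, i =>
        {
            results[i] = Math.Sqrt(values[i]);
        });

        // Anti-pattern: a shared variable creates a hidden dependency.
        // double sum = 0;
        // Parallel.For(0, values.Count, i => sum += values[i]); // race condition
    }
}
```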
Parallel Loops • Helpful TPL features • .Break and .Stop • CancellationToken • MaxDegreeOfParallelism • Exception handling • AggregateException
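A sketch of these features used together, assuming a hypothetical per-element body; the loop bounds and break index are arbitrary:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LoopControl
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var options = new ParallelOptions
        {
            CancellationToken = cts.Token,
            MaxDegreeOfParallelism = 2   // cap the number of concurrent workers
        };

        try
        {
            Parallel.For(0, 1000, options, (i, loopState) =>
            {
                if (i == 500)
                {
                    loopState.Break();   // finish iterations below 500, then stop
                }
                // ... per-element work ...
            });
        }
        catch (OperationCanceledException)
        {
            // Thrown if cts.Cancel() is called while the loop runs.
        }
        catch (AggregateException ex)
        {
            // Exceptions thrown inside iterations are collected here.
            foreach (var inner in ex.InnerExceptions)
                Console.WriteLine(inner.Message);
        }
    }
}
```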
Parallel Loops • Partitioning • Oversubscription and undersubscription
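A sketch of range partitioning with Partitioner.Create, which trades one delegate call per element for one per chunk (the array size and the Math.Sin work are placeholders):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class PartitionedLoop
{
    static void Main()
    {
        double[] data = new double[1000000];

        // Range partitioning hands each worker a chunk of indices instead of
        // one index at a time, cutting per-iteration delegate overhead.
        var ranges = Partitioner.Create(0, data.Length);

        Parallel.ForEach(ranges, range =>
        {
            for (int i = range.Item1; i < range.Item2; i++)
            {
                data[i] = Math.Sin(i);
            }
        });
    }
}
```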
Parallel Aggregation • Makes use of parallel loops • Makes use of unshared, local variables • Multiple inputs, single output
Parallel Aggregation • Simple example:
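The slide's original example is not preserved here; the following is a minimal sketch of the pattern, summing numbers with per-thread local subtotals and, for comparison, the equivalent PLINQ query:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ParallelAggregation
{
    static void Main()
    {
        int[] numbers = Enumerable.Range(1, 100000).ToArray();
        long total = 0;

        // Each worker accumulates into its own unshared local subtotal;
        // only the final merge touches the shared variable.
        Parallel.ForEach(
            numbers,
            () => 0L,                                  // per-thread initial value
            (n, loopState, subtotal) => subtotal + n,  // per-element work
            subtotal => Interlocked.Add(ref total, subtotal)); // merge step

        Console.WriteLine(total);

        // The same aggregation expressed with PLINQ:
        long plinqTotal = numbers.AsParallel().Sum(n => (long)n);
        Console.WriteLine(plinqTotal);
    }
}
```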
Parallel Tasks • Task parallelism • Also known as the Fork/Join pattern • Uses the System.Threading.Tasks namespace • TaskFactory • Invoke • Wait/WaitAny/WaitAll • StartNew • Handling exceptions
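A small fork/join sketch with TaskFactory.StartNew and Task.WaitAll/WaitAny (the console messages and task bodies are placeholders):

```csharp
using System;
using System.Threading.Tasks;

class ForkJoin
{
    static void Main()
    {
        // Fork: start several independent tasks.
        Task left = Task.Factory.StartNew(() => Console.WriteLine("left branch"));
        Task right = Task.Factory.StartNew(() => Console.WriteLine("right branch"));

        // Join: wait for both to finish before continuing.
        Task.WaitAll(left, right);

        // Task.WaitAny returns the index of the first task that completes.
        Task a = Task.Factory.StartNew(() => 1);
        Task b = Task.Factory.StartNew(() => 2);
        int first = Task.WaitAny(a, b);
        Console.WriteLine("Finished first: " + first);
    }
}
```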
Parallel Tasks • Handling exceptions • Exceptions are deferred until the task is observed (Wait/Result) • AggregateException • CancellationTokenSource • A task can also be cancelled from outside the task
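A sketch of deferred exceptions and cooperative cancellation; the InvalidOperationException and loop bounds are hypothetical, and which exception surfaces depends on timing:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class TaskExceptions
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        CancellationToken token = cts.Token;

        Task worker = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 100; i++)
            {
                token.ThrowIfCancellationRequested();  // cooperative cancellation
                if (i == 10) throw new InvalidOperationException("something broke");
            }
        }, token);

        cts.Cancel();   // request cancellation from outside the task

        try
        {
            worker.Wait();   // exceptions are deferred until the task is observed
        }
        catch (AggregateException ex)
        {
            // Wraps either the cancellation or the InvalidOperationException.
            foreach (var inner in ex.InnerExceptions)
                Console.WriteLine(inner.GetType().Name + ": " + inner.Message);
        }
    }
}
```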
Parallel Tasks • Work stealing – idle worker threads take queued work from the local queues of busy workers
Pipelines • Works like an assembly line: each stage processes items and passes them on to the next stage
Pipelines • Uses BlockingCollection&lt;T&gt; • CompleteAdding • Most problems in this design are due to starvation/blocking
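A minimal producer/consumer sketch with BlockingCollection&lt;T&gt;; the bounded capacity of 10 and the squaring work are arbitrary choices:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Pipeline
{
    static void Main()
    {
        // Bounded capacity keeps a fast producer from racing ahead of the consumer.
        var buffer = new BlockingCollection<int>(boundedCapacity: 10);

        Task producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 100; i++)
            {
                buffer.Add(i);             // blocks when the buffer is full
            }
            buffer.CompleteAdding();       // signals the next stage to drain and stop
        });

        Task consumer = Task.Factory.StartNew(() =>
        {
            // Blocks until items arrive and ends after CompleteAdding is called.
            foreach (int item in buffer.GetConsumingEnumerable())
            {
                Console.WriteLine(item * item);
            }
        });

        Task.WaitAll(producer, consumer);
    }
}
```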
Futures • Future dependencies • Wait/WaitAny/WaitAll • Sequential/Parallel Example
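A small sketch of a future and a dependent continuation using Task&lt;T&gt;; ExpensiveStep is a hypothetical stand-in for real work:

```csharp
using System;
using System.Threading.Tasks;

class Futures
{
    static void Main()
    {
        // A "future": the result is computed in the background and
        // consumed later, when it is actually needed.
        Task<int> futureA = Task.Factory.StartNew(() => ExpensiveStep(21));

        // A continuation that depends on futureA's result.
        Task<int> futureB = futureA.ContinueWith(antecedent => antecedent.Result * 2);

        // .Result blocks only if the value is not ready yet.
        Console.WriteLine(futureB.Result);   // 84
    }

    static int ExpensiveStep(int x)
    {
        return x + 21;   // stand-in for real work
    }
}
```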
Futures • Model-View-ViewModel (MVVM)
Dynamic Task Parallelism • Tasks are added continuously while the computation runs • Complete small tasks, then larger tasks • Examples: binary trees and sorting
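A sketch of dynamic task creation during a binary-tree walk; the TreeNode type and the tiny tree are illustrative only (for a tree this small the task overhead would outweigh any gain):

```csharp
using System;
using System.Threading.Tasks;

class TreeNode
{
    public int Value;
    public TreeNode Left, Right;
}

class DynamicTasks
{
    // New tasks are created as the traversal discovers more of the tree,
    // so the amount of parallel work grows dynamically at run time.
    static void Walk(TreeNode node, Action<int> visit)
    {
        if (node == null) return;

        var left = Task.Factory.StartNew(() => Walk(node.Left, visit));
        var right = Task.Factory.StartNew(() => Walk(node.Right, visit));

        visit(node.Value);
        Task.WaitAll(left, right);
    }

    static void Main()
    {
        var root = new TreeNode
        {
            Value = 1,
            Left = new TreeNode { Value = 2 },
            Right = new TreeNode { Value = 3 }
        };
        Walk(root, v => Console.WriteLine(v));
    }
}
```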
Tools • .NET Performance Profiler (Red-Gate) • JustTrace (Telerik) • GlowCode (Electric Software) • Performance Profiler (Visual Studio 2010 Ultimate) • Concurrency Visualizer • CPU Performance • Memory Management
Supporting Libraries for .NET • Task Parallel Library (TPL) • PLINQ (Parallel Language Integrated Query) • Easy to learn • Rx (Reactive Extensions)
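A minimal PLINQ sketch; the range and the even/square query are placeholders:

```csharp
using System;
using System.Linq;

class PlinqSample
{
    static void Main()
    {
        // AsParallel turns an ordinary LINQ query into a PLINQ query;
        // the runtime partitions the source across the available cores.
        var evenSquares = Enumerable.Range(1, 1000)
                                    .AsParallel()
                                    .Where(n => n % 2 == 0)
                                    .Select(n => n * n)
                                    .ToArray();

        Console.WriteLine(evenSquares.Length);   // 500
    }
}
```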
References
• Campbell, Colin, et al. Parallel Programming with Microsoft .NET: Design Patterns for Decomposition and Coordination on Multicore Architectures. Microsoft, 2010. Print.
• Data Parallelism (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Data_parallelism
• Hillar, Gaston C. Professional Parallel Programming with C#: Master Parallel Extensions with .NET 4. Indiana: Wiley, 2011. Print.
• Mattson, T. G., Sanders, B. A., and Massingill, B. L. Patterns for Parallel Programming. Addison-Wesley, 2004.
• Moore's Law (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Moore%27s_law
• Rx Extensions (n.d.). In Microsoft. Retrieved from http://msdn.microsoft.com/en-us/data/gg577609.aspx