
Parallel Programming in .NET



Presentation Transcript


  1. Parallel Programming in .NET Kevin Luty

  2. Agenda • History of Parallelism • Benefits of Parallel Programming and Designs • What to Consider • Defining Types of Parallelism • Design Patterns and Practices • Tools • Supporting Libraries

  3. History of Parallelism • Hardware • 1960s–1970s – parallel hardware appears in supercomputers • Early 1980s – supercomputer built with 64 8086/8087 microprocessors • Late 1980s – cluster computing power • Moore’s Law • Amdahl’s Law • Most recently – number of cores over clock speeds

  4. Moore’s Law

  5. Amdahl’s Law

  6. History of Parallelism • Hardware • 1960s–1970s – parallel hardware appears in supercomputers • Early 1980s – supercomputer built with 64 8086/8087 microprocessors • Late 1980s – cluster computing power • Moore’s Law • Amdahl’s Law • Most recently – number of cores over clock speeds

  7. History of Parallelism • Software • One processor means sequential programs • Few APIs that promote/make use of parallel programming • 1990s – no standards were created for parallel programming • By 2000 – Message Passing Interface (MPI), POSIX threads (pthreads), Open Multiprocessing (OpenMP)

  8. Benefits of Parallel Programming • Task Parallel Library (TPL) for .NET • Hardware capabilities • Easy to write, PLINQ • You can use all the cores! • Timing • Tools available for debugging • Cost Effective

  9. What To Consider • Define: • What needs to be done? • What data is being modified? • Current state? • Cost • Synchronous vs. asynchronous execution • Output: which pattern fits best

  10. Defining Types of Parallelism Parallelism – Programming with multiple threads, where it is expected that threads will execute at the same time on multiple processors. Its goal is to increase throughput.

  11. Defining Types of Parallelism • Data Parallelism – When there is a lot of data, and the same operations must be performed on each piece of data • Task Parallelism – There are many different operations that can run simultaneously

  12. Parallel Loops • Perform the same independent operation for each element • Most common problem is not noticing dependencies • How to notice dependencies • Shared variables • Using properties of an object

  13. Parallel Loops
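
The example code shown on this slide is not captured in the transcript. Below is a minimal sketch of independent-iteration loops with Parallel.For and Parallel.ForEach; the `data` array is an assumed placeholder input.

```csharp
using System;
using System.Threading.Tasks;

class ParallelLoopDemo
{
    static void Main()
    {
        double[] data = new double[1000000];   // placeholder input

        // Parallel.For: each index touches only its own element, so there are
        // no shared variables and no dependencies between iterations.
        Parallel.For(0, data.Length, i =>
        {
            data[i] = Math.Sqrt(i);
        });

        // Parallel.ForEach does the same over any IEnumerable<T>.
        Parallel.ForEach(data, value =>
        {
            Math.Sin(value);   // independent, side-effect-free work per element
        });
    }
}
```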

  14. Parallel Loops • Helpful Properties of TPL • .Break and .Stop • CancellationToken • MaxDegreeOfParallelism • Exception Handling • AggregateException
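
A hedged sketch of how these pieces fit together: ParallelOptions carries MaxDegreeOfParallelism and a CancellationToken, ParallelLoopState supplies Break/Stop, and exceptions thrown by iterations surface as an AggregateException. The loop bounds and break condition are invented for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LoopControlDemo
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = 2,     // cap the number of concurrent workers
            CancellationToken = cts.Token   // lets a caller cancel the whole loop
        };

        try
        {
            Parallel.For(0, 100, options, (i, loopState) =>
            {
                if (i == 50)
                    loopState.Break();      // finish lower iterations, skip higher ones
                // loopState.Stop() would abandon all remaining iterations instead
            });
        }
        catch (OperationCanceledException)
        {
            // thrown if cts.Cancel() is called while the loop is running
        }
        catch (AggregateException ae)
        {
            // exceptions from individual iterations are collected here
            foreach (var ex in ae.InnerExceptions)
                Console.WriteLine(ex.Message);
        }
    }
}
```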

  15. Parallel Loops • Partitioning • Oversubscription and Undersubscription
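
Partitioning hands each worker a chunk of the index range instead of one delegate call per element, which keeps the scheduler from being oversubscribed with tiny work items (and, with enough chunks, from being undersubscribed). A sketch using Partitioner.Create over an assumed `data` array:

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

class PartitionDemo
{
    static void DoubleAll(double[] data)
    {
        // Range partitioning: each worker receives a (fromInclusive, toExclusive)
        // tuple and loops over its chunk sequentially.
        var ranges = Partitioner.Create(0, data.Length);

        Parallel.ForEach(ranges, range =>
        {
            for (int i = range.Item1; i < range.Item2; i++)
                data[i] *= 2.0;
        });
    }
}
```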

  16. Parallel Aggregation • Making use of Parallel Loops • Makes use of unshared, local variables • Multiple inputs, single output

  17. Parallel Aggregation Simple Example:
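
The slide's example image is not in the transcript. The sketch below shows the shape of the pattern using the Parallel.ForEach overload with localInit/body/localFinally delegates: each worker sums into an unshared local subtotal, and the subtotals are merged exactly once per worker at the end.

```csharp
using System.Threading;
using System.Threading.Tasks;

class AggregationDemo
{
    static long ParallelSum(int[] values)
    {
        long total = 0;

        Parallel.ForEach(
            values,
            () => 0L,                                           // localInit: one subtotal per worker
            (value, loopState, subtotal) => subtotal + value,   // body: unshared local variable
            subtotal => Interlocked.Add(ref total, subtotal));  // localFinally: merge once per worker

        return total;
    }
}
```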

  18. Parallel Tasks • Task Parallelism • Also known as Fork/Join Pattern • Uses the System.Threading.Tasks namespace • TaskFactory • Invoke • Wait/WaitAny/WaitAll • StartNew • Handling Exceptions
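
A minimal fork/join sketch with Parallel.Invoke, and the equivalent written out with TaskFactory.StartNew and Task.WaitAll; DoLeft and DoRight are placeholder work items.

```csharp
using System.Threading.Tasks;

class ForkJoinDemo
{
    static void DoLeft()  { /* independent work */ }
    static void DoRight() { /* independent work */ }

    static void Main()
    {
        // Fork/join in one call: both delegates run in parallel and
        // Invoke returns only when every one of them has finished.
        Parallel.Invoke(DoLeft, DoRight);

        // The same pattern written out with the task factory.
        Task left  = Task.Factory.StartNew(DoLeft);
        Task right = Task.Factory.StartNew(DoRight);
        Task.WaitAll(left, right);   // Wait and WaitAny are the other variants
    }
}
```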

  19. Parallel Tasks • Handling Exceptions • Exceptions are deferred until the Task is waited on • AggregateException • CancellationTokenSource • Cancellation can also be requested from outside the task
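
A hedged sketch of deferred exceptions and cancellation: the exception thrown inside the task is only observed when the caller waits on it, and a CancellationTokenSource outside the task requests the cancellation. The loop body is invented for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class TaskExceptionDemo
{
    static void Main()
    {
        var cts = new CancellationTokenSource();

        Task worker = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 1000; i++)
            {
                cts.Token.ThrowIfCancellationRequested();   // cooperative cancellation
                if (i == 500) throw new InvalidOperationException("boom");
            }
        }, cts.Token);

        cts.Cancel();       // cancellation is requested from outside the task

        try
        {
            worker.Wait();  // exceptions are deferred until the task is observed here
        }
        catch (AggregateException ae)
        {
            foreach (var ex in ae.Flatten().InnerExceptions)
                Console.WriteLine(ex.GetType().Name);   // TaskCanceledException or InvalidOperationException
        }
    }
}
```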

  20. Parallel Tasks • Work Stealing

  21. Pipelines • Can be pictured as an assembly line

  22. Pipelines • Uses BlockingCollection<T> • CompleteAdding • Most problems in this design due to starvation/blocking
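
A minimal two-stage pipeline sketch: the producer fills a bounded BlockingCollection<T> and calls CompleteAdding so that the consumer's loop can drain the buffer and terminate instead of blocking forever; the stage contents are placeholders.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class PipelineDemo
{
    static void Main()
    {
        var buffer = new BlockingCollection<int>(boundedCapacity: 10);

        // Producer stage: blocks when the buffer is full, signals completion when done.
        Task producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 100; i++)
                buffer.Add(i);
            buffer.CompleteAdding();   // without this, the consumer starves forever
        });

        // Consumer stage: blocks when the buffer is empty, exits once adding is complete.
        Task consumer = Task.Factory.StartNew(() =>
        {
            foreach (int item in buffer.GetConsumingEnumerable())
                Console.WriteLine(item * item);
        });

        Task.WaitAll(producer, consumer);
    }
}
```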

  23. Futures • Future dependencies • Wait/WaitAny/WaitAll • Sequential/Parallel Example

  24. Futures
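
In the TPL a future is a Task<TResult>. A sketch of the sequential version next to a version where one subcomputation runs as a future while another runs on the current thread; F1, F2, and F3 are placeholder functions.

```csharp
using System.Threading.Tasks;

class FuturesDemo
{
    static int F1(int x) { return x + 1; }   // placeholder computations
    static int F2(int x) { return x * 2; }
    static int F3(int b, int c) { return b + c; }

    static int Sequential(int a)
    {
        int b = F1(a);
        int c = F2(a);
        return F3(b, c);
    }

    static int WithFutures(int a)
    {
        // F1 runs as a future while F2 runs on the current thread.
        // Reading futureB.Result waits only if F1 has not finished yet,
        // and it rethrows any exception F1 produced.
        Task<int> futureB = Task.Factory.StartNew(() => F1(a));
        int c = F2(a);
        return F3(futureB.Result, c);
    }
}
```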

  25. Futures • Model-View-ViewModel

  26. Dynamic Task Parallelism • Tasks are added continuously as the computation unfolds • Complete small tasks, then larger tasks • Binary trees and sorting
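
A sketch of dynamic task parallelism over a binary tree: visiting a node forks new tasks for its children, so tasks keep being added as the walk discovers more work; the Tree type and visit delegate are placeholders.

```csharp
using System;
using System.Threading.Tasks;

class Tree
{
    public int Value;
    public Tree Left, Right;
}

class DynamicTaskDemo
{
    // Each call may spawn two more tasks, one per child subtree.
    static void Walk(Tree node, Action<int> visit)
    {
        if (node == null) return;
        Task left  = Task.Factory.StartNew(() => Walk(node.Left, visit));
        Task right = Task.Factory.StartNew(() => Walk(node.Right, visit));
        visit(node.Value);
        Task.WaitAll(left, right);   // join before returning to the caller
    }
}
```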

  27. Tools • .NET Performance Profiler (Red-Gate) • JustTrace (Telerik) • GlowCode (Electric Software) • Performance Profiler (Visual Studio 2010 Ultimate) • Concurrency Visualizer • CPU Performance • Memory Management

  28. Supporting Libraries for .NET • Task Parallel Library • PLINQ (Parallel Language Integrated Query) • Easy to learn • Rx (Reactive Extensions)
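
A minimal PLINQ sketch: AsParallel() opts an ordinary LINQ query into parallel execution, and the runtime partitions the source, runs the operators on multiple cores, and merges the results.

```csharp
using System.Linq;

class PlinqDemo
{
    static int[] SquaresOfEvens(int[] numbers)
    {
        return numbers.AsParallel()
                      .Where(n => n % 2 == 0)
                      .Select(n => n * n)
                      .ToArray();
    }
}
```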

  29. References
  Campbell, Colin, et al. Parallel Programming with Microsoft .NET: Design Patterns for Decomposition and Coordination on Multicore Architectures. Microsoft, 2010. Print.
  Data Parallelism (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Data_parallelism
  Hillar, Gaston C. Professional Parallel Programming with C#: Master Parallel Extensions with .NET 4. Indiana: Wiley, 2011. Print.
  Mattson, T. G., Sanders, B. A., and Massingill, B. L. Patterns for Parallel Programming. Addison-Wesley, 2004.
  Moore’s Law (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Moore%27s_law
  Rx Extensions (n.d.). In Microsoft. Retrieved from http://msdn.microsoft.com/en-us/data/gg577609.aspx
