Multi-Layer Perceptron On A GPU
Scott Finley, ECE 539, Fall 2008, UW-Madison
General Purpose GPU
• Modern GPUs have hundreds of "stream processors"
• Can now be used for non-graphics computing
• NVIDIA CUDA (used for this project; a minimal kernel sketch follows)
• OpenCL
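To give a concrete sense of the CUDA programming model, here is a minimal sketch (not from the original project) of a kernel that scales a vector, with each element handled by one GPU thread; the kernel name and sizes are arbitrary.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Each thread handles one element; the hardware schedules the threads
// across the GPU's stream processors in parallel.
__global__ void scaleVector(float *x, float a, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        x[i] *= a;
}

int main()
{
    const int n = 1 << 20;
    float *d_x;
    cudaMalloc(&d_x, n * sizeof(float));
    cudaMemset(d_x, 0, n * sizeof(float));

    // Launch enough 256-thread blocks to cover all n elements.
    int blocks = (n + 255) / 256;
    scaleVector<<<blocks, 256>>>(d_x, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d_x);
    printf("done\n");
    return 0;
}
```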
Three MLP Implementations
• Basic Linear Algebra Subprograms (BLAS): CPU-only
• NVIDIA's cuBLAS library
  • No explicit GPU use; the library uses the GPU "under the hood"
  • Many copies of data from CPU to GPU
• cuBLAS with CUDA: same cuBLAS use as above, with non-BLAS operations done in CUDA (see the sketch below)
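As a rough illustration of the third approach, the sketch below (assumed, not taken from the project code) runs the forward pass of one fully connected layer: the weight-times-input product goes through cuBLAS SGEMM, while the sigmoid activation, which has no BLAS equivalent, is applied by a small CUDA kernel. It uses the current cuBLAS v2 API with column-major storage, omits the bias term, and all buffer names (dW, dX, dZ) are hypothetical.

```cuda
#include <cublas_v2.h>
#include <cuda_runtime.h>

// Non-BLAS part of the layer: apply the logistic sigmoid elementwise.
__global__ void sigmoidKernel(float *z, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        z[i] = 1.0f / (1.0f + expf(-z[i]));
}

// Forward pass of one fully connected layer:
//   Z = W * X        (cuBLAS SGEMM, column-major)
//   A = sigmoid(Z)   (custom CUDA kernel)
// dW: nOut x nIn weights, dX: nIn x batch inputs, dZ: nOut x batch outputs,
// all assumed to be resident in GPU memory already.
void forwardLayer(cublasHandle_t handle,
                  const float *dW, const float *dX, float *dZ,
                  int nOut, int nIn, int batch)
{
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                nOut, batch, nIn,
                &alpha, dW, nOut, dX, nIn,
                &beta, dZ, nOut);

    int n = nOut * batch;
    int blocks = (n + 255) / 256;
    sigmoidKernel<<<blocks, 256>>>(dZ, n);
}
```

Keeping the intermediate Z entirely on the GPU is what distinguishes this version from calling cuBLAS alone, which would otherwise force extra CPU-GPU copies around each non-BLAS step.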
Classifying Forestry Data
• Data from the US Forest Service
• Large feature vectors: 54 features
• Large number of training samples: 500 per epoch
• Two hidden layers; number of neurons per layer varied (see the allocation sketch below)
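The slide fixes only some of the dimensions (54 inputs, two hidden layers of varying width); the class count and all function and variable names below are assumptions. A minimal sketch of how the per-layer weight matrices might be sized and allocated on the GPU:

```cuda
#include <cuda_runtime.h>
#include <vector>

// Hypothetical layer sizes for the forestry network described above:
// 54 input features, two hidden layers of configurable width, and
// nClasses output neurons (the class count is not given on the slide).
void allocateWeights(int hiddenWidth, int nClasses,
                     std::vector<float*> &dWeights)
{
    int sizes[] = { 54, hiddenWidth, hiddenWidth, nClasses };

    // One weight matrix per layer transition, stored on the GPU.
    for (int l = 0; l < 3; ++l) {
        float *dW = nullptr;
        size_t bytes = sizeof(float) * sizes[l + 1] * sizes[l];
        cudaMalloc(&dW, bytes);
        cudaMemset(dW, 0, bytes);   // real code would load trained weights
        dWeights.push_back(dW);
    }
}
```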
Conclusion
• The GPU is a very powerful parallel processor
• Up to two orders of magnitude improvement possible
• Much more effective for large computations
• Many improvements possible
• A CUDA-only version is needed