Matlab 2018b GPU Training
Eduardo Gaona Peña on 6 Nov 2018
Edited: Eduardo Gaona Peña on 8 Nov 2018
Until now I have been training an LSTM network in MATLAB 2018a and had no problems using my GPU as the training device. However, since I needed to change the activation functions of my LSTM layers, I updated MATLAB, and now when I try to use my GPU it trains the network far slower than the CPU, which doesn't make sense. For some reason MATLAB is not using my GPU's memory. Any ideas how to solve this?
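One quick way to check the claim about GPU memory is to query the device from MATLAB before and during training; the lines below are only an illustrative sketch of that check:

    g = gpuDevice;                              % currently selected GPU
    fprintf('GPU: %s\n', g.Name);
    fprintf('Memory in use: %.2f GB of %.2f GB\n', ...
        (g.TotalMemory - g.FreeMemory)/1e9, g.TotalMemory/1e9);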
Accepted Answer
Joss Knight on 7 Nov 2018
Edited: Joss Knight on 7 Nov 2018
      Do you mean you switched to using hard-sigmoid or softsign activations? This is supported in 18b, but is a non-optimized version since it isn't supported by cuDNN, and is indeed much slower. I would recommend using the default activations for performance, if you can make it work.
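As a rough sketch of the difference (layer sizes and option values here are placeholders, and the name-value options shown are assumed to be the R2018b ones that expose these activations), an lstmLayer with non-default activations falls off the cuDNN-accelerated path, while the default tanh/sigmoid layer stays on it; GPU training is selected the same way in both cases:

    % Non-default activations: supported in R2018b, but not accelerated by cuDNN
    slowLstm = lstmLayer(100, 'OutputMode', 'last', ...
        'StateActivationFunction', 'softsign', ...
        'GateActivationFunction', 'hard-sigmoid');

    % Default activations (tanh / sigmoid): uses the optimized cuDNN kernels
    fastLstm = lstmLayer(100, 'OutputMode', 'last');

    % GPU execution is requested the same way for either layer
    options = trainingOptions('adam', ...
        'ExecutionEnvironment', 'gpu', ...
        'MaxEpochs', 30);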