Why does my Simulink simulation run slowly when it includes Deep Learning blocks?

When using Deep Learning blocks in Simulink, the execution time for the simulation is very slow (nearly 5 minutes); however, in the past, the simulation ran in less than 30 seconds.
What is the cause of this performance difference and how can I resolve it?

 Accepted Answer

This is likely because Simulink ran in interpreted execution mode rather than generating a MEX file for the MATLAB Function block inside the Deep Learning block.
The requirements for code generation of Deep Learning blocks during Simulink simulations are:
1. Install a supported compiler (for example, Microsoft Visual Studio on Windows).
2. Install the add-on MATLAB Coder Interface for Deep Learning Libraries.
3. Ensure that all the layers of the deep neural network in your model support code generation for the MKL-DNN target library. You can verify this with the "analyzeNetworkForCodegen" function.
4. Set the simulation target language to C++.
If any of the above conditions is not met, the simulation falls back to interpreted execution mode, and the MATLAB engine is called at every time step, resulting in much slower simulation speeds.
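A minimal sketch of checking requirements 3 and 4 from the MATLAB command line. The variable name "net" (a trained network loaded into the workspace) and the model name "myModel" are hypothetical placeholders for your own network and model:

```matlab
% Requirement 3: check which layers of the network support code generation
% for the MKL-DNN target library (prints a per-layer support summary)
analyzeNetworkForCodegen(net, TargetLibrary="mkldnn");

% Requirement 4: set the simulation target language of the model to C++
% (Configuration Parameters > Simulation Target > Language)
set_param("myModel", "SimTargetLang", "C++");
```

If "analyzeNetworkForCodegen" reports unsupported layers, the model cannot generate MEX code for the Deep Learning block and will keep running in interpreted mode.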
More information on improving performance, along with an example, can be found in the MATLAB documentation.
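To verify requirement 1 (a supported compiler), you can ask MATLAB which C++ compiler it is configured to use; this command lists the installed compilers and lets you select one:

```matlab
% Show and select the C++ compiler MATLAB will use for MEX generation
mex -setup C++
```

If no supported compiler is listed, install one (such as Microsoft Visual Studio on Windows) and rerun the command.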

Release

R2023a
