Unfolding of recurrent neural nets for back-propagation training
When training a layrecnet (I believe this is the right function for a "standard" recurrent neural network), e.g. with the default trainlm method, how many time steps are unfolded for back-propagation? Can this number be specified, and if so, how?
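For context, a minimal sketch of the training setup the question refers to (assuming the Deep Learning Toolbox; the dataset and delay/neuron counts here are just illustrative choices, not part of the question):

```matlab
% Sketch: train a layer-recurrent network (layrecnet) with trainlm
% on one of MATLAB's built-in example time series.
[X, T] = simpleseries_dataset;       % toy input/target sequences

net = layrecnet(1:2, 10);            % feedback delays 1:2, 10 hidden neurons
net.trainFcn = 'trainlm';            % Levenberg-Marquardt (the default anyway)

% preparets shifts the sequences to account for the network's delays
[Xs, Ts, Xi, Ai] = preparets(net, X, T);

net = train(net, Xs, Ts, Xi, Ai);    % training step the question asks about
Y = net(Xs, Xi, Ai);
perf = perform(net, Ts, Y);
```

The question is how many time steps this call to train unfolds the recurrent connections over when computing gradients, and whether that depth can be controlled.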