What is the parameter minimum performance gradient (trainParam.min_grad) of traingd?
AntonyH on 25 Sep 2020
Commented: Mohamed Elsefy on 12 Nov 2020
I use the training function "traingd" to train a shallow neural network:
trainedNet = train(net,X,T)
For the training function "traingd": how is the parameter minimum performance gradient (net.trainParam.min_grad) defined?
Since the gradient used by gradient descent is usually a vector, but net.trainParam.min_grad is a scalar value, I am confused.
Is it the change in the performance (loss) between 2 iterations, and if yes: does it refer to the training, validation or testing error?
Thanks in advance!
I use MATLAB 2013 and 2015 with the Neural Network Toolbox.
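For reference, here is a stripped-down sketch of my setup (the hidden-layer size and the min_grad and epochs values are only placeholders, not the actual values I use):
net = feedforwardnet(10, 'traingd');    % shallow network trained with basic gradient descent
net.trainParam.min_grad = 1e-5;         % the scalar parameter I am asking about
net.trainParam.epochs = 1000;           % maximum number of training epochs
trainedNet = train(net, X, T);          % X: inputs, T: targets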
Accepted Answer
Rishabh Mishra on 28 Sep 2020 (edited 28 Sep 2020)
Hi,
Based on your description of the issue, I would state a few points:
- I agree that the gradient used by gradient descent is a vector quantity and points in the direction of maximum change of the cost function.
- ‘net.trainParam.min_grad’ is a scalar (numeric) quantity. The parameter ‘min_grad’ denotes the minimum magnitude (a scalar) of that gradient (a vector) at which training of the neural network terminates.
- When the magnitude of the gradient becomes less than ‘min_grad’, the neural network model is considered optimized, and further training stops.
For better understanding, refer to the following links:
Hope this helps.
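To make the stopping criterion concrete, here is a conceptual sketch of the check that runs every epoch. Note that perfGradient is only a placeholder for the toolbox's internal gradient computation (it is not a real function), and the update shown is plain gradient descent:
lr = net.trainParam.lr;                      % gradient-descent learning rate
for epoch = 1:net.trainParam.epochs
    g = perfGradient(net, X, T);             % placeholder: gradient of the training performance w.r.t. all weights and biases (a vector)
    if norm(g) < net.trainParam.min_grad     % compare the scalar magnitude of that vector with min_grad
        break;                               % training stops with the reason "Minimum gradient reached"
    end
    net = setwb(net, getwb(net) - lr*g);     % gradient-descent update of the weight/bias vector
end
So, to my understanding, min_grad is not the change in performance between two iterations; it is a threshold on the norm of the gradient of the training performance (the validation set only affects stopping through the separate max_fail validation checks).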