Jacobian matrix of neural network
What is inside the Jacobian matrix? I know that for a trained network the number of columns of the Jacobian equals the number of data points 1, 2, ..., n. What do the rows correspond to?
Accepted Answer
Cam Salzberger
on 29 Feb 2016
Hello Rita,
The number of rows in the Jacobian output by "defaultderiv" is the sum of the number of weights and biases for the network. For example, if you do this to create the network:
[x,t] = simplefit_dataset;              % example data set (94 samples)
net = feedforwardnet(10);               % one hidden layer of 10 neurons
net = train(net,x,t);                   % train the network
y = net(x);                             % network outputs
perf = perform(net,t,y);                % performance (error) measure
dwb = defaultderiv('de_dwb',net,x,t);   % Jacobian of errors w.r.t. weights and biases
Now "dwb" is the Jacobian of errors with respect to the net's weights and biases. It is a 31x94 matrix. If you check out the following properties in the network:
net.IW % Input weight matrices
net.LW % Layer weight matrices
net.b % Bias vectors
you can see that "net.IW" contains a 10x1 matrix, "net.LW" contains a 1x10 matrix, and "net.b" contains a 10-element vector and a 1-element vector. The number of elements adds up to 31.
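As a quick cross-check (a sketch, assuming the network and "dwb" from the example above are still in the workspace), "getwb" flattens all of the weights and biases into one vector, and its length matches the number of rows of the Jacobian:
wb = getwb(net);   % all weights and biases flattened into one column vector
numel(wb)          % 31, the number of rows of dwb
size(dwb)          % 31x94: one row per weight/bias, one column per data sample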
I hope this helps clarify the Jacobian.
-Cam
1 Comment
MAHSA YOUSEFI
on 5 Feb 2022
Hi Cam.
I am following the answer to this question regarding the Hessian in a deep network, and I have now seen this answer. However, I would like to ask about a different way of computing the Hessian, if there is one.
I am using a custom training loop for my simple model, in which the gradients are computed by dlgradient. As you know, dlgradient (through dlfeval) returns a table in which the layers, parameters (weights and biases), and gradient values are stored. We also know that dlgradient accepts the loss as a scalar, along with dlnet.Learnables, the data samples dlX, and the targets dlY, for these computations. I am interested in computing the Hessian for a small network using dlX and dlY. In fact, I am going to compute a sub-sampled Hessian if I use a mini-batch dlX (so I do not have a problem storing this matrix). May I ask how this would be possible? (I also posted this question on the Community, titled "Computing Hessian by dllgradient".) Thanks...
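For what it is worth, here is a minimal sketch (not an answer from this thread, and not tied to a dlnetwork or its Learnables table) of how nested dlgradient calls can produce second derivatives. The function name hessianFcn and the tiny least-squares model are made up for illustration; 'EnableHigherDerivatives' keeps the first gradient traced so it can be differentiated again:
function H = hessianFcn(w, X, T)
    % Tiny linear least-squares loss: 0.5*||X*w - T||^2 (illustrative model)
    loss = 0.5*sum((X*w - T).^2);
    % First derivative, kept traced so it can be differentiated a second time
    g = dlgradient(loss, w, 'EnableHigherDerivatives', true);
    n = numel(w);
    H = zeros(n, n);
    for k = 1:n
        % Differentiate the k-th gradient component to get the k-th Hessian column
        H(:,k) = extractdata(dlgradient(g(k), w));
    end
end
% Usage: dlgradient must run inside dlfeval, and w must be a dlarray
% w0 = dlarray(randn(3,1)); X = randn(10,3); T = randn(10,1);
% H = dlfeval(@hessianFcn, w0, X, T);
For a real network the same idea applies parameter block by parameter block, but the full Hessian grows quadratically in the number of parameters, so a sub-sampled or Hessian-vector-product approach is usually more practical.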
More Answers (2)
Greg Heath
on 27 Feb 2016
The number of input variables
Hope this helps.
Thank you for formally accepting my answer
Greg
Monsij Biswal
on 19 Jun 2019
In which order are the derivatives present? I am unable to figure out the exact order column-wise. Is it layer-wise, starting from the first layer, with weights followed by biases for each layer, or something else?
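One way to inspect the ordering yourself, assuming the Jacobian rows follow the same flattened ordering as "getwb"/"setwb" (an assumption I have not verified against "defaultderiv"), is to push index values through "separatewb" and see where each bias and weight lands:
wb = getwb(net);                                % flattened weights and biases of the trained net
[b, IW, LW] = separatewb(net, (1:numel(wb))');  % map each position back onto the network
b{1}        % positions of the hidden layer biases in the flattened vector
b{2}        % position of the output bias
IW{1,1}     % positions of the input weights
LW{2,1}     % positions of the layer weights from layer 1 to layer 2
Comparing these positions against the rows of dwb should show whether biases or weights come first and whether the ordering is layer-wise.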