Mathematical equation of regression in ANN for two hidden layers

For the following regression equation for a neural network,
y = b2 + LW*tanh(b1 + IW*x)
can this equation be used for two hidden layers, or is it only valid for one hidden layer? If not, how can I get the equation for two or more hidden layers?
I'm using an ANN with two hidden layers since it gives a better MSE for my model.
Thanks.
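For reference, the single-hidden-layer equation in the question can be spelled out as a forward pass. Here is a minimal NumPy sketch of that computation; all sizes, weights, and biases below are made-up illustrations, not values from a trained network:

```python
import numpy as np

# Illustrative sizes: 3 inputs, 5 hidden units, 1 output
rng = np.random.default_rng(0)
IW = rng.standard_normal((5, 3))   # input-to-hidden weights
b1 = rng.standard_normal((5, 1))   # hidden-layer bias
LW = rng.standard_normal((1, 5))   # hidden-to-output weights
b2 = rng.standard_normal((1, 1))   # output bias

x = rng.standard_normal((3, 1))    # one (normalized) input sample

# y = b2 + LW*tanh(b1 + IW*x)  -- exactly one hidden layer
y = b2 + LW @ np.tanh(b1 + IW @ x)
```

Note that the single tanh(...) term is what ties this form to one hidden layer; a second hidden layer requires nesting another weighted tanh inside it, as the accepted answer shows.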

2 Comments

1. You can do just as well with one hidden layer. That configuration is a universal approximator.
2. [ I N ] = size(input)?
3. [ O N ] = size(target)?
4. net = fitnet([? ? ])
5. Sizes of IW, b1, b2, LW ?
Greg
Dear Greg,
I am trying to find the output of a trained neural network in two ways: one through equations equivalent to the implemented network (using its weights and biases), and one by simulating the network itself. The outputs from the two methods are not the same. Why is that, and how can I make them match? Please let me know your view. Thanks.
In the code below, y2 and ya do not come out the same, and the mse of ya against the target is larger than that of y2.
yn = B2 + LW22 * logsig( B1 + LW11 * xn )
ya = mapminmax('reverse', yn, tsettings) % output from the NN equations
y2 = net(x) % output from the simulated NN
mse1 = mse(ya, t)
mse2 = mse(y2, t)
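One possible source of the discrepancy (an assumption, since the network configuration is not shown in the post): fitnet uses tansig (tanh) hidden units by default, while the hand-written equation above uses logsig. Evaluating the same weights with the two different transfer functions gives different outputs, as this NumPy sketch with made-up weights illustrates:

```python
import numpy as np

def logsig(z):
    # MATLAB's logsig: 1 / (1 + exp(-z)), range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights standing in for B1, LW11, B2, LW22
rng = np.random.default_rng(1)
LW11 = rng.standard_normal((4, 2))  # input-to-hidden weights
B1 = rng.standard_normal((4, 1))    # hidden bias
LW22 = rng.standard_normal((1, 4))  # hidden-to-output weights
B2 = rng.standard_normal((1, 1))    # output bias
xn = rng.standard_normal((2, 1))    # one normalized input sample

# Same weights, two different hidden transfer functions
yn_logsig = B2 + LW22 @ logsig(B1 + LW11 @ xn)
yn_tanh = B2 + LW22 @ np.tanh(B1 + LW11 @ xn)
# The two results will in general differ
```

If the trained net's hidden transfer function is tansig, the manual equation must use tanh with the same weights, or the reconstructed output will not match net(x).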


 Accepted Answer

Greg Heath on 16 Sep 2015
Edited: Greg Heath on 16 Sep 2015
yn = B3+ LW2* tanh( B2+ LW1* tanh( B1+ IW* xn ))
Where xn is the normalized input obtained from MAPMINMAX, and yn is the normalized output that can be denormalized using reverse MAPMINMAX with the target (t) settings.
B1, B2 and B3 are constant matrices whose dimensions depend on the sizes of x, t and [ H1 H2 ].
Weight values are carefully determined from net.b, net.IW and net.LW.
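Spelled out as code, the two-hidden-layer equation plus the MAPMINMAX round trip looks like the following NumPy sketch. The sizes [H1 H2] and all weight values are placeholders standing in for net.IW, net.LW, and net.b of an actual trained network, and the min/max settings are illustrative:

```python
import numpy as np

def mapminmax_apply(x, xmin, xmax):
    # Map values from [xmin, xmax] to [-1, 1], as MAPMINMAX does by default
    return 2 * (x - xmin) / (xmax - xmin) - 1

def mapminmax_reverse(yn, tmin, tmax):
    # Undo the [-1, 1] mapping using the target's min/max settings
    return (yn + 1) * (tmax - tmin) / 2 + tmin

rng = np.random.default_rng(2)
I, H1, H2, O = 3, 6, 4, 1            # input, hidden, output sizes (illustrative)
IW  = rng.standard_normal((H1, I))   # stands in for net.IW{1,1}
B1  = rng.standard_normal((H1, 1))   # net.b{1}
LW1 = rng.standard_normal((H2, H1))  # net.LW{2,1}
B2  = rng.standard_normal((H2, 1))   # net.b{2}
LW2 = rng.standard_normal((O, H2))   # net.LW{3,2}
B3  = rng.standard_normal((O, 1))    # net.b{3}

x = rng.uniform(0.0, 10.0, size=(I, 1))
xn = mapminmax_apply(x, 0.0, 10.0)   # normalized input

# yn = B3 + LW2*tanh(B2 + LW1*tanh(B1 + IW*xn))  -- two nested tanh layers
yn = B3 + LW2 @ np.tanh(B2 + LW1 @ np.tanh(B1 + IW @ xn))
y = mapminmax_reverse(yn, -5.0, 5.0)  # denormalize with the target settings
```

Each additional hidden layer adds one more nested tanh(B + LW*...) term, so the pattern extends directly to three or more hidden layers.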
Hope this helps.
Thanks for formally accepting my answer
Greg

3 Comments

Thank you for the answer!
Is there any source or citation for the equations? (I'm writing a university research paper.)
I doubt it. Just quote my post. (Seriously).
However, making the associations with the outputs of net.b, net.IW and net.LW is not easy.
If you figure it out, please post.
I will do the same.
Greg
This is getting interesting because I am using hidden layers of different sizes. The familiar conversion
LW = cell2mat(net.LW)
is not working because the cell components are matrices of different sizes.
More later.
Greg
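One way around the concatenation problem (a sketch of the idea, not Greg's eventual solution): keep each layer's weight matrix as a separate array, in the order net.IW{1,1}, net.LW{2,1}, net.LW{3,2}, and loop over them, instead of merging differently-sized cells into one matrix. In NumPy terms, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(3)
sizes = [3, 7, 4, 1]  # input, H1, H2, output -- hidden layers differ in size

# One (weights, bias) pair per layer; the shapes differ, so there is
# no single big weight matrix -- a list plays the role of the cell array
layers = [(rng.standard_normal((m, n)), rng.standard_normal((m, 1)))
          for n, m in zip(sizes[:-1], sizes[1:])]

def forward(xn, layers):
    a = xn
    for i, (W, b) in enumerate(layers):
        z = b + W @ a
        # tanh on hidden layers, linear on the output layer
        a = np.tanh(z) if i < len(layers) - 1 else z
    return a

yn = forward(rng.standard_normal((3, 1)), layers)
```

The loop reproduces the nested equation yn = B3 + LW2*tanh(B2 + LW1*tanh(B1 + IW*xn)) layer by layer, and works for any number of hidden layers of any sizes.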


More Answers (0)


Asked by MHA on 15 Sep 2015
Last commented on 3 Apr 2017
