Mathematical equation of regression in ANN for two hidden layers
For the following equation of regression for neural network,
y = b2 + LW*tanh(b1+IW*x)
Can this equation be used for two hidden layers, or is it valid only for one hidden layer? If it is only valid for one, how can I get the equation for two or more hidden layers?
I'm using an ANN with two hidden layers, since it gives a better MSE for my model.
Thanks.
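[Editor's note] The form above is for a single hidden layer; for two hidden layers the same pattern nests once more (assuming tansig hidden layers and a linear output, as in the one-layer equation):

y = b3 + LW2*tanh( b2 + LW1*tanh( b1 + IW*x ) )

where IW is the input-to-first-layer weight matrix, LW1 and LW2 are the layer-to-layer weight matrices, and b1, b2, b3 are the bias vectors. A NumPy sketch of that forward pass, with all names and sizes purely illustrative:

```python
import numpy as np

def two_hidden_forward(x, IW, b1, LW1, b2, LW2, b3):
    """y = b3 + LW2*tanh(b2 + LW1*tanh(b1 + IW*x))."""
    a1 = np.tanh(b1 + IW @ x)    # first hidden layer
    a2 = np.tanh(b2 + LW1 @ a1)  # second hidden layer
    return b3 + LW2 @ a2         # linear output layer

# illustrative sizes: 3 inputs, hidden layers of 5 and 4 neurons, 1 output
rng = np.random.default_rng(0)
IW  = rng.standard_normal((5, 3)); b1 = rng.standard_normal((5, 1))
LW1 = rng.standard_normal((4, 5)); b2 = rng.standard_normal((4, 1))
LW2 = rng.standard_normal((1, 4)); b3 = rng.standard_normal((1, 1))

x = rng.standard_normal((3, 1))
y = two_hidden_forward(x, IW, b1, LW1, b2, LW2, b3)
print(y.shape)  # (1, 1)
```

Each additional hidden layer just adds one more nesting of weight, bias, and activation around the previous layer's output.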
2 Comments
Greg Heath
on 16 Sep 2015
1. You can do just as well with one hidden layer. That configuration is a universal approximator.
2. [ I N ] = size(input)?
3. [ O N ] = size(target)?
4. net = fitnet([? ? ])
5. Sizes of IW, b1, b2, LW ?
Greg
jalpa shah
on 3 Apr 2017
Dear Greg,
I am trying to find the output of a trained neural network in two ways: one through the equations equivalent to the implemented network (using its weights and biases), and one by simulating the network itself. The outputs from the two methods are not the same. Why is that, and how can I make them match? Please let me know your view. Thanks
In the code below, y2 and ya are not the same, and the MSE of ya against the target is larger than the MSE of y2 against the target.
xn = mapminmax('apply', x, xsettings);    % normalize the input first (xsettings = input normalization settings from training; name assumed)
yn = B2 + LW22 * logsig(B1 + LW11 * xn);  % manual forward pass (one hidden layer, logsig)
ya = mapminmax('reverse', yn, tsettings); % un-normalize: output from the equations of the NN
y2 = net(x)                               % output from the simulated NN
mse1 = mean((t - ya).^2)
mse2 = mean((t - y2).^2)
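[Editor's note] A common reason the hand computation disagrees with net(x) is the normalization: fitnet applies mapminmax to the inputs before the first layer and reverses it after the output layer, so the manual equation must do both, using the settings saved at training time. A NumPy sketch of the full pipeline, with mapminmax re-implemented by hand and all weights, sizes, and ranges purely illustrative:

```python
import numpy as np

# mapminmax maps each row linearly onto [-1, 1]
def mapminmax_apply(x, xmin, xmax):
    return 2 * (x - xmin) / (xmax - xmin) - 1

def mapminmax_reverse(yn, ymin, ymax):
    return (yn + 1) * (ymax - ymin) / 2 + ymin

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

# illustrative weights: 2 inputs, 3 hidden neurons, 1 output
rng = np.random.default_rng(1)
LW11 = rng.standard_normal((3, 2)); B1 = rng.standard_normal((3, 1))
LW22 = rng.standard_normal((1, 3)); B2 = rng.standard_normal((1, 1))

x = rng.uniform(0, 10, size=(2, 5))                   # 5 sample columns
xmin = x.min(axis=1, keepdims=True)                   # per-row input range
xmax = x.max(axis=1, keepdims=True)                   # (stored at training time)
tmin, tmax = -2.0, 2.0                                # pretend target range

xn = mapminmax_apply(x, xmin, xmax)                   # normalize inputs FIRST
yn = B2 + LW22 @ logsig(B1 + LW11 @ xn)               # manual forward pass
ya = mapminmax_reverse(yn, tmin, tmax)                # un-normalize the output
```

If xn is replaced by the raw x, the result drifts away from the true network output, which matches the symptom described: the equation-based output agrees with the simulated network only when both normalization steps use the training-time settings.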