Problem using inverse tansig in place of tansig in a neural network

To create a custom activation function, i.e. the inverse of tansig, I have created a folder with +myfun and myfun.m, and changed the apply.m file to use the inverse-tansig equation a = -(log(1 ./ (n./2 + 1./2) - 1)./2). When I train my network it stops at 0 iterations, and R is shown as a complex value. The same setup works fine with tansig.
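A quick way to see what is going wrong (a sketch, not from the original post): the posted expression is algebraically equal to atanh(n), which is only real-valued for -1 < n < 1. Any network input outside that interval makes log() take a negative argument, so the output (and hence R) goes complex:

```matlab
% Inside (-1,1) the posted formula matches atanh:
n  = [-0.5 0 0.5];
a  = -(log(1 ./ (n./2 + 1./2) - 1)./2);   % same values as atanh(n)

% Outside (-1,1) the log argument is negative, so the result is complex:
n2 = 1.5;
a2 = -(log(1 ./ (n2./2 + 1./2) - 1)./2);  % log(-0.2)/(-2) -> complex
```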
Regards, Swathi

 Accepted Answer

I am confused
z = tansig(x) = tanh(x) = ( exp(x)-exp(-x) ) / ( exp(x) + exp(-x) )
x = atanh(z) = 0.5 * log( (1 + z) / ( 1- z) ) , abs(z) < 1
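The two formulas above can be checked numerically (tansig requires the Neural Network Toolbox; tanh gives the same values without it):

```matlab
x  = 0.7;
z  = tanh(x);                    % equals tansig(0.7)
xr = 0.5 * log((1 + z)/(1 - z)); % recovers x; equals atanh(z)
% xr - x is ~0 to machine precision, but only because abs(z) < 1
```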
What is R?

5 Comments

I am trying to implement the inverse delayed neuron model, for which I need the inverse tansig transfer function. My expression for x is the same as the equation you mentioned. By R I meant the regression coefficient. We have changed the active input range to [-1 1]; now training runs for only 2 iterations and R is very low. Please let me know if I am going wrong somewhere.
That is a very weird choice for a transfer function. I see no advantage in it. In particular, it looks like having inputs --> +/- 1 would result in disaster.
Do you mean atan? That makes more sense.
I don't see how our expressions are equal.
OK, I see that the expressions are equal.
However, it is inappropriate for a transfer function.
On the other hand, atan should work.
Greg
In the +tansig folder there are other m-files like activeInputRange.m, outputRange.m, discontinuity.m, forwardprop.m, etc. To use atanh as the activation function I have changed only apply.m. Please let me know whether any changes need to be made in the other m-files.
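A hedged sketch of why apply.m alone is not enough: the package files have to stay mutually consistent, so the derivative file must return d(atanh)/dn = 1/(1 - n.^2) and the range files must exclude ±1, where atanh diverges. The file names and signatures below follow the +tansig package layout mentioned above, but they vary by toolbox version, so treat this as an outline rather than a drop-in implementation:

```matlab
% Hypothetical contents of the +myfun package (names modeled on +tansig):

% +myfun/apply.m -- forward computation
function a = apply(n, param)
  a = atanh(n);               % real-valued only for -1 < n < 1
end

% +myfun/da_dn.m -- derivative used by forwardprop/backprop
function d = da_dn(n, a, param)
  d = 1 ./ (1 - n.^2);        % d/dn atanh(n); blows up at n = +/-1
end

% +myfun/activeInputRange.m -- range the toolbox maps inputs into
function r = activeInputRange(param)
  r = [-0.999 0.999];         % must stay strictly inside (-1,1)
end
```

If the derivative file still contains the tansig derivative (1 - a.^2) while apply.m computes atanh, the gradient is wrong and training can stall after a couple of iterations, which matches the behavior described above.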



Asked: 1 May 2014
Commented: 3 May 2014
