
How can I write my own piecewise defined custom activation function that takes dlarray as input?

I'm trying to write my own activation function to prevent my gradients from vanishing.
However, I struggle to make the implementation suitable for dlarrays.
With a "normal" array my code looks like the following, and it works just fine:
f = tanh(x);
f(abs(tanh(x)) >= 0.99) = x(abs(tanh(x)) >= 0.99) * (0.99 / atanh(0.99));
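For reference, the same piecewise rule can be written as a single masked expression with no subscripted assignment, which is the usual workaround when logical-index writes are not supported on a traced array type such as dlarray (a sketch, not a tested fix):

```matlab
% Same piecewise rule, written without subscripted assignment.
% mask is 1 where the linear branch applies and 0 elsewhere, so each
% element picks exactly one of the two branches.
t = tanh(x);
mask = abs(t) >= 0.99;
f = (1 - mask) .* t + mask .* (0.99 / atanh(0.99)) .* x;
```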
When I use this function in the predict function, however, I get a lot of "reshaping" errors when trying to run the NN.
I'm grateful for any help provided!

Answers (1)

Kausthub
Kausthub on 7 Sep 2023
Hi Carina Haschke,
I understand that you are facing problems while creating a piecewise-defined custom activation function that takes a dlarray as input. Since the errors you are getting mention reshaping, please cross-check the input and output sizes of your layers; converting an array to a dlarray should not by itself introduce new errors.
I have attached the requested custom activation function along with a sample neural network to test the layer. The example works for a normal array as well as a dlarray. Please refer to the snippets for more details and adapt them to your needs.
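Since the attachment is not reproduced here, a minimal sketch of such a function is shown below. It assumes the masked, elementwise form of the piecewise rule (no subscripted assignment), and the function name `scaledTanh` is illustrative, not from the attachment:

```matlab
function f = scaledTanh(x)
% Piecewise activation: tanh in the central region, a scaled linear
% branch where |tanh(x)| >= 0.99, intended to keep gradients from
% vanishing in the saturated region.
% Uses only elementwise operations, so it also accepts a dlarray input.
    t = tanh(x);
    mask = abs(t) >= 0.99;           % 1 where the linear branch applies
    f = (1 - mask) .* t + mask .* (0.99 / atanh(0.99)) .* x;
end
```

One way to use this in a layer array, assuming a recent Deep Learning Toolbox release that provides functionLayer, would be `functionLayer(@scaledTanh)` in place of a built-in activation layer.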
You could also refer to these articles for more information:
Hope this helps and clarifies your queries regarding creating custom activation functions with a dlarray as input!
