Problem 58882. Neural Nets: Activation functions
Return the values of the selected activation function type for a scalar, a vector, or a matrix.
y = Activation(x, id); where id is 1:4 for ReLU, sigmoid, hyperbolic tangent, and softmax.
ReLU: Rectified Linear Unit; clips negatives, max(0,x). Trains faster than sigmoid.
Sigmoid: exponential normalization to [0,1], 1./(1+exp(-x)).
HyperTan: normalization to [-1,1], tanh(x).
Softmax: normalizes the output to sum to 1, with individual values in [0,1]. Used on the output node.
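A minimal sketch of one possible Activation implementation under these definitions (not the reference solution): the row-wise softmax convention follows the author's comment below, and the row-max subtraction is a standard numerical-stability step added here as an assumption.

function y = Activation(x, id)
% Apply the selected activation: 1=ReLU, 2=sigmoid, 3=tanh, 4=softmax.
% Elementwise for ids 1:3; softmax normalizes each row of x.
switch id
    case 1                          % ReLU: clip negatives
        y = max(0, x);
    case 2                          % Sigmoid: squash to (0,1)
        y = 1 ./ (1 + exp(-x));
    case 3                          % Hyperbolic tangent: squash to (-1,1)
        y = tanh(x);
    case 4                          % Softmax: each row sums to 1
        e = exp(x - max(x, [], 2)); % subtract row max for stability (assumption)
        y = e ./ sum(e, 2);         % implicit expansion, R2016b+
    otherwise
        error('id must be 1, 2, 3, or 4.');
end
end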
Working through a series of Neural Net challenges from Perceptron, Hidden Layers, Backpropagation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from MNIST.
It might take a day or two to completely cover Neural Nets in a MATLAB-centric fashion.
Essentially, Out = Softmax(ReLU(X*W)*WP).
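As a sketch under assumed sizes, that forward pass can be written with the Activation function above, where X is an input batch and W, WP are hypothetical weight matrices (none of these values come from the problem itself):

X  = randn(5, 4);               % 5 samples, 4 inputs (assumed sizes)
W  = randn(4, 3);               % input-to-hidden weights (hypothetical)
WP = randn(3, 2);               % hidden-to-output weights (hypothetical)
H   = Activation(X * W, 1);     % hidden layer: ReLU (id = 1)
Out = Activation(H * WP, 4);    % output layer: softmax (id = 4)
sum(Out, 2)                     % each row of Out sums to 1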
Problem Comments
Richard Zapor on 21 Aug 2023:
Multi-Case Softmax should be y=exp(x)./sum(exp(x),2)
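A quick check of this row-wise (multi-case) softmax, where each row of x is one case and each row of y sums to 1:

x = [1 2 3; 1 1 1];             % two cases, one per row
y = exp(x) ./ sum(exp(x), 2)    % row-wise softmax via implicit expansion
sum(y, 2)                       % returns [1; 1]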