Custom Performance Function for Shallow Neural Networks Using the +MSE Package

I would like to create a custom performance function for shallow Neural Networks based on the +MSE package in MATLAB. How can I implement these functions? 

Accepted Answer

MathWorks Support Team on 9 Nov 2022
Edited: MathWorks Support Team on 9 Nov 2022
You can calculate neural network performance using the 'mse', 'mae', 'sse', 'sae' and 'crossentropy' functions. Each of these functions returns a particular error metric. For example, 'mse' returns the mean squared error. To use an error metric that these functions do not provide, you must create a custom performance function.
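For reference, each of these metrics is computed from the raw errors e = t - y. The hedged sketch below assumes you already have inputs x, targets t, and a trained shallow network net; perform evaluates whichever function is set in net.performFcn (by default, 'mse' for a fitting network).
y = net(x);                     % network outputs
e = t - y;                      % errors
perfBuiltIn = perform(net,t,y)  % uses net.performFcn, for example 'mse'
perfManual = mean(e(:).^2)      % mean squared error computed directly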
The easiest way to create a custom performance function is to use the +MSE package as a template. These steps describe how to create a performance function that calculates the Huber loss and how to use that performance function to train a network.
Note:
For deep learning tasks, if Deep Learning Toolbox™ does not provide the layers you need (including output layers that specify loss functions), you can create a custom layer instead. To learn more, see the Deep Learning Toolbox documentation on defining custom deep learning layers.
1. Copy and Rename the +MSE Package
On a Windows operating system, you can find all of the performance function packages in the <matlabroot>\toolbox\nnet\nnet\nnperformance folder, where <matlabroot> is the path to the folder where MATLAB® is installed, for example, C:\Program Files\MATLAB\R2022b. Navigate to <matlabroot>\toolbox\nnet\nnet\nnperformance in Windows File Explorer. The nnperformance folder contains the 'mse' function (mse.m) and the folder +mse.
Make a copy of "mse.m" in your current folder and rename it to "huber.m". Make a copy of the folder +mse in your current folder and rename it to "+huber".
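If you prefer to copy and rename the files from the MATLAB command line (which also works on operating systems other than Windows), a sketch along these lines should do it; the destination names huber.m and +huber match the steps in this answer.
% Copy mse.m and the +mse package into the current folder under new names
srcDir = fullfile(matlabroot,'toolbox','nnet','nnet','nnperformance');
copyfile(fullfile(srcDir,'mse.m'), fullfile(pwd,'huber.m'));  % mse.m -> huber.m
copyfile(fullfile(srcDir,'+mse'), fullfile(pwd,'+huber'));    % +mse  -> +huber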
2. Edit the File +huber\apply.m
The file +huber\apply.m contains the main performance calculation function. Edit the function to calculate the Huber loss: for each error e, the loss is 0.5*e^2 when |e| < δ and δ*|e| - 0.5*δ^2 otherwise. Here, the transition point δ = 1 is used.
function perfs = apply(t,y,e,param)
% Elementwise Huber loss of the errors e = t - y
delta = 1;                                  % transition point between quadratic and linear regions
absoluteError = abs(e);
perfs = delta*absoluteError - 0.5*delta^2;  % linear region: |e| >= delta
perfs(absoluteError < delta) = 0.5*(e(absoluteError < delta).^2);  % quadratic region: |e| < delta
end
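As a quick sanity check (assuming the +huber folder is in your current folder or elsewhere on the path), you can call the package function directly on a few sample errors; this implementation uses only the e input, so the other arguments can be empty.
% Check +huber\apply.m on sample errors (t, y, and param are unused here)
e = [-2 -0.5 0 0.5 2];
perfs = huber.apply([],[],e,[])
% Expected with delta = 1: [1.5 0.125 0 0.125 1.5]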
3. Edit the File +huber\backprop.m
The file +huber\backprop.m defines a function that returns the backpropagation derivatives, that is, the derivatives of performance with respect to each output 'y'. Edit the function to calculate the backpropagation derivatives of the Huber loss.
function dy = backprop(t,y,e,param)
% Derivative of the Huber loss with respect to each output y (note e = t - y)
delta = 1;
absoluteError = abs(e);
dy = delta*sign(e);                                    % linear region: d(perf)/de = delta*sign(e)
dy(absoluteError < delta) = e(absoluteError < delta);  % quadratic region: d(perf)/de = e
dy = -dy;                                              % chain rule: de/dy = -1
end
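To verify the derivatives, you can compare backprop against a central finite-difference approximation of apply; the sketch below uses a few arbitrarily chosen sample values and assumes the +huber package from the previous steps is on the path.
% Finite-difference check of +huber\backprop.m
t = [1.5 0.2 -0.3];                          % sample targets
y = [0.0 0.1 -2.0];                          % sample outputs
e = t - y;                                   % errors as supplied by the toolbox
h = 1e-6;                                    % step size for central differences
dyAnalytic = huber.backprop(t,y,e,[]);
dyNumeric = (huber.apply(t,y+h,t-(y+h),[]) - huber.apply(t,y-h,t-(y-h),[]))/(2*h);
max(abs(dyAnalytic - dyNumeric))             % should be close to zero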
4. Edit the File +huber\name.m
The file +huber\name.m defines a function that returns the name of the new performance function. Edit the function to return "Huber Loss".
function name = name
name = "Huber Loss";
end
5. Train Network Using Huber Loss Performance Function
Load the simplefit sample dataset, which contains 94 input values and 94 associated target values. Create a function fitting neural network with a hidden layer size of 10.
[x,t] = simplefit_dataset;
net = fitnet(10);
The default training function 'trainlm' requires the performance function to be the mean or sum of squared errors and therefore supports only the 'mse' and 'sse' performance functions. Set the network training function to 'trainscg' and the network performance function to 'huber'.
net.trainFcn = "trainscg";
net.performFcn = "huber";
Train the network using the input values and target values.
net = train(net,x,t);
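After training, you can evaluate the network as usual; in this hedged sketch, perform applies the custom 'huber' performance function set in net.performFcn, and the plot compares targets with network outputs.
% Evaluate the trained network with the custom performance function
y = net(x);                        % network predictions
perf = perform(net,t,y)            % overall Huber performance via net.performFcn
plot(x,t,'o',x,y,'-')
legend("Targets","Network output")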
Note for R2012b Users
If you are using MATLAB R2012b, train the network using the 'nn7' option, for example, 
net = train(net,x,t,nn7);
Additional Files in the +MSE Package
The steps described in this answer can be adapted to implement a variety of performance functions. Besides apply.m, backprop.m, and name.m, the +mse package contains several other files that you can modify to suit your application.
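To see exactly which files the package contains in your installation, you can list its contents from the MATLAB command line.
% List all files in the +mse performance function package
dir(fullfile(matlabroot,'toolbox','nnet','nnet','nnperformance','+mse'))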
