How to "prune" a one hidden layer NN such that the off diagonal elements of the weights, that is, "net.IW{1}" and "net.LW{2}", are zeros?

How can I create a single-hidden-layer NN such that the off-diagonal elements of the weight matrices, that is, "net.IW{1}" and "net.LW{2}", are zero?
Please see my partial script below, with the question embedded in it. Can someone help me with this toy example?
%% Create a dataset with a network
[x,t] = crab_dataset; %load the dataset
x = x(1:2,:); % keep only the first two input features
net = patternnet(2); %create the network
net = configure(net,x,t);
view(net);
w1 = net.IW{1}; %the input-to-hidden layer weights
w2 = net.LW{2}; %the hidden-to-output layer weights
b1 = net.b{1}; %the input-to-hidden layer bias
b2 = net.b{2}; %the hidden-to-output layer bias
%% QUESTION (or INTENTION):
% How to set up the net such that the off-diagonal elements of w1 (net.IW{1}) and w2 (net.LW{2}) are ZERO after training the network?
%% Then, I would like to train the network defined this way:
% [net,tr] = train(net,x,t); %train the network

Answers (2)

Srivardhan Gadila on 29 May 2020
As far as I know, with shallow neural networks you cannot freeze the off-diagonal weights and let only the diagonal weights update, because the property net.layerWeights{i,j}.learn applies to the entire connection between layers i and j.
I would suggest using deep neural networks instead of shallow neural networks and defining a custom deep learning layer to achieve this. Refer to Define Custom Deep Learning Layers and Define Custom Deep Learning Layer with Learnable Parameters.
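For illustration, here is a minimal sketch of such a custom layer; the class name diagonalLayer, the initialization, and the way it would be combined with other layers are my own assumptions, not tested code. The idea is a "fully connected" layer whose weight matrix is forced to be diagonal by storing only the diagonal entries as learnable parameters:
classdef diagonalLayer < nnet.layer.Layer
    % Hypothetical custom layer: a fully connected layer whose weight
    % matrix is constrained to be diagonal by learning only its diagonal.
    properties (Learnable)
        Weights   % diagonal entries, numUnits-by-1
        Bias      % numUnits-by-1
    end
    methods
        function layer = diagonalLayer(numUnits, name)
            layer.Name = name;
            layer.Weights = 0.01*randn(numUnits,1);
            layer.Bias = zeros(numUnits,1);
        end
        function Z = predict(layer, X)
            % X is numUnits-by-batchSize. Elementwise scaling by the
            % learnable vector is equivalent to multiplying by diag(Weights).
            Z = layer.Weights .* X + layer.Bias;
        end
    end
end
For the toy example above (two inputs, two classes), a layer array could then use diagonalLayer(2,'d1') and diagonalLayer(2,'d2') in place of the two fully connected layers.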
Another suggestion, staying with the shallow-network approach (may or may not be useful):
Set net.trainParam.epochs to 1, place [net,tr] = train(net,x,t); in a for loop over the total number of epochs, and after each epoch set the off-diagonal weights to zero, as sketched below.
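A minimal sketch of that loop, assuming the net, x, and t from the question above (so net.IW{1} and net.LW{2,1} are both 2-by-2 and have a diagonal to keep); the epoch count of 100 is an arbitrary choice:
net.trainParam.epochs = 1;            % one epoch per call to train
net.trainParam.showWindow = false;    % optional: suppress the training GUI
numEpochs = 100;                      % arbitrary total number of epochs
for k = 1:numEpochs
    [net,tr] = train(net,x,t);              % continue training from the current weights
    net.IW{1}   = diag(diag(net.IW{1}));    % zero the off-diagonal input weights
    net.LW{2,1} = diag(diag(net.LW{2,1}));  % zero the off-diagonal layer weights (net.LW{2} in the question)
end
Note that the weights will only be exactly diagonal after the final zeroing step, since each call to train can reintroduce nonzero off-diagonal entries.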
  1 Comment
Shashi Kant on 29 May 2020
Edited: Shashi Kant on 29 May 2020
@Srivardhan Gadila: Thank you for your comment. Would it be possible for you to create a simple example with the DNN toolbox? I would greatly appreciate it.



Abdelwahab Afifi on 12 Apr 2021
https://uk.mathworks.com/help/deeplearning/ref/prune.html
