Neural net visualisation
I'm able to produce neural nets that work, but I want to know what the resulting network looks like: ideally some sort of graph with an edge between each pair of connected neurons, labelled with its weight. If this does not exist, could someone give me a description of the network object output that will let me find and understand the weights myself?
Accepted Answer
Greg Heath
on 23 Nov 2011
The simplest solutions use integer weights, two hidden nodes with sign activation functions, a purelin output activation function, and polar binary targets [-1 1 1 -1].
Since there are only 4 training equations for estimating (2+1)*2+(2+1)*1 = 9 weights, solutions are far from being unique without other constraints.
You can play around with the equations and determine your own version of "simplest solutions".
These can be easily transformed into solutions for continuous activation functions by replacing sign with tansig. The transformation of purelin to tansig or logsig is also straightforward.
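One network of the kind Greg describes can be written down and checked directly. A NumPy sketch (the particular integer weights below are my own choice, not Greg's; as he notes, many assignments satisfy the constraints):

```python
import numpy as np

# An integer-weight XOR net with two sign-activation hidden nodes and a
# purelin output, on bipolar inputs/targets:
#   hidden 1 = sign(x1 + x2 - 1)   (an AND gate)
#   hidden 2 = sign(x1 + x2 + 1)   (an OR gate)
#   output   = h2 - h1 - 1          (purelin)
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])  # one row per pattern
T = np.array([-1, 1, 1, -1])                        # polar binary XOR targets

IW = np.array([[1, 1],   # hidden-layer weights (2 hidden nodes, 2 inputs)
               [1, 1]])
b1 = np.array([-1, 1])   # hidden biases
LW = np.array([-1, 1])   # output-layer weights
b2 = -1                  # output bias

H = np.sign(X @ IW.T + b1)  # sign activation in the hidden layer
y = H @ LW + b2             # linear (purelin) output
print(y)                    # -> [-1  1  1 -1]
```

Counting IW (4), b1 (2), LW (2) and b2 (1) gives exactly the 9 weights of Greg's (2+1)*2+(2+1)*1 tally, estimated from only 4 training equations.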
Good Luck.
Greg
More Answers (2)
Greg Heath
on 4 Nov 2011
You can print the weights. However, it is generally impossible to interpret the weights of a regression MLP for a medium-sized real-world (RW) problem. On the other hand, the weights of a classification RBF or EBF are more readily understood.
Moreover, once you get all the interclass distances you can use multidimensional scaling (MDS) to get a 2-D visualization. The form of MDS that I have used is called Sammon mapping. You will have to Google to find code.
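To make the MDS idea concrete, here is a sketch of *classical* MDS in NumPy as a simple stand-in for Sammon mapping (Sammon mapping refines this with an iterative, error-weighted fit; as Greg says, code for it must be found elsewhere). Given a matrix of interclass distances, it embeds the classes in 2-D so that pairwise distances are approximately preserved:

```python
import numpy as np

def classical_mds(D, dims=2):
    """Embed points in `dims` dimensions from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered squared distances
    w, V = np.linalg.eigh(B)              # eigh returns ascending eigenvalues
    idx = np.argsort(w)[::-1][:dims]      # keep the largest ones
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Toy example: 4 "classes" at known 2-D spots, recovered from distances alone
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
emb = classical_mds(D)
# emb reproduces the original configuration up to rotation/reflection
```

The recovered configuration is only unique up to rotation and reflection, which is fine for visualization.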
Hope this helps.
Greg
P.S. I have never used a weight graph. However, you may be able to find code on the internet.
John Dickson
on 4 Nov 2011
Thanks, I am looking into your suggestions. It may well be that it is not possible to understand medium-sized nets, but it would really help me to understand what's going on if you could explain a simple net. The classic XOR problem can be solved by two AND gates and an OR, i.e. (A AND ~B) OR (B AND ~A). Using the code:
input = [0 0; 0 1; 1 0; 1 1]'; % each column is an input vector
outputActual = [0 1 1 0];
net = newpr(input, outputActual, 3); % 1 hidden layer with 3 neurons
net.divideFcn = ''; % use the entire input for training
net = init(net); % initialise the weights
[net,tr] = train(net, input, outputActual); % train
outputPredicted = sim(net, input); % predict
The solution net has input weights:
net.iw{1} =
-1.6875 2.0152
-2.3719 -1.3529
1.5600 2.8228
Layer weights:
net.lw{2} =
-2.6606 3.7609 3.7860
And biases:
net.b{1} =
2.6314
1.3409
2.0613
net.b{2} = -1.2402
This is very confusing. Do you know the simplest net that will solve this problem? Is it possible to constrain the optimisation so that the simplest solution is found?
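The printed weights become less confusing when read as a forward pass. A NumPy sketch using John's numbers, assuming newpr's usual defaults (tansig activations in both layers and mapminmax pre/post-processing, which rescales the 0/1 inputs and targets to [-1, 1] — the preprocessing assumption is mine, since it is not shown in the post):

```python
import numpy as np

# John's printed weights and biases
IW = np.array([[-1.6875,  2.0152],
               [-2.3719, -1.3529],
               [ 1.5600,  2.8228]])
b1 = np.array([2.6314, 1.3409, 2.0613])
LW = np.array([-2.6606, 3.7609, 3.7860])
b2 = -1.2402

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # one row per pattern

Xn = 2 * X - 1               # mapminmax: [0, 1] -> [-1, 1]
H = np.tanh(Xn @ IW.T + b1)  # tansig hidden layer
Yn = np.tanh(H @ LW + b2)    # tansig output layer
Y = (Yn + 1) / 2             # reverse mapminmax: [-1, 1] -> [0, 1]
print(np.round(Y, 3))        # close to the XOR targets [0 1 1 0]
```

Under these assumptions the printed weights do reproduce XOR, even though no individual weight reads as an AND or OR gate — which is exactly why Greg's hand-constructed integer solution is easier to interpret.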