learnk
(To be removed) Kohonen weight learning function
learnk will be removed in a future release. For more information,
see Transition Legacy Neural Network Code to dlnetwork Workflows.
For advice on updating your code, see Version History.
Syntax
[dW,LS] = learnk(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)
info = learnk('code')
Description
learnk is the Kohonen weight learning function.
[dW,LS] = learnk(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several
inputs,
W | S-by-R weight matrix (or S-by-1 bias vector)
P | R-by-Q input vectors (or ones(1,Q))
Z | S-by-Q weighted input vectors
N | S-by-Q net input vectors
A | S-by-Q output vectors
T | S-by-Q layer target vectors
E | S-by-Q layer error vectors
gW | S-by-R gradient with respect to performance
gA | S-by-Q output gradient with respect to performance
D | S-by-S neuron distances
LP | Learning parameters, none, LP = []
LS | Learning state, initially should be = []
and returns
dW | S-by-R weight (or bias) change matrix
LS | New learning state
Learning occurs according to learnk’s learning parameter, shown
here with its default value.
LP.lr | Learning rate (default 0.01)
info = learnk('code') returns useful
information for each code character vector:
'pnames' | Names of learning parameters
'pdefaults' | Default learning parameters
'needg' | Returns 1 if this function uses gW or gA
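As a quick illustration, you can query this metadata directly (a minimal sketch assuming the legacy toolbox is installed; the exact returned values depend on the release):

```matlab
% Query learnk's metadata strings (legacy Neural Network Toolbox).
learnk('pnames')      % names of the learning parameters
learnk('pdefaults')   % struct of default learning parameters (field lr)
learnk('needg')       % 1 if learnk uses the gradients gW or gA, 0 otherwise
```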
Examples
Here you define a random input P, output A, and
weight matrix W for a layer with a two-element input and three
neurons. Also define the learning rate LR.
p = rand(2,1); a = rand(3,1); w = rand(3,2); lp.lr = 0.5;
Because learnk needs only these values to calculate a weight change
(see "Algorithms" below), use them to do so.
dW = learnk(w,p,[],[],a,[],[],[],[],[],lp,[])
Network Use
To prepare the weights of layer i of a custom network to learn with
learnk:
1. Set net.trainFcn to 'trainr'. (net.trainParam automatically becomes trainr's default parameters.)
2. Set net.adaptFcn to 'trains'. (net.adaptParam automatically becomes trains's default parameters.)
3. Set each net.inputWeights{i,j}.learnFcn to 'learnk'.
4. Set each net.layerWeights{i,j}.learnFcn to 'learnk'. (Each weight learning parameter property is automatically set to learnk's default parameters.)
To train the network (or enable it to adapt):
1. Set net.trainParam (or net.adaptParam) properties as desired.
2. Call train (or adapt).
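Combined, these steps can be sketched as follows for a hypothetical two-layer network net; the weight indices and training settings here are illustrative, not part of the original page:

```matlab
% Legacy workflow sketch: prepare a custom network to learn with learnk.
net.trainFcn = 'trainr';        % net.trainParam becomes trainr's defaults
net.adaptFcn = 'trains';        % net.adaptParam becomes trains's defaults
net.inputWeights{1,1}.learnFcn = 'learnk';
net.layerWeights{2,1}.learnFcn = 'learnk';

net.trainParam.epochs = 50;     % adjust training properties as desired
net = train(net, P, T);         % or: net = adapt(net, P, T)
```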
Algorithms
learnk calculates the weight change dW for a
given neuron from the neuron’s input P, output A,
and learning rate LR according to the Kohonen learning rule:
dw = lr*(p'-w), if a ~= 0
   = 0,          otherwise
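As a hand check of the rule (with illustrative numbers, not from the original page), an active neuron's weight row moves toward the input:

```matlab
% One neuron's weight row moves a fraction lr of the way toward p'.
lr = 0.5;
p  = [1; 0];           % input vector (R = 2)
w  = [0.2 0.4];        % weight row for this neuron (1-by-R)
a  = 1;                % nonzero output: the neuron is active
dw = lr*(p' - w)       % [0.4 -0.2]; if a were 0, dw would be [0 0]
```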
References
Kohonen, T. Self-Organization and Associative Memory. New York: Springer-Verlag, 1984.
Version History
Introduced before R2006a
See Also
Time Series Modeler | fitrnet (Statistics and Machine Learning Toolbox) | fitcnet (Statistics and Machine Learning Toolbox) | trainnet | trainingOptions | dlnetwork