C-RNN dual output regression
Hi. I am writing C-RNN regression training code with a single matrix input and two scalar outputs. The file "paddedData2.mat" contains the variable paddedData, an N-by-3 cell array, as shown in the attached image. The training input is the 3rd column of paddedData, where each entry is a [440 5] double; the regression targets are the values in the 1st and 2nd columns. From each input I plan to create features of size [436 1] using two [3 3] convolution kernels and train them with an LSTM. The code is below, but it fails with this error:
Error using trainnet (line 46)
Error forming mini-batch of targets for network output "fc_1". Data interpreted with format "BC".
To specify a different format, use the TargetDataFormats option.
How can I modify the code?
clc;
clear all;
load("paddedData2.mat","-mat")
XTrain = paddedData(:,3);
YTrain1 = cell2mat(paddedData(:,1));
YTrain2 = cell2mat(paddedData(:,2));
dsX = arrayDatastore(XTrain, 'OutputType', 'same');
dsY1 = arrayDatastore(YTrain1, 'OutputType', 'same');
dsY2 = arrayDatastore(YTrain2, 'OutputType', 'same');
net = dlnetwork;
tempNet = [
    sequenceInputLayer([440 5 1],"Name","sequenceinput")
    convolution2dLayer([3 3],8,"Name","conv_A1")
    batchNormalizationLayer("Name","batchnorm_A1")
    reluLayer("Name","relu_A1")
    convolution2dLayer([3 3],8,"Name","conv_2")
    batchNormalizationLayer("Name","batchnorm_2")
    reluLayer("Name","relu_2")
    flattenLayer("Name","flatten")
    fullyConnectedLayer(100,"Name","fc")
    lstmLayer(100,"Name","lstm","OutputMode","last")];
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_1");
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_2");
net = addLayers(net,tempNet);
clear tempNet;
net = connectLayers(net,"lstm","fc_1");
net = connectLayers(net,"lstm","fc_2");
net = initialize(net);
options = trainingOptions('adam', ...
    'MaxEpochs', 2000, ...
    'MiniBatchSize', 100, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'training-progress');
lossFcn = @(Y1,Y2,dsY1,dsY2) crossentropy(Y1,dsY1) + 0.1*mse(Y2,dsY2);
net = trainnet(dsX, net, lossFcn, options);
Accepted Answer
Angelo Yeo
on 20 May 2024
The full error message is as follows.
Error using trainnet (line 46)
Error forming mini-batch of targets for network output "fc_1". Data interpreted with format "BC".
To specify a different format, use the TargetDataFormats option.
Error in main (line 52)
net = trainnet(dsX, net, lossFcn, options);
Caused by:
Index exceeds the number of array elements. Index must not exceed 1.
If you want a neural network with multiple inputs or outputs, you can use combine to build a CombinedDatastore that pairs each input observation with its targets. This resolves the mini-batch error.
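As a quick sanity check (a sketch, assuming the three datastores are built as in the code below), preview shows that each read of the combined datastore yields one observation with the input followed by both targets, matching the network's one input layer and two output layers:

```matlab
% Each read of the combined datastore returns one observation as a row:
% {input, target-for-fc_1, target-for-fc_2}.
dsTrain = combine(dsX, dsY1, dsY2);
preview(dsTrain)
```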
clc; clear;
% load("paddedData2.mat","-mat")
paddedData = cell(10, 3); % Intentionally changed the #observations from 900 to 10 for reproduction purpose
paddedData(:,1) = cellfun(@(x) 405, paddedData(:,1), 'UniformOutput', false);
paddedData(:,2) = cellfun(@(x) 10, paddedData(:,2), 'UniformOutput', false);
paddedData(:,3) = cellfun(@(x) rand(440, 5), paddedData(:,3), 'UniformOutput', false);
XTrain = paddedData(:,3);
YTrain1 = cell2mat(paddedData(:,1));
YTrain2 = cell2mat(paddedData(:,2));
dsX = arrayDatastore(XTrain, 'OutputType', 'same');
dsY1 = arrayDatastore(YTrain1, 'OutputType', 'same');
dsY2 = arrayDatastore(YTrain2, 'OutputType', 'same');
dsTrain = combine(dsX, dsY1, dsY2);
net = dlnetwork;
tempNet = [
    sequenceInputLayer([440 5 1],"Name","sequenceinput")
    convolution2dLayer([3 3],8,"Name","conv_A1")
    batchNormalizationLayer("Name","batchnorm_A1")
    reluLayer("Name","relu_A1")
    convolution2dLayer([3 3],8,"Name","conv_2")
    batchNormalizationLayer("Name","batchnorm_2")
    reluLayer("Name","relu_2")
    flattenLayer("Name","flatten")
    fullyConnectedLayer(100,"Name","fc")
    lstmLayer(100,"Name","lstm","OutputMode","last")];
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_1");
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_2");
net = addLayers(net,tempNet);
clear tempNet;
net = connectLayers(net,"lstm","fc_1");
net = connectLayers(net,"lstm","fc_2");
net = initialize(net);
options = trainingOptions('adam', ...
    'MaxEpochs', 50, ...
    'MiniBatchSize', 100, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'none');
% Both targets here are continuous values, so a regression loss (mse) fits
% each output; crossentropy is intended for classification targets.
lossFcn = @(Y1,Y2,T1,T2) mse(Y1,T1) + 0.1*mse(Y2,T2);
net = trainnet(dsTrain, net, lossFcn, options);
You can refer to the following example page for a detailed explanation of multiple-input/multiple-output networks.
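Once training finishes, inference for both outputs can be sketched like this (an assumption-laden example: minibatchpredict requires R2024a or later, and XNew is a hypothetical cell array of new inputs shaped like XTrain):

```matlab
% Hypothetical example: XNew is assumed to be a cell array of [440 5]
% double matrices, one per observation (same layout as XTrain).
dsNew = arrayDatastore(XNew, 'OutputType', 'same');

% minibatchpredict returns one output array per network output layer,
% here the predictions from fc_1 and fc_2.
[Y1Pred, Y2Pred] = minibatchpredict(net, dsNew);
```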