Why is the second prediction not constant when training a network?

Dear all,
I am trying to find the best match between images using a transfer-learning convolutional neural network. I have 10 different classes of images, so I created 10 folders and put each class inside its own folder:
Folder-1 (360 images of type 1)
Folder-2 (360 images of type 2)
Folder-3 (360 images of type 3)
...
...
Folder-10 (360 images of type 10)
STEP 1:
I used AlexNet and retrained the network on my image data set, obtaining the trained network from this step:
net = alexnet;                          % load the pretrained AlexNet (setup lines implied by the description above)
layers = net.Layers;                    % copy its layers for transfer learning
layers(23) = fullyConnectedLayer(10);   % replace fc8 for 10 classes
layers(25) = classificationLayer;       % new classification output layer
allImages = imageDatastore('WholeData_A', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[trainingImages, testImages] = splitEachLabel(allImages, 0.8, 'randomize');  % 80/20 split per class
opts = trainingOptions('sgdm', 'InitialLearnRate', 0.001, 'MaxEpochs', 2, 'MiniBatchSize', 64, 'Plots', 'training-progress');
rng(0)
myNet_Experiment_X_A1 = trainNetwork(allImages, layers, opts);  % note: trained on allImages, not trainingImages
save net
I also measured the network's accuracy on the test images and obtained accuracy = 1:
% Measure network accuracy
predictedLabels = classify(myNet_Experiment_X_A1, testImages);
accuracy = mean(predictedLabels == testImages.Labels)
% ***** End of training ********
STEP 2:
Now I pick an image, "image_1.tif", from the same data set and predict the similarity ranking for this image using the following code:
Testing_Image = imread("image_1.tif");
p = predict(myNet_Experiment_X_A1, Testing_Image);
[p3,i3] = maxk(p,3); %Top 3 predictions
myNet_Experiment_X_A1.Layers(end)
All_posibilities = myNet_Experiment_X_A1.Layers(end).ClassNames(i3)
For example, if I pick the image "image_1.tif" from Folder-2, then I get 3 predictions for this image:
Prediction 1: Folder-2 (because this image belongs to this folder)
Prediction 2: Folder-4
Prediction 3: Folder-7
The problem is: when I repeat STEP 1 and STEP 2 on the same data set and test the same image "image_1.tif", I get different results for Prediction 2 and Prediction 3 (Prediction 1 is constant and not a problem).
The results from the second training are as follows:
Prediction 1: Folder-2 (OK, same as the first training)
Prediction 2: Folder-7 (was Folder-4 in the first training)
Prediction 3: Folder-5 (was Folder-7 in the first training)
Any idea what is wrong with this code or method?
Meshoo
  3 Comments
Adam on 13 Mar 2019
The training includes randomness, doesn't it? So you wouldn't expect the same results every time.
For example, you have this line:
[trainingImages, testImages] = splitEachLabel(allImages, 0.8, 'randomize');
I haven't really used the deep learning functionality in MATLAB, but I know enough to know that randomness is usually involved, and that line strongly suggests this is the case here at least (and possibly elsewhere).
You can fix the random seed before running it, though, if you want repeatable results.
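Something like this, for example (an untested sketch reusing the variable names from your script), seeding the generator immediately before each source of randomness:
rng(0)  % fix the seed before the random 80/20 split
[trainingImages, testImages] = splitEachLabel(allImages, 0.8, 'randomize');
rng(0)  % fix it again just before training (weight initialization, mini-batch shuffling)
myNet_Experiment_X_A1 = trainNetwork(allImages, layers, opts);  % same call as in the question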
Ghost on 25 Mar 2019
Thank you, Adam. It is not because of the 'randomize' option; it is because of using the GPU.
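(A minimal sketch, assuming the same trainingOptions as in the question: convolution operations on the GPU can be non-deterministic, so forcing training onto the CPU is one way to check whether the GPU is the source of the variation.)
% Same options as before, but run training on the CPU for repeatable results
opts = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.001, ...
    'MaxEpochs', 2, ...
    'MiniBatchSize', 64, ...
    'ExecutionEnvironment', 'cpu', ...   % avoid the non-deterministic GPU path
    'Plots', 'training-progress');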


Answers (0)
