How to run my code on the GPU instead of the CPU?
Hello everyone,
I'm trying to run my for loop on the GPU instead of the CPU. I searched a lot, and it seemed I would have to rewrite my code, but some people said that isn't necessary and that a toolbox can handle it. If anyone could show me how to use those toolboxes, or how to rewrite my code in GPU form, please answer.
This is my code:
x = [1 5 9];      % input features
x = [1/9 5/9 1];  % normalize
y = [0.49 0.51];  % outputs
min_error = Inf;  % start at Inf so the first iteration always records its weights
for i = 1:10000
    w1 = rand(3,3);
    b1 = rand(1,3); % first bias, going to 3 neurons
    w2 = rand(3,3);
    b2 = rand(1,3); % second bias
    w3 = rand(3,2);
    b3 = rand(1,2); % going to 2 neurons
    % L1 input layer, L2 and L3 hidden layers, L4 output layer
    % L1:
    a1 = x;
    % L1 - L2:
    zh1 = (x * w1) + b1;
    a2 = logsig(zh1);
    % L2 - L3:
    zh2 = (a2 * w2) + b2;
    a3 = logsig(zh2);
    % L3 - L4:
    zh3 = (a3 * w3) + b3;
    a4 = logsig(zh3);
    err = abs(y - a4);
    % use elementwise division (./), not matrix right division (/),
    % to get the mean relative error in percent
    percentage(i) = mean(err ./ y) * 100;
    if percentage(i) < min_error
        min_error = percentage(i);
        cw1 = w1;
        cw2 = w2;
        cw3 = w3;
        bw1 = b1;
        bw2 = b2;
        bw3 = b3;
    end
end
disp(min_error)
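One possible approach, assuming you have Parallel Computing Toolbox and a supported NVIDIA GPU: create the data and the random weights directly on the device with `gpuArray` and `rand(...,'gpuArray')`, and the same matrix operations then execute on the GPU automatically; `gather` copies the result back to the CPU. Note that `logsig` belongs to Deep Learning Toolbox, so this sketch uses the equivalent elementwise formula `1./(1+exp(-z))` instead. This is a sketch of the idea, not a tested drop-in replacement:

```matlab
% Sketch: the same random-search loop running on the GPU
% (requires Parallel Computing Toolbox and a supported GPU)
x = gpuArray([1/9 5/9 1]);            % normalized inputs, stored on the device
y = gpuArray([0.49 0.51]);            % targets, stored on the device
sig = @(z) 1 ./ (1 + exp(-z));        % elementwise sigmoid; works on gpuArray inputs
min_error = Inf;
for i = 1:10000
    w1 = rand(3,3,'gpuArray'); b1 = rand(1,3,'gpuArray'); % weights created on the GPU
    w2 = rand(3,3,'gpuArray'); b2 = rand(1,3,'gpuArray');
    w3 = rand(3,2,'gpuArray'); b3 = rand(1,2,'gpuArray');
    a2 = sig(x  * w1 + b1);           % forward pass runs on the GPU
    a3 = sig(a2 * w2 + b2);
    a4 = sig(a3 * w3 + b3);
    p  = mean(abs(y - a4) ./ y) * 100; % mean relative error in percent
    if p < min_error
        min_error = p;
        cw1 = w1; cw2 = w2; cw3 = w3; % keep the best weights and biases
        cb1 = b1; cb2 = b2; cb3 = b3;
    end
end
disp(gather(min_error))               % copy the result back to the CPU
```

One caveat: a loop of 10000 tiny 3x3 matrix multiplies is often *slower* on a GPU than on a CPU, because every operation carries kernel-launch overhead. GPUs pay off for large arrays, so a bigger win would be to batch all candidate weight sets into one large array (e.g. `rand(3,3,10000,'gpuArray')` combined with `pagemtimes`) and evaluate them in a single pass.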