Vectorizing a nested for loop containing matrix indexing
I'm trying to optimize this piece of code a bit, and usually my first instinct is to vectorize it. I cannot for the life of me figure out how to do this. Is it even possible? I am leaning towards no, but I have a gut feeling that it can be done.
for i = 1:region_max_y-template_max_y
    for j = 1:region_max_x-template_max_x
        cc(i,j) = sum(abs(template_gradient(:) - ...
            reshape(region_gradient(i:i+template_max_y-1, j:j+template_max_x-1, :), ...
                    templateArea, 1)));
    end
end
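For reference, the loop computes the sum of absolute differences between the template gradient and every template-sized window of the region gradient. A minimal self-contained harness (array sizes below are arbitrary assumptions, not from the original post) makes it runnable:

```matlab
% Minimal setup for the loop above; all sizes are illustrative assumptions
region_max_y = 20; region_max_x = 25; nChan = 2;
template_max_y = 5; template_max_x = 6;
region_gradient   = rand(region_max_y, region_max_x, nChan);
template_gradient = rand(template_max_y, template_max_x, nChan);
templateArea = numel(template_gradient);

cc = zeros(region_max_y-template_max_y, region_max_x-template_max_x);
for i = 1:region_max_y-template_max_y
    for j = 1:region_max_x-template_max_x
        % Sum of absolute differences between the template and one window
        cc(i,j) = sum(abs(template_gradient(:) - ...
            reshape(region_gradient(i:i+template_max_y-1, j:j+template_max_x-1, :), ...
                    templateArea, 1)));
    end
end
```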
Answers (1)
Walter Roberson
on 10 Nov 2016
At various points, template_gradient(1) will have subtracted from it region_gradient(1,1), region_gradient(1,2), ..., region_gradient(region_max_y-1,region_max_x-1). You can arrange all of those values in a row vector. Do the same sort of thing for every template_gradient entry: construct, in order, the vector of region values that will be subtracted from it. Stack those rows into a 2D array, and replicate the column vector template_gradient(:) so that it is as wide as that array. Now subtract, take abs(), and sum along the columns. The result contains all of the values of cc simultaneously; reshape that result into the final matrix.
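One way to realize this idea in code is with linear indexing: build an index matrix whose columns pick out each sliding window, gather all windows into one 2D array, and let implicit expansion (R2016b+; use bsxfun on older releases) subtract the template from every column. This is a sketch under the assumption that region_gradient is region_max_y-by-region_max_x-by-nChan and templateArea equals numel(template_gradient):

```matlab
[region_max_y, region_max_x, nChan] = size(region_gradient);
nRows = region_max_y - template_max_y;   % loop bounds from the question
nCols = region_max_x - template_max_x;

% Linear indices of the template-sized window anchored at (1,1)
[ty, tx, tc] = ndgrid(1:template_max_y, 1:template_max_x, 1:nChan);
baseIdx = sub2ind(size(region_gradient), ty(:), tx(:), tc(:));  % templateArea x 1

% Linear-index offsets for every window position (i,j); a shift of one
% row adds 1, a shift of one column adds region_max_y (column-major)
[oy, ox] = ndgrid(0:nRows-1, 0:nCols-1);
offset = oy(:).' + ox(:).' * region_max_y;   % 1 x (nRows*nCols)

% Each column is one window, in the same order as the loop's reshape(...)
windows = region_gradient(baseIdx + offset);            % templateArea x nPositions

% Subtract the template from every column, abs, sum down the columns
cc_vec = sum(abs(windows - template_gradient(:)), 1);
cc = reshape(cc_vec, nRows, nCols);
```

Note that windows is templateArea-by-(nRows*nCols), so this trades memory for speed; for very large regions the original loop may actually be preferable.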