Vectorizing a nested for loop containing matrix indexing

I'm trying to optimize this piece of code a bit, and usually my first instinct is to vectorize it. I cannot for the life of me figure out how to do this. Is it even possible? I am leaning towards no, but I have a gut feeling that it can be done.
for i = 1:region_max_y-template_max_y
    for j = 1:region_max_x-template_max_x
        cc(i,j) = sum(abs(template_gradient(:) - ...
            reshape(region_gradient(i:i+template_max_y-1, j:j+template_max_x-1, :), ...
            templateArea, 1)));
    end
end

Answers (1)

Walter Roberson on 10 Nov 2016
At various points, template_gradient(1) will have subtracted from it region_gradient(1,1), region_gradient(1,2), ..., region_gradient(region_max_y-1,region_max_x-1). You can arrange all of those values in a row vector, and do the same sort of thing for each template_gradient entry, constructing in order the vector of values that will be subtracted from it. Then replicate the column vector of template_gradient entries into a 2-D array as wide as that array of gathered values. Now subtract, abs(), and sum along the columns: the result is all of the values of cc, computed simultaneously. Reshape that result into the output matrix.
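For what it's worth, here is one sketch of that gather-replicate-subtract idea in code. It reuses the variable names from the question, mirrors the original loop bounds with ny and nx, and uses bsxfun so it also works on releases without implicit expansion:

```matlab
% Sketch of the gather-subtract-sum approach described above.
[R, ~, P] = size(region_gradient);
ny = region_max_y - template_max_y;   % same extents as the original loops
nx = region_max_x - template_max_x;

% Linear indices of the window anchored at (1,1), in the same
% column-major order as reshape(..., templateArea, 1).
[r, c, p] = ndgrid(1:template_max_y, 1:template_max_x, 1:P);
base = sub2ind(size(region_gradient), r(:), c(:), p(:));

% Shifting the window down by di rows and right by dj columns adds
% di + dj*R to every linear index.
[di, dj] = ndgrid(0:ny-1, 0:nx-1);
offsets = di(:).' + dj(:).' * R;

% Each column of idx addresses one window position.
idx = bsxfun(@plus, base, offsets);   % templateArea-by-(ny*nx)

% Subtract every gathered window column from the replicated template,
% then abs(), sum down the columns, and reshape to the output grid.
diffs = bsxfun(@minus, template_gradient(:), region_gradient(idx));
cc = reshape(sum(abs(diffs), 1), ny, nx);
```

Note that idx is templateArea-by-(ny*nx), so this trades the loops for a potentially large temporary array; for big regions the loop may actually be the better choice.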
  1 Comment
Phillip Miller on 10 Nov 2016
The problem is that each returned value depends on the values around each pixel (which is why the two-dimensional indexing is present).
A simpler way to present this problem is
for i = 1:alpha
    for j = 1:beta
        A(i,j) = sum(sum(B(i:i+5, j:j+5)));
    end
end
This would sum all values in a 6-by-6 region of B at each position defined by the loop indices.
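Incidentally, this particular sliding windowed sum can itself be vectorized with a convolution against a kernel of ones. A minimal sketch, assuming B is 2-D and that alpha = size(B,1)-5 and beta = size(B,2)-5 so the 'valid' output matches the loop extents:

```matlab
% Every 6-by-6 windowed sum of B in one call:
% A(i,j) equals sum(sum(B(i:i+5, j:j+5))).
A = conv2(B, ones(6, 6), 'valid');
```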

