Vectorized iterative summation of matrices

Dear MATLAB community,
I want to iteratively sum matrices into a result array. I have defined a mask in which every 1 marks a position where I want to insert a matrix centered on it. Where two inserted matrices overlap, I want to sum the overlapping values. Doing this with a for loop is easy but very time consuming for large matrices, so I would like to do it with a vectorized expression instead. However, it does not work as intended yet. Let me show you what I have:
% This defines the mask I am using for inserting the matrices. At each 1 I
% want to insert another matrix b
a=eye(10);
a(1,1)=0;
a(end,end)=0;
% define matrix b to be inserted
b = [1 2 3; 4 5 6; 7 8 9];
% define result matrix
c = zeros(10,10);
% now iteratively put the b matrix at each defined position by a and
% summarize overlapping parts
[ix,iy] = find(a==1);
c(ix-1:ix+1,iy-1:iy+1) = c(ix-1:ix+1,iy-1:iy+1) + b;
My problem is that MATLAB only inserts the matrix b at the very first pair of indices; for all other positions it does not work. As I said, for performance reasons I would like to find a vectorized expression instead of using loops. Does anyone have an idea how to do that?
Thank you in advance!

 Accepted Answer

Matt J on 3 Dec 2021
Edited: Matt J on 3 Dec 2021
% convolving the 0/1 mask with b stamps b at each 1 and sums any overlaps
c = conv2(a,b,'same');
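As a sanity check, here is a sketch comparing the convolution against a straightforward loop (variable names follow the question). Because the mask is just a sum of unit impulses, convolution reproduces b unflipped at each 1, so no kernel rotation is needed for this stamping operation:

```matlab
% Sketch: verify that conv2 reproduces the looped insertion.
a = eye(10); a(1,1) = 0; a(end,end) = 0;
b = [1 2 3; 4 5 6; 7 8 9];

% Vectorized version: convolution stamps b at every 1 and sums overlaps.
c = conv2(a, b, 'same');

% Reference: insert b around every 1 with an explicit loop.
d = zeros(size(a));
[ix, iy] = find(a == 1);
for k = 1:numel(ix)
    d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) = ...
        d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) + b;
end

isequal(c, d)   % the two results agree
```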

8 Comments

Thank you for your answer! I have also considered using a convolution, but as far as I understand it is internally also performed with discrete sums implemented by loops, and therefore probably doesn't give me any speed advantage. Normal vectorized expressions are (I think) computed in C or C++, which is inherently faster than MATLAB for loops and therefore also faster than convolutions, even though the convolution does seem to be a bit faster than a plain for loop (about a factor of 2).
Or am I wrong here?
Matt J on 6 Dec 2021
Edited: Matt J on 6 Dec 2021
conv2() is well optimized. It will give you the best speed performance.
Thank you! Does this also hold true for convn? I have used a simplified 2D example here; later I want to apply this to a 3D case.
Matt J on 6 Dec 2021
Edited: Matt J on 6 Dec 2021
Yes, it is true for convn. There is no need to take my word for it, however. You could test these alternatives yourself and confirm their benefits directly.
Once you do, however, you should Accept-click the answer that works best for you.
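For the 3D case, the same stamping idea carries over directly to convn. A minimal sketch (a3 and b3 are assumed example arrays, not from the thread):

```matlab
% Sketch: stamp a 3x3x3 block b3 around each 1 of a 3D mask a3,
% summing wherever stamped blocks overlap.
a3 = zeros(10,10,10);
a3(5,5,5) = 1;
a3(2,7,3) = 1;
b3 = reshape(1:27, 3, 3, 3);

c3 = convn(a3, b3, 'same');   % b3 placed at each 1, overlaps summed
```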
Thank you Matt! I have already played around with convn (and convnfft, too) and can confirm that they are faster than plain for loops. However, I was hoping there might still exist some vectorized expression for my problem that runs even faster than a convolution. But I am afraid that for this specific problem, that is not possible. Would you agree?
Your operation is a convolution, so I don't think you will find anything faster than a command specifically written and optimized for convolutions. If you need more speed, and you have the Parallel Computing Toolbox, you could use gpuArrays instead of regular matrices.
Thank you very much for confirming that. I have already tried using gpuArrays; however, my matrix c is too large (8-9 GB) for the memory of my graphics card. Also, I think that parallelizing the problem across multiple CPU cores with a parfor loop would still be slower than the convolution.
Maybe using single floats would help it fit.
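A sketch of that idea (assumes the Parallel Computing Toolbox and that the single-precision arrays fit in GPU memory; single halves the footprint relative to double):

```matlab
% Sketch: run the convolution on the GPU in single precision.
ag = gpuArray(single(a));
bg = gpuArray(single(b));
cg = conv2(ag, bg, 'same');
c  = gather(cg);   % copy the result back to host memory
```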


More Answers (1)

What is wrong with using a loop? Why do you think it is going to be slower?
a = eye(10);
a(1,1) = 0;
a(end,end) = 0;
b = [1 2 3; 4 5 6; 7 8 9];
d = zeros(10,10);
[ix,iy] = find(a==1);
for k = 1:numel(ix)
    % add b directly into the 3x3 window around each 1 (no temporary matrix)
    d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) = d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) + b;
end
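To settle the speed question empirically, a quick timing sketch (the mask size and density are arbitrary assumptions) comparing this loop against conv2:

```matlab
% Timing sketch: loop stamping vs. conv2 on a larger random mask.
n = 2000;
a = double(rand(n) > 0.99);            % sparse random 0/1 mask
a([1 end], :) = 0; a(:, [1 end]) = 0;  % keep 1s away from the border
b = magic(3);

tic
d = zeros(n);
[ix, iy] = find(a == 1);
for k = 1:numel(ix)
    d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) = ...
        d(ix(k)-1:ix(k)+1, iy(k)-1:iy(k)+1) + b;
end
tLoop = toc;

tic
c = conv2(a, b, 'same');
tConv = toc;

fprintf('loop: %.3fs  conv2: %.3fs  equal: %d\n', tLoop, tConv, isequal(c, d));
```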

1 Comment

As far as I have understood it (correct me if I am wrong), vectorized expressions in MATLAB are executed in compiled C/C++ code and are therefore inherently much faster than plain for loops. I have noticed a tremendous difference in other computations where vectorization was easier than in this example.



Release: R2021b
Asked: 3 Dec 2021
Last comment: 6 Dec 2021
