Solving for two variables in a matrix problem abs(A + x*B*exp(1i*y)).^2 = C

Hi there,
I have a matrix, C, which is an n-by-n intensity map. It is the superposition of two other functions, represented by n-by-n matrices A and B, related via:
abs(A + x*B*exp(1i*y)).^2 = C
I can get upper and lower bound estimates for x and y from the information in C; A and B are static (i.e. they can be treated as constant matrices) and purely real. The variables I need to solve for are x and y.
Currently I'm using an iterative method (two nested for loops). It works, but it's too slow and possibly not accurate enough for my intended use: a single run takes about 30 seconds, which is fine once, but not if it has to work in near real time.
Is there a built-in function or a common method for doing this?

4 Comments

x and y are real scalars? So you have n^2 equations in 2 unknowns and you want a least squares solution?
How large is n typically, and how fast were you hoping for it to be?
Yes, x and y should be real scalars, with x > 0 and 0 < y < pi. It's not so much n^2 equations as that the data being analyzed is an n-by-n intensity map composed of the superposition of two predetermined intensity maps, A and B.
n is typically 150-200 (pixels), and while this may be wholly unrealistic, a solution in under a second would be great, as we can generate 4-6 images in 1.2 s.
The solution does not have to be least squares as long as it provides a match.
It's not so much n^2 equations as that the data being analyzed is an n-by-n intensity map composed of the superposition of two predetermined intensity maps, A and B.
I'm not seeing the distinction that you're seeing. You have n^2 things on the left hand side, each depending on unknowns x,y, and n^2 target values on the right hand side. That's precisely the definition of a system of n^2 equations in 2 unknowns.
The solution does not have to be least squares as long as it provides a match.
I hope when you say a "match", you mean some kind of best fit. When you have more equations than unknowns, there's the distinct possibility that they can't all be simultaneously satisfied.
I understand - the distinction was only made in case it was important or changed anything.
And yes, a best fit, but because the setup is constrained there should always be a solution (in fact, infinitely many degenerate solutions as well). The data will only ever be composed of these two matrices, combined in this manner, so a solution has to exist. I just can't find a fast way of computing it, short of maybe a lookup table (LUT).


Answers (4)

Starting with n-by-n arrays A, B, and C, this uses the technique I previously described to find the x and y parameters giving the least sum of squared differences between abs(A + x*B*exp(1i*y)).^2 and C, while adhering to the constraints x >= 0 and 0 <= y <= pi. The algorithm does not iterate, so it should be comparatively fast.
% Get coefficients
a = A(:); b = B(:); c = C(:);
d = a.^2-c; e = a.*b; f = b.^2;
D = sum(d.*e); E = sum(e.^2); F = sum(f.*e);
G = sum(d.*f); H = sum(f.^2);
% Collect candidate values of X (= x) and Z (= cos(y))
X = 0; Z = 0; % the x = 0 case, independent of y
x = (D*F-E*G)/(E*H-F^2); % interior stationary point (this is x^2)
if x > 0
    x = sqrt(x);
    z = 1/2*(F*G-D*H)/(D*F-E*G)*x; % corresponding cos(y)
    if abs(z) <= 1
        X = [X,x];
        Z = [Z,z];
    end
end
r = roots([H,3*F,G+2*E,D]); % boundary cases cos(y) = +-1
for k = 1:numel(r)
    if imag(r(k)) == 0
        X = [X,abs(r(k))];
        Z = [Z,sign(r(k))];
    end
end
% Get x & y with least sum of squares
[~,ix] = min(sum((d*ones(size(X))+2*e*(X.*Z)+f*(X.^2)).^2));
x = X(ix); % "Best" parameters, x and y
y = acos(Z(ix));
You are trying to equate these two quantities:
a(i,j)^2 + 2*a(i,j)*b(i,j)*x*cos(y) + b(i,j)^2*x^2
and
c(i,j)
for all i and j.
As Matt points out, you have n^2 equations to satisfy and only the two scalar unknowns x and y to vary. You have little chance of succeeding in such an endeavor, so all you can hope for is to minimize some measure of the differences between these.
Minimizing the sum of the squares of their differences is one very common method of proceeding in such problems. In this case that can be accomplished by setting to zero the two partial derivatives of this sum with respect to x and to y. This creates two equations in the two unknowns and is a problem that should certainly be able to be solved in far less than a second.
Looking at the two equations, I see that one of them can be solved for cos(y) as a rational function of x. When this is substituted into the second equation, what remains after clearing the denominator is a polynomial in x, which can be solved using MATLAB's roots function. Provided you can find a good rationale for choosing the correct root, you ought to find the right solution within microseconds, not seconds. If necessary, try every real root and choose the one that gives the least sum of squares. Piece of cake!
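Filling in the algebra (my derivation, with notation chosen to match the code in the answer above, where d = a.^2-c, e = a.*b, f = b.^2): writing z = cos(y), the sum of squares and its two stationarity conditions are

```latex
S(x,z) = \sum_k \bigl(d_k + 2 e_k x z + f_k x^2\bigr)^2, \qquad
D = \sum_k d_k e_k,\; E = \sum_k e_k^2,\; F = \sum_k e_k f_k,\;
G = \sum_k d_k f_k,\; H = \sum_k f_k^2

\frac{\partial S}{\partial z} = 0:\quad x\,\bigl(D + 2E\,x z + F x^2\bigr) = 0

\frac{\partial S}{\partial x} = 0:\quad D z + G x + 2E\,x z^2 + 3F x^2 z + H x^3 = 0
```

For x ≠ 0, the first condition gives z = -(D + F x^2)/(2 E x); substituting this into the second yields x^2 = (D F - E G)/(E H - F^2) and z = (F G - D H) x / (2 (D F - E G)), the interior candidate computed in the code. On the boundary cos(y) = ±1, substituting u = x cos(y) collapses the second condition to the cubic H u^3 + 3 F u^2 + (G + 2E) u + D = 0 that is passed to roots.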

3 Comments

I glossed over the constraints in your problem. There are two of significance here: the stated x >= 0, and -1 <= cos(y) <= 1. For the second, the rational function of x I referred to earlier must lie within the range [-1, 1]. Each limit provides another two possible solutions for x in the form of a simple quadratic equation, and the associated y must be either 0 or pi. The case x = 0 provides a fifth possibility, which is independent of y. Add these five possibilities to the set of candidates you test.
The above situation is analogous to finding the minimum of a function of a single variable in which the function's independent variable is constrained to lie between two limits. Either the minimum will occur at a point where the derivative is zero or else at one of the two constraining limits. Your minimum will occur either at a point where both partial derivatives are zero or else at a point limited by one of those constraints.
The constraints turn out to be circumventable. The equation depends on x and y only through x*exp(1i*y), which is the generic form for any arbitrary complex number in polar coordinates. You can therefore re-parametrize the problem via a change to Cartesian variables
x*exp(1i*y) = p1+i*p2
where p1 and p2 are completely unconstrained.
where p1 and p2 are completely unconstrained.
Sorry, I didn't notice that y can only go from 0 to pi. So, in Cartesian coordinates, p2 must be constrained to be positive. This doesn't end up mattering, though, because after the change of variables the equations depend on p2 only through p2^2, so the constraint ends up being superfluous...
Setting the partial derivatives to zero leads to two simultaneous 4th-order equations in two variables. I suppose the Symbolic Math Toolbox could handle that, if available. You might also need the second partial derivatives to rule out maxima.
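For concreteness (my expansion, not in the original comments): since A and B are purely real, each pixel satisfies

```latex
\bigl|a + (p_1 + i\,p_2)\,b\bigr|^2 = (a + p_1 b)^2 + p_2^2\, b^2
```

so the objective depends on p2 only through q = p2^2 >= 0, and for fixed p1 the optimal q is a one-variable linear least-squares problem with a nonnegativity clamp, which is the structure the FMINSEARCH answer below exploits.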


Matt J on 7 Jun 2013
Edited: Matt J on 7 Jun 2013
Here is my solution. It is still iterative, using FMINSEARCH, but for n = 200 it typically completes in 0.1 sec on my machine:
Bv2 = B(:).^2; % pre-compute
Av = A(:);
Bv = B(:);
Cv = C(:);
fun = @(p1) objective(p1,Av,Bv,Bv2,Cv);
tic;
p1 = fminsearch(fun, 0); % 1-D search over p1 = x*cos(y)
[~,x,y] = fun(p1); % retrieve x and y
toc;
% Elapsed time is 0.100534 seconds.

function [f, x, y, p1, p2] = objective(p1,A,B,B2,C)
    q1 = (A+B*p1).^2;
    p2 = max(B2\(C-q1), 0); % least-squares p2 (= (x*sin(y))^2), clamped at 0
    f = norm(q1+p2*B2-C);
    if nargout > 1
        z = p1+1i*sqrt(p2);
        x = abs(z);
        y = angle(z);
    end
end
Matt J on 9 Jun 2013
Edited: Matt J on 9 Jun 2013
The solution does not have to be least squares as long as it provides a match... The data will only ever be composed of these two matrices, combined in this manner, so a solution has to exist.
If you're saying here that an exact solution to all n^2 equations always exists, then you don't have to process all n^2 entries in the matrices. You can solve for x and y (or p1 and p2) using just two of the n^2 equations, or in other words using just 2 entries from each matrix, regardless of n. This should make any method you use run in the blink of an eye, whether it be your original method, my proposal, or Roger's.
When I do this on my machine, my iterative solution runs in 0.003 sec for n=200 and Roger's closed-form solution runs in 0.0003 sec.
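To make the two-equation idea concrete (a hypothetical sketch, not from the thread): elementwise, C = A.^2 + 2*A.*B*(x*cos(y)) + B.^2*x^2, which is linear in s = x*cos(y) and t = x^2, so two pixels give a 2-by-2 linear system:

```matlab
% Hypothetical two-pixel solve, assuming an exact solution exists and the
% chosen pixels yield a well-conditioned system. The model per pixel is
%   C = A.^2 + 2*A.*B*s + B.^2*t   with s = x*cos(y), t = x^2.
i  = [1; 2];                       % any two pixel indices
M  = [2*A(i).*B(i), B(i).^2];      % 2-by-2 coefficient matrix
st = M \ (C(i) - A(i).^2);         % solve for [s; t]
x  = sqrt(st(2));
y  = acos(st(1)/x);
```

In practice you would pick pixels where B is large and the two rows of M differ enough, or solve a small overdetermined subset of pixels for robustness against noise.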

Asked on 7 Jun 2013