wrote:
>Suppose you have arbitrary complex rectangular matrices
>A and A1, A2, ..., An, all of a fixed size p x q, say.
>Can you find complex vectors x, u, and v so that
>
>A + x(1)*A1 + ... + x(n)*An = u*v'
>
>In other words, the problem is: "what is the affine combination
>(constant matrix plus linear combination of other matrices)
>of a set of matrices which is closest to rank 1"?
>
>Do you know of any ways to solve this efficiently?
>
>This seems related to LMIs, but I'm not sure exactly how.
I don't even know what LMIs are, so I was hoping someone else
would step in here; but since no one has stepped up to the plate,
I thought I'd throw in my two cents:
You can view this as the question of mapping C^(p+q) into
C^(pq) with the quadratic map F(u,v) = u*v' - A and asking for the
point in the image which is closest to the subspace
W = span( A1, ..., An ); you want to know if the two intersect, for example.
Well, the image of F will be (p+q-1)-dimensional (you lose a dimension
since F( cu, (1/c)v ) = F(u,v) for all c<>0), so algebraic geometry ensures
there is a (complex) point of intersection for any A as soon as
dim(W) >= pq - (p+q-1) = (p-1)(q-1) . (Nothing smaller will work in general).
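To make that dimension count concrete, here is a quick numerical check (numpy sketch; the sizes and matrices are made up for illustration): stack the vectorized A_i as columns and compare the rank against the threshold.

```python
import numpy as np

# Illustrative sizes; the A_i stand in for the given p x q matrices.
p, q, n = 3, 4, 7
rng = np.random.default_rng(0)
As = [rng.standard_normal((p, q)) + 1j * rng.standard_normal((p, q))
      for _ in range(n)]

# Stack the vectorized A_i as columns; dim(W) is the rank of this matrix.
M = np.column_stack([Ai.ravel() for Ai in As])
dim_W = np.linalg.matrix_rank(M)

# The threshold pq - (p+q-1) = (p-1)(q-1) from the dimension count above.
threshold = (p - 1) * (q - 1)
print(dim_W, threshold, dim_W >= threshold)
```

Random complex matrices are generically linearly independent, so here dim(W) = n = 7 exceeds the threshold 6 and an intersection point is guaranteed.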
I suppose it would make sense, then, to Gram-Schmidt your way through
the A_i to determine an orthonormal basis {e_i} of W, and in particular
the dimension of W. If the dimension is too small to guarantee an
intersection, you can still find the points of closest approach using
projection: the point of W closest to p = F(u,v) is
proj( p ) = Sum <p, e_i> e_i. So you have only to choose
u and v to minimize the length of F(u,v) - Sum <F(u,v), e_i> e_i.
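In code, the orthonormal-basis step and the projection look roughly like this (numpy sketch with made-up sizes; QR plays the role of Gram-Schmidt, and I read ' as conjugate transpose):

```python
import numpy as np

p, q, n = 3, 4, 2
rng = np.random.default_rng(1)
A = rng.standard_normal((p, q)) + 1j * rng.standard_normal((p, q))
As = [rng.standard_normal((p, q)) + 1j * rng.standard_normal((p, q))
      for _ in range(n)]

# Orthonormal basis {e_i} of W = span(A1,...,An): QR on the stacked
# vectorized A_i (numerically preferable to classical Gram-Schmidt).
M = np.column_stack([Ai.ravel() for Ai in As])
Q, R = np.linalg.qr(M)   # columns of Q are the e_i, as vectors in C^(pq)

def residual(u, v):
    # F(u,v) = u*v' - A, vectorized, minus its projection onto W;
    # this is the quantity whose length is to be minimized.
    f = np.outer(u, v.conj()).ravel() - A.ravel()
    return f - Q @ (Q.conj().T @ f)

u = rng.standard_normal(p) + 1j * rng.standard_normal(p)
v = rng.standard_normal(q) + 1j * rng.standard_normal(q)
print(np.linalg.norm(residual(u, v)))
```

By construction the residual is orthogonal to W, so its norm is exactly the distance from F(u,v) to W.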
So you're left trying to minimize the norm of a quadratic polynomial map
F : C^(p+q) -> C^(pq). I'm not the one to say how efficiently this can
be done in practice, but in principle it's not so bad since setting the
derivatives to zero gives p+q equations to solve in p+q variables, each
equation of low degree and even linear in some variables. (One must normalize
one variable to 1, say, since the solution set is degenerate otherwise.)
If W has sufficiently high dimension, it pays to complete to a basis
of C^(pq); for example, if dim(W) = (p-1)(q-1) and you now want to compute the
corresponding u and v, then you need each of the coordinates <F(u,v), e_i>
to vanish, where the e_i now range over a basis of the (p+q-1)-dimensional
space W^\perp. That's (p+q-1) equations in (p+q-1) unknowns
(after again normalizing so that u(1)=1, say). Each equation is only a
quadratic polynomial, so most methods of numerical solution (e.g. Newton's
method) should work effectively.
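Here is a sketch of that Newton iteration (numpy; everything is illustrative). To keep the map polynomial I take ' as plain transpose here, and I rig A so that an exact solution is known, which lets you watch the local quadratic convergence:

```python
import numpy as np

p, q = 3, 4
n = (p - 1) * (q - 1)          # dim(W) at the intersection threshold
rng = np.random.default_rng(3)

def cmat(*shape):
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

As = [cmat(p, q) for _ in range(n)]

# Cook up an A for which an exact solution (u*, v*) is known:
# F(u*, v*) = u* v*' - A then lies in W by construction.
u_true = np.concatenate(([1.0 + 0j], cmat(p - 1)))   # normalize u(1) = 1
v_true = cmat(q)
x = cmat(n)
A = np.outer(u_true, v_true) - sum(xi * Ai for xi, Ai in zip(x, As))

# Orthonormal basis of W^perp from the full SVD of the stacked vec(A_i).
M = np.column_stack([Ai.ravel() for Ai in As])
U, _, _ = np.linalg.svd(M)
E_perp = U[:, n:]              # pq - n = p + q - 1 columns

def coords(u, v):
    """The p+q-1 coordinates <F(u,v), e_i> that must all vanish."""
    return E_perp.conj().T @ (np.outer(u, v).ravel() - A.ravel())

def newton_step(u, v):
    r = coords(u, v)
    # Analytic Jacobian w.r.t. the free unknowns u(2..p), v(1..q):
    # d(u v')/du_a = e_a v',  d(u v')/dv_b = u e_b'.
    cols = []
    for a in range(1, p):
        du = np.zeros(p, complex); du[a] = 1
        cols.append(E_perp.conj().T @ np.outer(du, v).ravel())
    for b in range(q):
        dv = np.zeros(q, complex); dv[b] = 1
        cols.append(E_perp.conj().T @ np.outer(u, dv).ravel())
    step = np.linalg.solve(np.column_stack(cols), r)
    u = u.copy(); u[1:] -= step[:p - 1]
    return u, v - step[p - 1:]

# Start near the known solution; quadratic convergence kicks in fast.
u = u_true + np.concatenate(([0], 1e-2 * cmat(p - 1)))
v = v_true + 1e-2 * cmat(q)
for _ in range(8):
    u, v = newton_step(u, v)
print(np.linalg.norm(coords(u, v)))   # essentially zero
```

Note the system is square, (p+q-1) complex equations in (p+q-1) complex unknowns, exactly as counted above; from an arbitrary starting point you would want some globalization (damping, multiple starts), since plain Newton only converges locally.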
I tried to think of a way to take advantage of the fact that this is not
just C^(pq) but the space of matrices; nothing came to mind except
left- and right-multiplying by unitary matrices so that A is a diagonal
matrix. (If you intend to find points of intersection rather than
closest approach, you can go whole hog and pre-multiply by anything
invertible, thus reducing the diagonal entries of A to 1's and 0's.)
Of course, with the x(i) _known_, not variable, the best u and v are
found with SVD, but I didn't see how to use that information in any
practical way.
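For completeness, the SVD step for fixed x(i) (this is Eckart-Young: the closest rank-1 matrix in Frobenius norm keeps the top singular triple; numpy sketch with made-up data):

```python
import numpy as np

p, q, n = 3, 4, 2
rng = np.random.default_rng(4)
A = rng.standard_normal((p, q)) + 1j * rng.standard_normal((p, q))
As = [rng.standard_normal((p, q)) + 1j * rng.standard_normal((p, q))
      for _ in range(n)]
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # the known x(i)

# With x fixed, the best u and v come straight from the SVD of
# B = A + x(1)*A1 + ... + x(n)*An: keep the largest singular triple.
B = A + sum(xi * Ai for xi, Ai in zip(x, As))
U, s, Vh = np.linalg.svd(B)
u = s[0] * U[:, 0]
v = Vh[0].conj()               # so that u*v' = s_1 u_1 v_1'
print(np.linalg.norm(B - np.outer(u, v.conj())))  # = sqrt(s_2^2 + ...)
```

The residual norm equals the root-sum-square of the discarded singular values, which is the smallest achievable over all rank-1 matrices.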
dave