From: Tom Bech
Newsgroups: sci.math
Subject: Eigenvectors of a sym. pos. semidefinite 3x3 matrix
Date: 23 Mar 1995 16:52:56 GMT
Hi!
Given a 3x3 symmetric matrix of the form

        |a d e|
    A = |d b f|
        |e f c|
where A is positive semidefinite.
Other than the restrictions on the values of (a..f) for
the symmetric semidefinite case, no assumptions can be made on
the appearance of A.
Is there an "easy" (fast) way to determine the _exact_ eigenvalues and
_exact_ eigenvectors of A? (That is, in the form of a computer
program (algorithm), without resorting to numerical methods like
QR iteration, deflation, etc.) Or perhaps some numerical method
exploiting the properties of A to rapidly compute a very good estimate
of the eigenvalues and eigenvectors of A? This is a time-critical
part of a larger problem I'm working on, so this computation
should be as fast as possible.
It seems possible to solve det(lI-A)=0 and obtain
the exact eigenvalues (by finding, in general, the roots of a
3rd-degree polynomial). My problem is then how to compute the
eigenvectors, that is, the non-trivial solution(s) of the system
(lI-A)x=0, without dealing with numerous "special cases".
This may be a trivial problem, but any help or suggestions would
be appreciated.
Regards,
Tom
--------------------------------------
MSc Section, Department of Informatics
University of Bergen, Norway
--------------------------------------
==============================================================================
From: rusin@sinai.math.niu.edu (Dave Rusin)
Newsgroups: sci.math
Subject: Re: Eigenvectors of a sym. pos. semidefinite 3x3 matrix
Date: 30 Mar 1995 06:50:21 GMT
In article <3ks918$8d4@due.uninett.no>, Tom Bech wrote:
>Given a 3x3 symmetric matrix of the form,
[matrix is displayed]
>Is there an "easy" (fast) way to determine the _exact_ eigenvalues and
>_exact_ eigenvectors of A?
>It seems possible to solve det(lI-A)=0 and obtain
>the exact eigenvalues (by finding, in general, the roots of a
>3rd-degree polynomial). My problem is then how to compute the
>eigenvectors, that is, the non-trivial solution(s) of the system
>(lI-A)x=0, without dealing with numerous "special cases".
Well, there's always the Hamilton-Cayley theorem: if P(x) is the
characteristic polynomial you've already alluded to, then P(A) = 0.
Now, if r is an eigenvalue, then simple long division allows you to
factor P(x) = (x-r) Q(x); hence (A-rI) Q(A) = P(A) = 0. In particular,
for any vector w, (A-rI) Q(A)w = 0, so that v = Q(A)w has the
property that Av = rv. I won't call v an eigenvector, since there is
a slight chance that v could be zero.
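In modern terms, that recipe is short enough to sketch in code; here is a
Python/NumPy version (the function name and the tolerance eps are my own
choices, not part of the original recipe):

```python
import numpy as np

def eigenvector_via_cayley_hamilton(A, r, eps=1e-12):
    """Given symmetric 3x3 A and an eigenvalue r, return a unit vector v
    with A v = r v, or None if Q(A) vanishes (eigenspace of dim >= 2)."""
    # Characteristic polynomial P(x) = x^3 - c2 x^2 + c1 x - c0;
    # the constant term c0 = det(A) drops out of the quotient Q below.
    c2 = np.trace(A)
    c1 = 0.5 * (np.trace(A) ** 2 - np.trace(A @ A))
    # Long division P(x) = (x - r) Q(x) gives Q(x) = x^2 + q1 x + q0
    q1 = r - c2
    q0 = r * q1 + c1
    # By Hamilton-Cayley, (A - rI) Q(A) = 0, so any nonzero column
    # of Q(A) is an eigenvector for r.
    QA = A @ A + q1 * A + q0 * np.eye(3)
    j = int(np.argmax(np.linalg.norm(QA, axis=0)))
    v = QA[:, j]
    n = np.linalg.norm(v)
    return v / n if n > eps else None
```

Taking the columns of Q(A) amounts to trying w = e1, e2, e3 at once;
picking the longest column sidesteps the zero-vector mishap except when
all of Q(A) vanishes.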
We can improve things a little using the symmetry of A. Using < , > for the
inner product, the defining property of a symmetric matrix is that
<Av,w> = <v,Aw> for all vectors v and w. If v is an eigenvector
corresponding to the eigenvalue r, then

    <v,(A-r)w> = <v,Aw> - r<v,w> = <Av,w> - r<v,w> = <rv,w> - r<v,w> = 0

so that v is perpendicular to (A-r)w for any w. (The converse is also
true: if v is perpendicular to all these vectors then Av=rv: just
consider the case w=(A-r)v.) So you need only find the vectors perpendicular
to the image of A-rI. In the 3x3 case, I'd just apply A-rI to two of
the standard basis vectors and take a cross product. If you get the zero
vector, take cross products with (A-rI)(e3) as well. (If you still get zero,
that means the eigenspace is at least two-dimensional, in which case it is
hopeless to expect a single function to produce "the" eigenvector. But the
entire eigenspace is still found by looking at the three vectors (A-rI)(ei);
if all are zero, the eigenspace is all of R^3. If (A-rI)(e1) (say) is not
zero, the eigenspace is the span of the cross-products (A-rI)(e1) x ei
(precisely two of which will be linearly independent).)
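The column-and-cross-product procedure just described, fallbacks included,
might look like this in Python/NumPy (names and the degeneracy tolerance
are mine):

```python
import numpy as np

def eigenvector_by_cross_products(A, r, eps=1e-12):
    """Eigenvector of symmetric 3x3 A for eigenvalue r: the eigenspace
    is the orthogonal complement of the image of B = A - rI, so cross
    two independent columns of B (the vectors (A-rI)e_i from the text)."""
    B = A - r * np.eye(3)
    best = np.zeros(3)
    # Try all pairs of columns of B and keep the longest cross product.
    for i in range(3):
        for j in range(i + 1, 3):
            v = np.cross(B[:, i], B[:, j])
            if np.linalg.norm(v) > np.linalg.norm(best):
                best = v
    n = np.linalg.norm(best)
    if n > eps:
        return best / n
    return None  # all cross products ~ 0: eigenspace has dim >= 2
```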
I concede that this is producing a number of "special cases", but in
compensation I'll mention that for most cases you get the eigenvector
quickly as
(A-rI)(e1) x (A-rI)(e2) = (Ae1 x Ae2) - r (e1 x Ae2 + Ae1 x e2) + r^2 e3
where the coefficients of 1 and r are easily computed independent of r
(the coefficient of r is (e)e1+(f)e2+(-a-b)e3, for example).
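A quick numerical spot-check of that expansion (sample entries are
arbitrary; any r works, since the identity is polynomial in r):

```python
import numpy as np

a, b, c, d, e, f = 2.0, 3.0, 5.0, 1.0, 0.5, -0.7
A = np.array([[a, d, e],
              [d, b, f],
              [e, f, c]])
e1, e2, e3 = np.eye(3)
r = 1.234  # any scalar

# Left side: (A-rI)e1 x (A-rI)e2
lhs = np.cross((A - r * np.eye(3)) @ e1, (A - r * np.eye(3)) @ e2)

# Right side, with the coefficients of 1 and r precomputed once:
const = np.cross(A @ e1, A @ e2)                       # coefficient of 1
lin = -(np.cross(e1, A @ e2) + np.cross(A @ e1, e2))   # coefficient of r
# As noted above, the coefficient of r is (e)e1 + (f)e2 + (-a-b)e3:
assert np.allclose(lin, np.array([e, f, -(a + b)]))
rhs = const + r * lin + r ** 2 * e3

assert np.allclose(lhs, rhs)
```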
(The alert reader will note that I really didn't improve things much over
the Hamilton-Cayley result, although the restriction to symmetric matrices
and dimension 3 does allow a little more geometry in the discussion.)
Of course, this discussion presumes the calculation of the eigenvalues
has been done; in practice I imagine that step will take as long as the
admittedly less intuitive manipulations above. I will second the comments
of another poster: you might not really need the eigenvalues or eigenvectors
anyway -- simply to know they exist is sometimes sufficient.
dave