From: spellucci@mathematik.th-darmstadt.de (Peter Spellucci)
Newsgroups: sci.math.num-analysis
Subject: Re: Is there anything faster than Gauss-Jordan elimination?
Date: 9 Jan 1998 15:58:19 GMT
In article <34B61F01.11EE@pci.unizh.ch>, vadim writes:
|> Hi, everybody!
|>
|> Can someone give me a hint? I'm looking for a procedure for the solution of a
|> system of linear equations. Unfortunately Gauss-Jordan elimination is too
|> long because matrix is too big and has singularities. Is there anything
|> faster, some sort of iterations or whatever?
|> Thanks a lot, Vadim.
Of course there is something "faster" than Gauss-Jordan, namely the
recursive use of block 2x2 matrix inversion combined with Strassen's
fast matrix multiplication, coming down from O(n**3) to O(n^log_2 7),
see Num. Math. 13 (1969), 354 - 356.
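To make the O(n^log_2 7) remark concrete, here is a minimal sketch of Strassen's multiplication (the building block of the recursive inversion scheme). It assumes square matrices whose size is a power of 2 and recurses all the way down; a practical code would pad to a power of 2 and switch to ordinary multiplication below some crossover size.

```python
import numpy as np

def strassen(A, B):
    """Strassen's matrix multiplication, O(n^log2(7)) multiplications.
    Sketch only: assumes A, B are n x n with n a power of 2."""
    n = A.shape[0]
    if n <= 2:                       # small base case: plain multiplication
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # the seven recursive products (instead of the naive eight)
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # reassemble the four blocks of C = A B
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```

Note that the constant hidden in O(n^log_2 7) is large, so the method only pays off for rather big n.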
But this seems not to be your true question. If your matrix is big
(better say sparse, since for a full matrix little helps), then sparse
LU-elimination might help. But you also write about "singularities".
This is an undefined term (you surely do not mean that some matrix elements
are functions which have some (complex) singularities?), so let us
assume that you mean "singular". In this case neither Gauss-Jordan
nor sparse LU nor whatever else you might select applies. First you must
define what you want in this case. In general a singular system will
be incompatible, and then the minimum-norm least-squares solution is
usually the proper selection. This may be computed (for large dimension)
e.g. by Kaczmarz's method, by a modification of LSQR (using the SVD of the
bidiagonal matrix rather than QR, as in Paige & Saunders), or by other
iterative techniques for minimizing ||Ax-b||**2 + \alpha ||x||**2. But you
should be aware of the fact that these techniques are slow in general, and
that you get an approximate solution only. What to do depends very much on
the specific application you have in mind.
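To give an idea of what such an iterative technique looks like, here is a toy sketch of the classical (cyclic) Kaczmarz iteration. Each step projects the current iterate onto the hyperplane defined by one equation; started from x = 0 it tends, for a consistent system, toward the minimum-norm solution. The function name and the fixed sweep count are mine, not part of any standard library:

```python
import numpy as np

def kaczmarz(A, b, sweeps=200):
    """Cyclic Kaczmarz iteration for A x = b (toy version).
    One step:  x <- x + (b_i - a_i . x) / ||a_i||^2 * a_i,
    i.e. an orthogonal projection onto the i-th equation's hyperplane.
    Starting from x = 0, a consistent (possibly singular) system is
    driven toward its minimum-norm solution; zero rows are skipped."""
    m, n = A.shape
    x = np.zeros(n)
    row_norms = np.einsum('ij,ij->i', A, A)    # ||a_i||^2 for each row
    for _ in range(sweeps):
        for i in range(m):
            if row_norms[i] == 0.0:
                continue
            x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```

For a rank-deficient but consistent example such as A = [[1,2],[2,4]], b = [3,6], the iterate agrees with the pseudoinverse solution pinv(A) b. A serious code would of course use a proper stopping criterion and possibly randomized row selection instead of a fixed number of sweeps.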
hope this helps
peter