[eigen] Golub-Kahan bidiagonalization
- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: [eigen] Golub-Kahan bidiagonalization
- From: Benoit Jacob <jacob.benoit.1@xxxxxxxxx>
- Date: Tue, 28 Apr 2009 14:28:25 +0200
Hi,

This is mostly a follow-up to an old conversation with Keir, as things
are a little clearer in my mind now, but it is also meant to gather some
feedback if there is an expert around.

What I discuss here applies equally to tridiagonalization and
Hessenberg reduction.
1) Instability in the Golub-Kahan bidiagonalization algorithm

This is algorithm 5.4.2 in Golub & Van Loan, called "Householder
bidiagonalization". It consists of left-multiplying by a Householder
reflection to put the first column in the form (x,0,...,0), then
right-multiplying by another Householder reflection to put the remainder
of the first row (that is, everything except the first entry) in the form
(y,0,...,0), and so on down the diagonal.
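For concreteness, here is a minimal sketch of that process for a square
matrix, using only basic Eigen operations. The function name is made up,
and U, V are not accumulated, to keep the sketch short; this is not the
actual Eigen implementation.

#include <Eigen/Dense>
#include <cmath>

// Overwrites A (assumed square, n x n) with the bidiagonal matrix obtained
// by applying the left and right Householder reflections in turn.
void bidiagonalizeSketch(Eigen::MatrixXd& A)
{
  const int n = A.rows();
  for (int k = 0; k < n; ++k)
  {
    // Left reflection: zero out the entries below the diagonal in column k.
    Eigen::VectorXd x = A.col(k).tail(n - k);
    double alpha = -std::copysign(x.norm(), x(0));
    Eigen::VectorXd v = x;
    v(0) -= alpha;                       // v = x - alpha*e1 (no cancellation)
    double v2 = v.squaredNorm();
    if (v2 > 0.0)
    {
      Eigen::RowVectorXd w = v.transpose() * A.bottomRightCorner(n - k, n - k);
      A.bottomRightCorner(n - k, n - k) -= (2.0 / v2) * v * w;
    }

    // Right reflection: zero out the entries of row k beyond the superdiagonal.
    if (k < n - 2)
    {
      Eigen::VectorXd y = A.row(k).tail(n - k - 1).transpose();
      double beta = -std::copysign(y.norm(), y(0));
      Eigen::VectorXd u = y;
      u(0) -= beta;
      double u2 = u.squaredNorm();
      if (u2 > 0.0)
      {
        Eigen::VectorXd z = A.bottomRightCorner(n - k, n - k - 1) * u;
        A.bottomRightCorner(n - k, n - k - 1) -= (2.0 / u2) * z * u.transpose();
      }
    }
  }
}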
The problem that I see is that if some column is close to zero, then a
small perturbation of it results in a large perturbation of the
corresponding Householder vector. This is because the Householder
vector is normalized, and so this amounts to normalizing a near-zero
vector. Additional instability comes from the fact that this involves
taking the square root of a near-zero number (to compute the norm).
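Here is a tiny numerical illustration of that sensitivity, assuming the
usual Householder formula v = x - alpha*e1 with alpha = -sign(x_1)*||x||:
two near-zero columns that differ by about 1e-16 give Householder
directions that differ by order one.

#include <Eigen/Dense>
#include <cmath>
#include <iostream>

// Normalized direction of the Householder vector v = x - alpha*e1,
// with alpha = -sign(x(0)) * ||x||.
Eigen::Vector3d householderDirection(const Eigen::Vector3d& x)
{
  Eigen::Vector3d v = x;
  v(0) += std::copysign(x.norm(), x(0));
  return v.normalized();
}

int main()
{
  Eigen::Vector3d x(1e-16, 0.0, 0.0);
  Eigen::Vector3d y(0.0, 1e-16, 0.0);    // differs from x by ~1.4e-16 in norm
  std::cout << (householderDirection(x) - householderDirection(y)).norm() << "\n";
  // Prints roughly 0.77: an O(1) change in the reflection caused by an
  // O(1e-16) change in the column.
}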
2) Proposed solution: full pivoting

Alongside the plain bidiagonalization, we could offer a variant that works
as follows; it would be slower but more reliable.

At every step, before computing the Householder vector of a column, we
could look up the (squared) norms of all remaining columns and permute
columns so that the biggest remaining column is the one that we put into
the form (x,0,...,0). Same for rows. These permutation matrices, being
unitary, fit perfectly into the decomposition, which is already of the
form UAV with U, V two independent unitaries.

Since Golub-Kahan bidiagonalization already has (8/3)n^3 complexity, the
relative overhead of full pivoting here would be a bit lower than it is
for LU.
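For illustration, the column-pivoting step could look like the sketch
below (names are made up; the pivots are recorded LAPACK-style, and a
symmetric step could be applied to the rows):

#include <Eigen/Dense>
#include <vector>

// At step k: swap the remaining column with the largest squared norm
// (over its trailing part, rows k..end, which is all that the reflection
// will act on) into position k, and remember the transposition so that
// the permutation can later be folded into V.
void pivotColumnsStep(Eigen::MatrixXd& A, std::vector<int>& colPivots, int k)
{
  const int m = A.rows();
  const int n = A.cols();
  int best = k;
  double bestNorm2 = A.col(k).tail(m - k).squaredNorm();
  for (int j = k + 1; j < n; ++j)
  {
    double norm2 = A.col(j).tail(m - k).squaredNorm();
    if (norm2 > bestNorm2) { best = j; bestNorm2 = norm2; }
  }
  colPivots[k] = best;                 // record the transposition (k, best)
  if (best != k)
    A.col(k).swap(A.col(best));
}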
3) API for that

The standard Bidiagonalization and the pivoting Bidiagonalization would be
two separate classes, but they could have a common base class abstracting
that difference away from the SVD. The QR iteration used to compute the
SVD is the same in both cases.
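A hypothetical sketch of what that common base class could look like
(class and member names are invented here for illustration; none of this
is existing Eigen API):

#include <Eigen/Dense>

// Common interface: the SVD's QR iteration only needs the bidiagonal
// matrix and the two accumulated unitaries, regardless of whether
// pivoting was used.
class BidiagonalizationBase
{
public:
  virtual ~BidiagonalizationBase() {}
  const Eigen::MatrixXd& bidiagonal() const { return m_B; }
  const Eigen::MatrixXd& matrixU() const { return m_U; }
  const Eigen::MatrixXd& matrixV() const { return m_V; }
protected:
  Eigen::MatrixXd m_B, m_U, m_V;   // filled in by the derived classes
};

// Plain Golub-Kahan bidiagonalization.
class Bidiagonalization : public BidiagonalizationBase
{
public:
  explicit Bidiagonalization(const Eigen::MatrixXd& A) { /* compute m_U, m_B, m_V */ }
};

// Variant with row and column pivoting; the permutations are folded into
// m_U and m_V, so the interface seen by the SVD is identical.
class FullPivBidiagonalization : public BidiagonalizationBase
{
public:
  explicit FullPivBidiagonalization(const Eigen::MatrixXd& A) { /* compute with pivoting */ }
};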
Cheers,
Benoit