[eigen] accuracy of SVD



We have been investigating various SVD methods, and have some questions about Eigen’s SVD. Eigen implements the 2-sided Jacobi SVD method, and states that it has “proven accuracy”. It also suggests that for a tall matrix, QR with column pivoting is reliable, but QR with complete pivoting should be used for “proven accuracy” [1].
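For reference, here is a minimal, untested sketch of how one exercises the two preconditioner variants described in [1]; the matrix dimensions are placeholders.

    // Minimal sketch of Eigen's JacobiSVD with the two QR preconditioners
    // mentioned in [1]; the 100x20 matrix is only a placeholder.
    #include <Eigen/Dense>
    #include <iostream>

    int main() {
        Eigen::MatrixXd A = Eigen::MatrixXd::Random(100, 20);  // tall test matrix

        // Default preconditioner: QR with column pivoting.
        Eigen::JacobiSVD<Eigen::MatrixXd> svd_colpiv(
            A, Eigen::ComputeThinU | Eigen::ComputeThinV);

        // "Proven accuracy" variant per [1]: QR with complete (full) pivoting.
        // Eigen requires full U and V with this preconditioner.
        Eigen::JacobiSVD<Eigen::MatrixXd, Eigen::FullPivHouseholderQRPreconditioner>
            svd_fullpiv(A, Eigen::ComputeFullU | Eigen::ComputeFullV);

        std::cout << (svd_colpiv.singularValues()
                      - svd_fullpiv.singularValues()).norm() << "\n";
        return 0;
    }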

Questions and concerns:
1) Are there references for those proofs?

2) The claim of “proven accuracy” is vague. What exactly is meant by “accuracy”? This should be documented, at minimum with a reference.

3) Our tests show that Eigen’s SVD does not achieve good relative accuracy (see below) on strongly-scaled square matrices, while LAPACK’s 1-sided Jacobi does. This seems to contradict the claims above. (We haven’t tested tall matrices, which would invoke QR with column or complete pivoting.)

4) In our tests, Eigen’s SVD is extremely slow, even compared to other Jacobi implementations.

5) When compiled with EIGEN_USE_LAPACKE, Eigen replaces its 2-sided Jacobi SVD with LAPACK’s gesvd [2]. With EIGEN_USE_LAPACKE_STRICT, that substitution is disabled and Eigen’s own implementation is used.
However, if accuracy is the concern, it seems Eigen ought to call LAPACK’s 1-sided Jacobi routine, gejsv, which was added in LAPACK 3.2 (2008). While still slow compared to gesvd, gejsv is faster than Eigen’s SVD and has good relative accuracy. (A call sketch follows this list.)
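For concreteness, a hedged sketch of what such a call could look like through LAPACKE; the job flags are our reading of the gejsv documentation and should be double-checked, and the input data here is a placeholder.

    // Sketch of calling LAPACK's 1-sided Jacobi SVD (dgejsv) via LAPACKE.
    // joba='C' targets high relative accuracy under column scaling B = A*D;
    // the remaining job flags are conservative defaults -- verify against the
    // gejsv documentation before use.
    #include <vector>
    #include <lapacke.h>

    int main() {
        const lapack_int m = 100, n = 20;
        std::vector<double> A(m * n, 0.0);                      // column-major, overwritten
        for (lapack_int j = 0; j < n; ++j) A[j + j * m] = 1.0;  // placeholder data

        std::vector<double> sva(n), U(m * n), V(n * n);
        double stat[7];                                         // scaling/condition info
        lapack_int istat[3];

        lapack_int info = LAPACKE_dgejsv(LAPACK_COL_MAJOR, 'C', 'U', 'V', 'N', 'N', 'N',
                                         m, n, A.data(), m, sva.data(),
                                         U.data(), m, V.data(), n, stat, istat);

        // Per the gejsv docs, the singular values may need scaling by stat[0]/stat[1].
        return (int)info;
    }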

It might be advisable to switch to, or add, 1-sided Jacobi. It is likely more efficient, since it accesses the matrix only column-wise, and it has the proven relative accuracy described below. (A bare-bones sketch of the method follows this paragraph.)
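To illustrate the column-wise access pattern, here is a bare-bones sketch of 1-sided (Hestenes) Jacobi; it is only meant to show the structure, not to be a tuned or robust implementation (singular values come out unsorted, and it assumes full column rank).

    // One-sided Jacobi SVD sketch: each rotation touches only two columns of A
    // (and of V), so all accesses are column-wise. Not production code.
    #include <Eigen/Dense>
    #include <cmath>

    void oneSidedJacobiSVD(Eigen::MatrixXd A,   // copied; ends up holding U*Sigma
                           Eigen::VectorXd& sigma,
                           Eigen::MatrixXd& U,
                           Eigen::MatrixXd& V) {
        const int n = static_cast<int>(A.cols());
        V = Eigen::MatrixXd::Identity(n, n);
        const double tol = 1e-15;

        for (int sweep = 0; sweep < 60; ++sweep) {
            bool converged = true;
            for (int p = 0; p < n - 1; ++p) {
                for (int q = p + 1; q < n; ++q) {
                    double app = A.col(p).squaredNorm();
                    double aqq = A.col(q).squaredNorm();
                    double apq = A.col(p).dot(A.col(q));
                    if (std::abs(apq) <= tol * std::sqrt(app * aqq))
                        continue;                    // columns already orthogonal
                    converged = false;

                    // Jacobi rotation that orthogonalizes columns p and q
                    // (classic symmetric-Schur formulas on their 2x2 Gram matrix).
                    double zeta = (aqq - app) / (2.0 * apq);
                    double t = std::copysign(1.0, zeta)
                               / (std::abs(zeta) + std::sqrt(1.0 + zeta * zeta));
                    double c = 1.0 / std::sqrt(1.0 + t * t);
                    double s = c * t;

                    Eigen::VectorXd ap = A.col(p);
                    A.col(p) = c * ap - s * A.col(q);
                    A.col(q) = s * ap + c * A.col(q);
                    Eigen::VectorXd vp = V.col(p);
                    V.col(p) = c * vp - s * V.col(q);
                    V.col(q) = s * vp + c * V.col(q);
                }
            }
            if (converged) break;
        }

        sigma = A.colwise().norm().transpose();      // singular values (unsorted)
        U = A * sigma.cwiseInverse().asDiagonal();   // normalize columns to get U
    }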

Let me clarify what I mean by accuracy for the SVD. Of course, QR iteration, divide-and-conquer, and Jacobi methods all have proven error bounds making them backward stable [3]. Generally what’s meant when discussing a Jacobi method is that it also has good relative accuracy, i.e., even small singular values are computed accurately. Demmel and Veselic [4] proved that:

1) The 2-sided Jacobi symmetric eigenvalue method has good relative accuracy on positive definite matrices.

2) The 1-sided Jacobi SVD method has good relative accuracy. (This method is equivalent to 2-sided Jacobi symmetric eigenvalue on A^T A.)

Specifically, for the SVD they showed if B = AD, where D is a diagonal scaling matrix and the columns of A have unit 2-norm, then

| sigma_i - sigma_i* | / sigma_i* < eta k_A

where sigma_i* are the true singular values of B, eta is a bound on the perturbation in A, and k_A is the condition number of A. Whereas the standard result for QR iteration, etc., would be

| sigma_i - sigma_i* | / sigma_i* < eta k_B.

Crucially, the condition number k_A can be much less than k_B. When k_D is large, making B a strongly scaled matrix, 1-sided Jacobi SVD still retains good relative accuracy, while QR iteration, etc. in general do not.
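As a toy illustration of that gap (our own construction, just to make it concrete): take a well-conditioned A with unit-norm columns and a strongly scaled D, and compare the two condition numbers.

    // Toy example: k_A stays small while k_B = k(A*D) blows up with the scaling.
    #include <Eigen/Dense>
    #include <cmath>
    #include <iostream>

    int main() {
        Eigen::MatrixXd A(2, 2);
        A << 1.0, 1.0 / std::sqrt(2.0),
             0.0, 1.0 / std::sqrt(2.0);            // unit-norm columns, k_A ~ 2.4
        Eigen::VectorXd d(2);
        d << 1.0, 1e-12;                           // k_D = 1e12
        Eigen::MatrixXd B = A * d.asDiagonal();

        auto cond = [](const Eigen::MatrixXd& M) {
            Eigen::JacobiSVD<Eigen::MatrixXd> svd(M);
            return svd.singularValues()(0)
                   / svd.singularValues()(svd.singularValues().size() - 1);
        };
        std::cout << "k_A = " << cond(A) << ", k_B = " << cond(B) << "\n";
        return 0;
    }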

Demmel and Veselic did not prove results for the 2-sided Jacobi SVD method. Matejas and Hari [5, 6] proved that 2-sided Jacobi SVD does have good relative accuracy on certain triangular matrices; they used QR with column pivoting to obtain such triangular matrices. (This suggests complete pivoting is not required.)

Of further interest is Drmac’s recent paper [7] showing that QR with column pivoting followed by QR iteration (gesvd) also has good relative accuracy, and is much faster than Jacobi methods.

We ran tests using random matrices B = AD, where A = USV^T with U and V random orthogonal matrices, and S and D random diagonal matrices whose entries have logs uniform on (log(1/k_A), 0) for S and (log(1/k_D), 0) for D. This is similar to the tests in [4]; a sketch of the construction follows. The results are attached. Each section of the graph has the same k_D but increasing k_A. As k_D becomes large (towards the right), the 1-sided Jacobi methods stay below the dashed line (epsilon), showing good relative accuracy, while the accuracy of Eigen, QR iteration, etc. worsens, eventually yielding no correct digits at all.
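This is not the exact code we ran, only a sketch of the construction described above using Eigen (the sign correction needed for an exactly Haar-distributed Q is omitted).

    // Sketch of the test matrices: B = A*D with A = U*S*V^T, U and V random
    // orthogonal, and log-uniform diagonals for S and D.
    #include <Eigen/Dense>
    #include <cmath>
    #include <random>

    Eigen::MatrixXd randomOrthogonal(int n, std::mt19937& gen) {
        std::normal_distribution<double> gauss(0.0, 1.0);
        Eigen::MatrixXd G(n, n);
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                G(i, j) = gauss(gen);
        Eigen::HouseholderQR<Eigen::MatrixXd> qr(G);
        return qr.householderQ();                  // Q factor of a Gaussian matrix
    }

    Eigen::VectorXd logUniformDiag(int n, double kappa, std::mt19937& gen) {
        // entries whose logs are uniform on (log(1/kappa), 0)
        std::uniform_real_distribution<double> u(std::log(1.0 / kappa), 0.0);
        Eigen::VectorXd d(n);
        for (int i = 0; i < n; ++i) d(i) = std::exp(u(gen));
        return d;
    }

    Eigen::MatrixXd makeTestMatrix(int n, double kA, double kD, std::mt19937& gen) {
        Eigen::MatrixXd A = randomOrthogonal(n, gen)
                            * logUniformDiag(n, kA, gen).asDiagonal()
                            * randomOrthogonal(n, gen).transpose();
        return A * logUniformDiag(n, kD, gen).asDiagonal();   // B = A*D
    }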

Attachment: svd-rel-error.pdf


[1] http://eigen.tuxfamily.org/dox/group__TopicLinearAlgebraDecompositions.html#note3
[2] https://eigen.tuxfamily.org/dox/TopicUsingBlasLapack.html
[3] Trefethen and Bau. Numerical Linear Algebra, 1997. See Lecture 31.
[4] Demmel and Veselic. Jacobi’s method is more accurate than QR, 1992.
[5] Matejas and Hari. Accuracy of the Kogbetliantz method for scaled diagonally dominant triangular matrices, 2010.
[6] Matejas and Hari. On high relative accuracy of the Kogbetliantz method, 2015.
[7] Drmac. Algorithm 977: A QR–Preconditioned QR SVD Method for Computing the SVD with High Accuracy, 2017.

Apologies for the long email. Since our tests seem to contradict the claims on Eigen’s site, we wanted to clarify the issue and also suggest some alternatives, such as LAPACK’s gejsv, that you may want to consider.

  -mark


