Re: [eigen] SVD with singular matrices



2010/9/30  <hamelin.philippe@xxxxxxx>:
> Knowing there is a major problem with the current SVD, I think it shouldn't be available in the API until the problem is resolved. So both options are valid, and the choice mainly comes down to this question: in the long term, do you want to maintain 2 SVD algorithms?

Yes, I do! Because any SVD implementation is a trade-off between
reliability and speed. There are two major families of algorithms:

1. Jacobi methods (reliable and typically slow)
2. Bidiagonal methods (not so reliable and typically faster)

Our JacobiSVD represents the extreme stance on reliability. It is even
more accurate than most other implementations, because it is two-sided
Jacobi, meaning that the unitaries U and V are computed separately,
which is more accurate than the more usual one-sided Jacobi approach
(deducing V from U). This allows reconstructing the original matrix to
roughly machine precision, regardless of matrix size. (TODO: add this
as a unit test.)
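
For reference, such a unit test could look roughly like this (a minimal
sketch, assuming the JacobiSVD API with the ComputeFullU / ComputeFullV
options; the 50x50 size and the deliberately zeroed column are arbitrary
illustrative choices):

    #include <iostream>
    #include <Eigen/Dense>

    int main()
    {
      using namespace Eigen;

      // Arbitrary test matrix, made exactly rank-deficient on purpose.
      MatrixXd a = MatrixXd::Random(50, 50);
      a.col(3).setZero();

      JacobiSVD<MatrixXd> svd(a, ComputeFullU | ComputeFullV);

      // Reconstruct a = U * Sigma * V^T and measure the relative error.
      MatrixXd sigma = MatrixXd::Zero(a.rows(), a.cols());
      sigma.diagonal() = svd.singularValues();
      MatrixXd rec = svd.matrixU() * sigma * svd.matrixV().transpose();

      // Expected to be on the order of machine epsilon for double.
      std::cout << "relative reconstruction error: "
                << (rec - a).norm() / a.norm() << std::endl;
      return 0;
    }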

Within 2. there are sub-categories:
 2a. Bidiagonalization + iteration using Givens rotations.
     ---> that's not very reliable. It's the approach of LAPACK's
SGESVD, and also of our SVD class. But our SVD class has an
additional bug making it fail on exactly singular matrices.
 2b. Bidiagonalization + divide-and-conquer.
     ---> that's the approach of LAPACK's SGESDD and the one I want to
follow for Eigen's new SVD. It removes the unreliability of the
iterative part, so it's both faster and more reliable than 2a. But
there remains the inherent unreliability of bidiagonalizing, making it
forever less reliable than JacobiSVD.

Benoit

>
> Philippe
>
> -----Original Message-----
> From: Listengine [mailto:listengine@xxxxxxxxxxxxxxxxx] On behalf of Benoit Jacob
> Sent: September 30, 2010 07:27
> To: eigen@xxxxxxxxxxxxxxxxxxx
> Subject: Re: [eigen] SVD with singular matrices
>
> 2010/9/30 Benoit Jacob <jacob.benoit.1@xxxxxxxxx>:
>> 2010/9/29 Keir Mierle <mierle@xxxxxxxxx>:
>>> On Wed, Sep 29, 2010 at 8:10 AM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx>
>>> wrote:
>>>>
>>>> Here's a diff for the SVD unit test, exposing the problem.
>>>>
>>>> The sorting of the singular values isn't the problem; the bidiagonalization is.
>>>>
>>>> The most productive thing I can say is: let's declare SVD
>>>> not-for-exactly-singular-matrices for now, and... yeah, yeah, make
>>>> the new SVD happen :-P ... in the meantime you have JacobiSVD.
>>>>
>>>
>>> This is a serious problem. Almost all multiview geometry hackers regularly
>>> solve matrices that are exactly singular (i.e. ones that start out as 3x5 and
>>> get extended with two rows of zeros to be 5x5). This is because many problems
>>> in multiview geometry produce homogeneous linear equations. I use the SVD now
>>> in libmv and it seems to work, but now I'm worried.
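>>>
>>> (For concreteness, that homogeneous-equation pattern looks roughly like
>>> this with JacobiSVD; a minimal sketch with a hypothetical nullspaceSolve
>>> helper and illustrative sizes, not actual libmv code:)
>>>
>>>     #include <Eigen/Dense>
>>>     using namespace Eigen;
>>>
>>>     // Solve a * x = 0 (up to scale): take the right singular vector
>>>     // associated with the smallest singular value, i.e. the last
>>>     // column of V (singular values come out sorted in decreasing order).
>>>     VectorXd nullspaceSolve(const MatrixXd& a)
>>>     {
>>>       JacobiSVD<MatrixXd> svd(a, ComputeFullV);
>>>       return svd.matrixV().col(a.cols() - 1);
>>>     }
>>>
>>>     // Usage: a 3x5 system padded with two zero rows to make it 5x5.
>>>     //   MatrixXd a = MatrixXd::Zero(5, 5);
>>>     //   a.topRows(3) = MatrixXd::Random(3, 5);
>>>     //   VectorXd x = nullspaceSolve(a);   // then a * x is ~0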
>>
>> I know, that's why in my next e-mail I proposed that we temporarily,
>> internally, replace the SVD implementation with JacobiSVD. That will turn
>> it into the most reliable SVD ever, the downside being that for large
>> matrices it's relatively slow (IIRC about twice as slow as the current SVD)
>> (but for small matrices such as yours it's fast, so you should be
>> using JacobiSVD anyway).
>
> Actually I am now pretty convinced that we should do this; the
> question I'm really wondering about is the following. Right now I'm
> making API changes so that JacobiSVD is a drop-in replacement for SVD.
> So:
>  - 1) should we just plain remove SVD and tell people to use JacobiSVD?
>  - 2) or should we, as said above, just internally reimplement SVD on
> top of JacobiSVD?
>
> I'm getting more and more in favor of 1). Like in other modules, we'd use
> explicit decomposition names. My concern is that with 2), in Eigen 3.1
> we'd silently replace SVD by a faster but NOT AS RELIABLE
> implementation. Dangerous!
>
> Benoit
>
>>
>> Benoit
>>
>>
>>> Keir
>>>
>>>>
>>>> Benoit
>>>>
>>>> 2010/9/29 Benoit Jacob <jacob.benoit.1@xxxxxxxxx>:
>>>> > OK, I've attached a compilable variant that also prints the
>>>> > reconstructed matrix.
>>>> >
>>>> > It turns out that the reconstructed matrix is
>>>> >  1 0
>>>> >  1 0
>>>> >
>>>> > I.e. we seem to have a problem with row/column permutations, which
>>>> > could be related to the sorting of the singular values.
>>>> >
>>>> > I'm trying to understand why our unit test is not catching this. It's
>>>> > true that we're not testing matrices with exactly 0 singvals.
>>>> >
>>>> > Benoit
>>>> >
>>>> > 2010/9/29  <hamelin.philippe@xxxxxxx>:
>>>> >> This simple test fails:
>>>> >>
>>>> >> #include <iostream>
>>>> >> #include <Eigen/Dense>
>>>> >> using namespace Eigen;
>>>> >> using namespace std;
>>>> >>
>>>> >> bool testSingularMatrixSVD()
>>>> >> {
>>>> >>    MatrixXd a(2,2), avalidation(2,2);
>>>> >>    unsigned int rows = 2;
>>>> >>    unsigned int cols = 2;
>>>> >>
>>>> >>    // Exactly singular (rank 1) test matrix.
>>>> >>    a <<    1,     1,
>>>> >>            0,     0;
>>>> >>
>>>> >>    Eigen::SVD<MatrixXd> svd(a);
>>>> >>    MatrixXd sigma = MatrixXd::Zero(rows,cols);
>>>> >>    MatrixXd matU  = MatrixXd::Zero(rows,rows);
>>>> >>    sigma.diagonal() = svd.singularValues();
>>>> >>    matU = svd.matrixU();
>>>> >>    avalidation = matU * sigma * svd.matrixV().transpose();
>>>> >>
>>>> >>    cout << "testSingularMatrixSVD() : ";
>>>> >>    if((a).isApprox(avalidation, 1e-12))
>>>> >>    {
>>>> >>        cout << "Success." << endl;
>>>> >>        return true;
>>>> >>    }
>>>> >>    else
>>>> >>    {
>>>> >>        cout << "Fail." << endl;
>>>> >>        return false;
>>>> >>    }
>>>> >>
>>>> >> }
>>>> >>
>>>> >>
>>>> >> -----Original Message-----
>>>> >> From: Listengine [mailto:listengine@xxxxxxxxxxxxxxxxx] On behalf of
>>>> >> Benoit Jacob
>>>> >> Sent: September 29, 2010 10:32
>>>> >> To: eigen@xxxxxxxxxxxxxxxxxxx
>>>> >> Subject: Re: [eigen] SVD with singular matrices
>>>> >>
>>>> >> 2010/9/29  <hamelin.philippe@xxxxxxx>:
>>>> >>> Hello,
>>>> >>>
>>>> >>> When updating from eigen2 to eigen3, I found that my pseudo-inverse
>>>> >>> was not working correctly for singular matrices. However, just
>>>> >>> replacing SVD with JacobiSVD makes it work. Before looking further,
>>>> >>> is there any known limitation of SVD with singular matrices such as:
>>>> >>>
>>>> >>> [1 1; 0 0]
>>>> >>
>>>> >> JacobiSVD always works, and is always precise. But it's slow for large
>>>> >> matrices.
>>>> >>
>>>> >> SVD uses the Golub-Kahan bidiagonalization approach. So it's faster,
>>>> >> but potentially inaccurate for certain singular matrices.
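>>>> >>
>>>> >> (For the pseudo-inverse use case above, a JacobiSVD-based version can be
>>>> >> sketched roughly as follows; pseudoInverse is a hypothetical helper and
>>>> >> the 1e-10 cut-off for treating singular values as zero is an arbitrary
>>>> >> illustrative choice:)
>>>> >>
>>>> >>     #include <Eigen/Dense>
>>>> >>     using namespace Eigen;
>>>> >>
>>>> >>     MatrixXd pseudoInverse(const MatrixXd& a, double tol = 1e-10)
>>>> >>     {
>>>> >>       JacobiSVD<MatrixXd> svd(a, ComputeThinU | ComputeThinV);
>>>> >>       VectorXd s = svd.singularValues();
>>>> >>       // Invert only the nonzero singular values and leave the rest at
>>>> >>       // zero, so exactly singular matrices like [1 1; 0 0] are handled.
>>>> >>       VectorXd sInv = VectorXd::Zero(s.size());
>>>> >>       for (int i = 0; i < s.size(); ++i)
>>>> >>         if (s(i) > tol) sInv(i) = 1.0 / s(i);
>>>> >>       // pinv(a) = V * S^+ * U^T
>>>> >>       return svd.matrixV() * sInv.asDiagonal() * svd.matrixU().transpose();
>>>> >>     }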
>>>> >>
>>>> >> I would be surprised, though, if the above 2x2 matrix were enough to
>>>> >> expose problems in it. That would be a bug. Can you make a compilable
>>>> >> test case?
>>>> >>
>>>> >> Thanks,
>>>> >> Benoit
>>>> >>
>>>> >>>
>>>> >>> Thank you,
>>>> >>>
>>>> >>> ------------------------------------
>>>> >>> Philippe Hamelin, ing. jr, M. Ing
>>>> >>> Chercheur / Researcher
>>>> >>>
>>>> >>> T: 450-652-8499 x2198
>>>> >>> F: 450-652-1316
>>>> >>>
>>>> >>> Expertise robotique et civil
>>>> >>> Institut de recherche d'Hydro-Québec (IREQ)
>>>> >>> 1740, boul. Lionel-Boulet
>>>> >>> Varennes (QC) J3X 1S1, Canada
>>>> >>>
>>>> >>
>>>> >>
>>>> >>
>>>> >>
>>>> >>
>>>> >
>>>
>>>
>>
>
>
>
>
>


