Re: [eigen] Status of the eigen solver




arf, sorry, my mistake: it was just a sign error. This is what happens when doing too many things at the same time.

So this is (almost) done, here is the respective commit message:

> SVN commit 867430 by ggael:
>
> add EigenSolver::eigenvectors() method for non symmetric matrices.
> However, for matrices larger than 5x5, it seems there is consistently quite a large error in a very
> few coefficients. I don't know what's going on, but that's certainly not due to numerical issues only.
> (also note that the test with the pseudo eigenvectors fails the same way)
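
A quick way to see which coefficients are off is to print the full residuals of both decompositions. This is only a minimal sketch, assuming eigenvalues() returns a complex vector, eigenvectors() a complex matrix, and that the QR module header pulls in EigenSolver as in the tree above:

#include <Eigen/Core>
#include <Eigen/QR>   // EigenSolver.h lives under Eigen/src/QR in this tree
#include <complex>
#include <iostream>

int main()
{
  const int n = 6;  // the suspicious errors reportedly show up above size 5
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(n, n);  // non-symmetric input

  Eigen::EigenSolver<Eigen::MatrixXd> solver(A);

  // Residual of the complex decomposition: each column should satisfy
  // A*v_k = lambda_k*v_k, so every entry should be around machine precision.
  Eigen::MatrixXcd Ac = A.cast<std::complex<double> >();
  Eigen::MatrixXcd V  = solver.eigenvectors();
  Eigen::MatrixXcd R(n, n);
  for (int k = 0; k < n; ++k)
    R.col(k) = Ac * V.col(k) - solver.eigenvalues()(k) * V.col(k);

  // Residual of the real pseudo decomposition, which reportedly fails the
  // same way: A * pseudoV - pseudoV * pseudoD.
  Eigen::MatrixXd pR = A * solver.pseudoEigenvectors()
                     - solver.pseudoEigenvectors() * solver.pseudoEigenvalueMatrix();

  // Large entries at a few isolated positions point to an indexing or sign
  // bug rather than plain round-off noise.
  std::cout << "complex residual:\n" << R << "\n\n"
            << "pseudo residual:\n"  << pR << "\n";
  return 0;
}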

gael

On Thu, Oct 2, 2008 at 5:23 PM, Gael Guennebaud <gael.guennebaud@xxxxxxxxx> wrote:

hm.. too !

actually, this is also the formula I found, but when I tried it, it did not work at all, so I thought I was making a big mistake! So I tried again, but it is still far from the correct results. Here is my piece of code:

for (int j=0; j<n; j++) // foreach column
{
  if (ei_isMuchSmallerThan(ei_imag(m_eivalues.coeff(j)), ei_real(m_eivalues.coeff(j))))
  {
    // real eigenvalue
    matV.col(j) = m_eivec.col(j);
  }
  else
  {
    // a pair of complex eigenvalues
    for (int i=0; i<n; i++)
    {
      matV.coeffRef(i,j)   = Complex(m_eivec.coeff(i,j), -m_eivec.coeff(i,j+1));
      matV.coeffRef(i,j+1) = Complex(m_eivec.coeff(i,j),  m_eivec.coeff(i,j+1));
    }
    matV.col(j).normalize();
    matV.col(j+1).normalize();
    j++;
  }
}

Note that the "pseudo" real eigenvectors and real block diagonal matrix are both correct (at least we have "A * pseudoV = pseudoV * pseudoBlockDiag"). So the eigenvectors/eigenvalues are correctly grouped, etc...
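
For comparison, here is a self-contained sketch of that conversion which reads the +/-i pairing directly off each 2x2 block instead of off the eigenvalue array, following the derivation quoted below; which column ends up with which sign is exactly the kind of mismatch the sign error mentioned above refers to. The helper name and the explicit pseudoV/pseudoD arguments are only illustrative, not the actual EigenSolver internals:

#include <Eigen/Core>
#include <complex>

// Given the real pseudo eigenvector matrix pseudoV and the block diagonal
// pseudo eigenvalue matrix pseudoD (2x2 blocks of the form [a -b; b a]),
// build the complex eigenvectors column by column using the pairing
//   a + i*b  <->  v_j - i*v_{j+1},    a - i*b  <->  v_j + i*v_{j+1},
// with b read from pseudoD(j+1,j).
Eigen::MatrixXcd complexEigenvectors(const Eigen::MatrixXd& pseudoV,
                                     const Eigen::MatrixXd& pseudoD)
{
  const int n = pseudoV.cols();
  Eigen::MatrixXcd V(n, n);
  for (int j = 0; j < n; ++j)
  {
    if (j+1 == n || pseudoD(j+1, j) == 0.0)
    {
      // 1x1 block: real eigenvalue, the pseudo eigenvector is the eigenvector
      V.col(j) = pseudoV.col(j).cast<std::complex<double> >();
    }
    else
    {
      // 2x2 block [a -b; b a] spanning columns j and j+1
      for (int i = 0; i < n; ++i)
      {
        V(i, j)   = std::complex<double>(pseudoV(i, j), -pseudoV(i, j+1)); // for a + i*b
        V(i, j+1) = std::complex<double>(pseudoV(i, j),  pseudoV(i, j+1)); // for a - i*b
      }
      V.col(j).normalize();
      V.col(j+1).normalize();
      ++j; // both columns of the pair are done
    }
  }
  return V;
}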


gael.



On Thu, Oct 2, 2008 at 4:43 PM, Benoît Jacob <jacob@xxxxxxxxxxxxxxx> wrote:
Hm.

I see, so the "real Schur form" is not doing (block) diagonalization; instead
it is doing (block) triangularization.

If I understand well, our current algorithm produces

A = XSX^{-1}

where:

S is 2x2-block-diagonal
X is invertible, but not necessarily unitary

Right?

In other words our algorithm does 2x2 block diagonalization in a
not-necessarily-orthonormal basis in R^n.

Then I think that this is much much better than a "real Schur decomposition"
for the purpose of finding eigenvectors.

Indeed, suppose that one of the 2x2 blocks is at columns 2k,2k+1.

Let e_i denote the i-th vector of the canonical basis of R^n.
Let x_i = Xe_i denote the i-th column of X.
Let s_ij denote the ij-th coeff of S.

We have A x_i = XSX^{-1}Xe_i = XSe_i for all i.

So A x_{2k} = s_{2k 2k} x_{2k} + s_{2k+1 2k} x_{2k+1}

And likewise

A x_{2k+1} = s_{2k 2k+1} x_{2k} + s_{2k+1 2k+1} x_{2k+1}

Now put f_k = s_{2k 2k} = s_{2k+1 2k+1}
And put g_k = s_{2k+1 2k} = -s_{2k 2k+1}

We have

A x_{2k}   = f_k x_{2k} + g_k x_{2k+1}
A x_{2k+1} = -g_k x_{2k} + f_k x_{2k+1}

Put z_{2k}   = x_{2k} + i x_{2k+1}
And z_{2k+1} = x_{2k} - i x_{2k+1}

We have

A z_{2k}   = (f_k - i g_k) z_{2k}
A z_{2k+1} = (f_k + i g_k) z_{2k+1}

Conclusion: with the current decomposition A=XSX^{-1} with X not unitary, when
you have a 2x2 block in S
(a -b)
(b a )
at columns 2k and 2k+1,
the actual eigenvalues are of course a+ib and a-ib, and the actual
eigenvectors are respectively x_{2k} - i x_{2k+1} and x_{2k} + i x_{2k+1}.
The only "trap" is that the sign of i is reversed between eigenvalues and
eigenvectors.
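
For what it's worth, this claim is easy to check numerically; below is a small standalone program (plain C++, hand-rolled 2x2 matrices, no Eigen dependency, with arbitrarily chosen a, b and X) that verifies A*(x_0 - i*x_1) = (a + i*b)*(x_0 - i*x_1) up to round-off:

#include <complex>
#include <cstdio>

// Build A = X * S * X^{-1} from a non-orthogonal X and a block S = [a -b; b a],
// then verify that z = x_0 - i*x_1 (the columns of X combined as above)
// satisfies A*z = (a + i*b)*z.
int main()
{
  typedef std::complex<double> C;
  const double a = 0.3, b = 1.7;                       // eigenvalues a +/- i*b
  const double X[2][2] = { {2.0, 1.0}, {0.5, 3.0} };   // columns x_0 and x_1
  const double S[2][2] = { {a, -b}, {b, a} };

  // inverse of the 2x2 matrix X
  const double det = X[0][0]*X[1][1] - X[0][1]*X[1][0];
  const double Xi[2][2] = { {  X[1][1]/det, -X[0][1]/det },
                            { -X[1][0]/det,  X[0][0]/det } };

  // A = X * S * X^{-1}
  double XS[2][2], A[2][2];
  for (int i = 0; i < 2; ++i)
    for (int j = 0; j < 2; ++j)
      XS[i][j] = X[i][0]*S[0][j] + X[i][1]*S[1][j];
  for (int i = 0; i < 2; ++i)
    for (int j = 0; j < 2; ++j)
      A[i][j] = XS[i][0]*Xi[0][j] + XS[i][1]*Xi[1][j];

  // z = x_0 - i*x_1 and the eigenvalue it should correspond to
  const C z[2] = { C(X[0][0], -X[0][1]), C(X[1][0], -X[1][1]) };
  const C lambda(a, b);

  // A*z - lambda*z should be numerically zero in both rows
  for (int i = 0; i < 2; ++i)
  {
    const C r = A[i][0]*z[0] + A[i][1]*z[1] - lambda*z[i];
    std::printf("residual row %d: (%g, %g)\n", i, r.real(), r.imag());
  }
  return 0;
}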

So unless there's a definite need for the real Schur form with orthogonal
matrices, I'd say we can forget about it.

By the way...

in the case where A is selfadjoint, we know that eigenvectors corresponding
to different eigenvalues are orthogonal. So in effect the eigenvectors thus
constructed will be orthogonal to one another. If we normalize them and take
care of possible complications in the case of equal eigenvalues, we can ensure
that the eigenvectors thus constructed really form an orthonormal basis, i.e.
the matrix Z is unitary.

Cheers,
Benoit

On Thursday 02 October 2008 12:15:38 Gael Guennebaud wrote:
> On Wed, Oct 1, 2008 at 6:17 PM, Benoît Jacob <jacob@xxxxxxxxxxxxxxx> wrote:
> > Thanks Gael for having done this. The pseudo- prefix is fine by me.
>
> ah, in fact they also correspond to the "real Schur decomposition":
>
> "A = VSV^T, where A, V, S and V^T are matrices that contain real numbers
> only. In this case, V is an orthogonal matrix, V^T is the transpose of V,
> and S is a block upper triangular matrix called the real Schur form. The
> blocks on the diagonal of S are of size 1×1 (in which case they represent
> real eigenvalues) or 2×2 (in which case they are derived from complex
> conjugate eigenvalue pairs)."
>
> The difference is that in our case the strictly upper part of the matrix S
> has been applied to the matrix V which, of course, is not unitary anymore.
> This transformation is applied at the end of the algorithm, so we can still
> access the "true" real Schur form if needed.
>
> So now I guess we have to convert the real form to the complex form before
> extracting the "true" eigenvectors. The second step is well documented
> everywhere, but for the conversion I cannot find anything other than the
> manual page of the corresponding Matlab function which, of course, is
> useless for us!
>
> my 2 cents,
> gael.
>
> > BTW, in the latest commit-digest, some screensavers got ported to eigen2:
> > http://commit-digest.org/issues/2008-09-21/
> >
> > Cheers,
> > Benoit
> >
> > On Wednesday 01 October 2008 12:32:00 Gael Guennebaud wrote:
> > > FYI:
> > >
> > > [12:18] <CIA-20> ggael * r866561 eigen2/trunk/kdesupport/eigen2/
> > > (Eigen/src/QR/EigenSolver.h test/eigensolver.cpp):
> > > [12:18] <CIA-20> Fixes in Eigensolver:
> > > [12:18] <CIA-20> * eigenvectors => pseudoEigenvectors
> > > [12:18] <CIA-20> * added pseudoEigenvalueMatrix
> > > [12:18] <CIA-20> * clarify the documentation
> > > [12:18] <CIA-20> * added respective unit test
> > > [12:18] <CIA-20> Still missing: a proper eigenvectors() function.
> > >
> > > would be nice to get a proper eigenvectors() function before 2.0, anybody?
> >
> > > also the current implementation actually comes from: "Algol procedure hqr2,
> > > by Martin and Wilkinson, Handbook for Auto. Comp., Vol.ii-Linear Algebra,
> > > and the corresponding Fortran subroutine in EISPACK", so I guess we
> > > can find further information about this issue there.
> > >
> > > Cheers,
> > > gael.
> > >
> > > On Thu, Sep 25, 2008 at 3:15 AM, Benoît Jacob <jacob@xxxxxxxxxxxxxxx> wrote:
> > > > OK I see...
> > > >
> > > > This seems like a very clever idea, but I am uncomfortable with that
> > > > stuff being called eigenvectors and eigenvalues, since that's not
> > > > exactly what it is. I'm all for it as long as it has a slightly different
> > > > name.
> > > >
> > > > Unless someone explains to me how this variant is what's actually
> > > > useful in most applications.
> > > >
> > > > Personally I don't see any urgent need for that (though it interests
> > > > me highly), so Gael, if you're too busy you can ifdef that out for the 2.0
> > > > release, no problem. Somebody will eventually step up to help ;)
> > > >
> > > > I just think this must be sorted out cleanly before it enters our
> > > > stable API.
> > > >
> > > > Cheers,
> > > > Benoit
> > > >
> > > > PS we're mentioned here,
> > > > http://www.gamedev.net/community/forums/topic.asp?topic_id=509161&whichpage=1&#3318752
> > > >
> > > > On Thursday 25 September 2008 02:48:44 Gael Guennebaud wrote:
> > > > > yes, in fact this piece of code has been borrowed from Jama, and
> > > > > they use a trick to keep the "eigenvectors" real.
> > > > >
> > > > > so these are their notes:
> > > > >
> > > > >     If A is symmetric, then A = V*D*V' where the eigenvalue matrix D
> > > > >     is diagonal and the eigenvector matrix V is orthogonal. That is,
> > > > >     the diagonal values of D are the eigenvalues, and
> > > > >     V*V' = I, where I is the identity matrix.  The columns of V
> > > > >     represent the eigenvectors in the sense that A*V = V*D.
> > > > >
> > > > >     If A is not symmetric, then the eigenvalue matrix D is block
> > > > >     diagonal with the real eigenvalues in 1-by-1 blocks and any complex
> > > > >     eigenvalues, a + i*b, in 2-by-2 blocks, [a, b; -b, a].  That is, if
> > > > >     the complex eigenvalues look like
> > > > > <pre>
> > > > >
> > > > >           u + iv     .        .          .      .    .
> > > > >             .      u - iv     .          .      .    .
> > > > >             .        .      a + ib       .      .    .
> > > > >             .        .        .        a - ib   .    .
> > > > >             .        .        .          .      x    .
> > > > >             .        .        .          .      .    y
> > > > > </pre>
> > > > >         then D looks like
> > > > > <pre>
> > > > >
> > > > >             u        v        .          .      .    .
> > > > >            -v        u        .          .      .    .
> > > > >             .        .        a          b      .    .
> > > > >             .        .       -b          a      .    .
> > > > >             .        .        .          .      x    .
> > > > >             .        .        .          .      .    y
> > > > > </pre>
> > > > >
> > > > >     This keeps V a real matrix in both symmetric and non-symmetric
> > > > >     cases, and A*V = V*D.
> > > > >
> > > > >
> > > > > I guess it should be fairly easy to extract the "true" eigenvectors
> > > > > from the current ones, but unfortunately, I won't have time to look
> > > > > into it soon.
> > > > >
> > > > > gael.
> > > > >
> > > > > On Thu, Sep 25, 2008 at 2:33 AM, Timothy Hunter <tjhunter@xxxxxxxxxxxx> wrote:
> > > > > > For your information, the eigenvalues returned seem correct (as
> > > > > > checked with random matrices against Matlab's computations).
> > > > > > Also, the computed eigenvectors have no imaginary part and are
> > > > > > different.
> >
> > > > > > On Wed, Sep 24, 2008 at 1:47 PM, Gael Guennebaud
> > > > > > <gael.guennebaud@xxxxxxxxx> wrote:
> > > > > > > ah, thanks for the fix. Actually it is supposed to work for
> > > > > > > non-symmetric matrices. At least it worked a while ago. I'll check
> > > > > > > that when I have time....
> > > > > > >
> > > > > > > gael.
> > > > > > >
> > > > > > > On Wed, Sep 24, 2008 at 10:41 PM, Benoît Jacob <jacob@xxxxxxxxxxxxxxx> wrote:
> > > > > > >> On Wednesday 24 September 2008 22:14:12 Timothy Hunter wrote:
> > > > > > >> > Hello,
> > > > > > >> > I would like to compute eigenvalues for non-symmetric matrices.
> > > > > > >> > I tried to use the EigenSolver but it does not compile because it
> > > > > > >> > cannot find the function .block(int, int) (see attached log).
> > > > > > >>
> > > > > > >> OK, this function was recently renamed to segment(int,int).
> > > > > > >> Fixed in r864468.
> > > > > > >>
> > > > > > >> > Also, the tests for non-self-adjoint matrices are disabled in
> > > > > > >> > test/eigensolver.cpp.
> > > > > > >> > Does it mean this class is not ready yet?
> > > > > > >>
> > > > > > >> I tried re-enabling this test, and fixed it so it compiled,
> > > > > > >> and the result is that with a real, non-selfadjoint matrix,
> > > > > > >> the unit-test failed at line 117.
> > > > > > >> So please consider it as not yet ready for now.
> > > > > > >>
> > > > > > >> Gael, if it's not yet ready maybe #ifdef it out for now.
> > > > > > >>
> > > > > > >> Cheers,
> > > > > > >> Benoit





