Re: [eigen] Re: LU precision tuning



Ah, very interesting.
A class named QR should guarantee that Q is square, but it would be
interesting to have a ThinQR class alongside it.
Cheers,
Benoit
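
For concreteness, here is a minimal sketch of the two shape conventions
being discussed: a full QR with a square m x m Q and an m x n R, versus a
thin variant that keeps only the first n columns of Q and the top n x n
block of R. It is written against the current Eigen 3 HouseholderQR API,
which is an assumption here since it postdates the QR class debated in
this thread.

  #include <Eigen/Dense>
  #include <iostream>

  int main()
  {
    using Eigen::MatrixXd;
    const int m = 5, n = 3;                 // rectangular, m > n
    MatrixXd A = MatrixXd::Random(m, n);

    Eigen::HouseholderQR<MatrixXd> qr(A);

    // Full QR: Q is m x m (square), R is m x n with zero rows at the bottom.
    MatrixXd Q = qr.householderQ();
    MatrixXd R = qr.matrixQR().triangularView<Eigen::Upper>();

    // Thin ("economy-size") QR: Q is m x n, R is n x n.
    MatrixXd thinQ = qr.householderQ() * MatrixXd::Identity(m, n);
    MatrixXd thinR = R.topRows(n);

    // Both factorizations reproduce A up to rounding error.
    std::cout << (A - Q * R).norm() << " "
              << (A - thinQ * thinR).norm() << std::endl;
  }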

2009/5/11, Márton Danóczy <marton78@xxxxxxxxx>:
>> Aha, the misunderstanding comes from an error in our current QR.
>>
>> Normally, in the QR decomposition, Q is always square; it's R that
>> adapts to the rectangular size.
>> http://en.wikipedia.org/wiki/QR_decomposition#Rectangular_matrix
>>
>> But here in our QR, it's Q that adapts to the rectangular size.
>>
>> That's another problem, so I wouldn't currently consider our QR
>> reliable for non-square matrices.
>
> Not necessarily. There are two versions of QR. In MATLAB they are
> qr(A) and qr(A,0); the second is called "economy-size" QR, or "thin
> QR" by Golub & Van Loan, see the Wikipedia page. The thin version is
> useful when looking for the least-squares solution of an
> overdetermined system, i.e. when A is m x n with m > n. In that case,
> a full-size Q (m x m) is a waste of memory; it's better to have Q be
> m x n and R be n x n. If A is square or m < n, however, Q should be
> m x m and R m x n.
>
> Marton
>
>
>
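
To make the least-squares point from the quoted message concrete: for an
overdetermined system A x = b with A of size m x n and m > n, only the thin
factors are needed, since x = R_thin^{-1} Q_thin^T b. Below is a minimal
sketch of that computation, again assuming the current Eigen 3
HouseholderQR API rather than the 2009-era class discussed above.

  #include <Eigen/Dense>
  #include <iostream>

  int main()
  {
    using Eigen::MatrixXd;
    using Eigen::VectorXd;
    const int m = 6, n = 2;                 // overdetermined: m > n
    MatrixXd A = MatrixXd::Random(m, n);
    VectorXd b = VectorXd::Random(m);

    Eigen::HouseholderQR<MatrixXd> qr(A);

    // Least-squares solution as the decomposition class provides it.
    VectorXd x = qr.solve(b);

    // The same solution written out with the thin factors only:
    // Q_thin is m x n, R_thin is n x n, and x = R_thin^{-1} * Q_thin^T * b.
    MatrixXd thinQ = qr.householderQ() * MatrixXd::Identity(m, n);
    MatrixXd thinR = qr.matrixQR().topRows(n).triangularView<Eigen::Upper>();
    VectorXd y = thinR.triangularView<Eigen::Upper>().solve(thinQ.transpose() * b);

    std::cout << (x - y).norm() << std::endl;   // ~0
  }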


