Re: [eigen] Re: LU precision tuning


2009/5/11, Hauke Heibel <hauke.heibel@xxxxxxxxxxxxxx>:
>> I'll commit that tomorrow, just a slight
>> modification of your patch to default to "epsilon*size".
> You have to take care here. If the user passes his own precision,
> it needs to be ensured that it is used instead of the pre-defined
> epsilon*size. The default parameter for the ctor cannot be set to the
> size of the matrix, so I would set it to zero, and whenever
> precision <= 0 is passed to the function we set
> precision = epsilon*size.

Yep, we'll have two separate ctors, but they'll both call a common
compute() function.
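The sentinel-default pattern being discussed could be sketched like this (a NumPy illustration, not Eigen code; the class and member names here are hypothetical): a default parameter cannot depend on the matrix size, so both "ctors" funnel into a common compute() that substitutes epsilon*size whenever no usable precision was supplied.

```python
import numpy as np

class LUDecomposition:
    """Hypothetical sketch of the default-precision handling discussed above."""

    def __init__(self, matrix, precision=None):
        # Both construction paths call the common compute() step.
        self.compute(matrix, precision)

    def compute(self, matrix, precision):
        if precision is None or precision <= 0:
            # No user precision given: default to machine epsilon
            # scaled by the matrix size.
            precision = np.finfo(matrix.dtype).eps * max(matrix.shape)
        self.precision = precision

m = np.eye(4)
lu_default = LUDecomposition(m)        # falls back to epsilon * size
lu_custom = LUDecomposition(m, 1e-10)  # user-supplied precision wins
```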

>> Also, there's a problem in your create...() function for rectangular
>> sizes: it's d that should be rectangular, not a; the matrix m you
>> create is always square as a product of 3 square matrices. I'll take
>> care of that.
> I don't get it. I create 'a' being rows \times cols, 'd' being cols
> \times 1, and finally 'b' being cols \times cols. Then, after
> multiplication, you arrive at
> m = a * d.asDiagonal() * b
> (rows \times cols) * (cols \times cols) * (cols \times cols) = (rows \times cols)
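The shape arithmetic in the quoted construction can be checked directly; a NumPy sketch (names a, d, b follow the email, the sizes 5 and 3 are just illustrative):

```python
import numpy as np

rows, cols = 5, 3
a = np.random.rand(rows, cols)   # rows x cols
d = np.random.rand(cols)         # cols x 1, the diagonal entries
b = np.random.rand(cols, cols)   # cols x cols

# (rows x cols) * (cols x cols) * (cols x cols) = rows x cols,
# i.e. m is rectangular, not square.
m = a @ np.diag(d) @ b
```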

Aha, the misunderstanding comes from an error in our current QR.

Normally, in the QR decomposition, Q is always square, it's R that
adapts to the rectangular size.

But here in our QR, it's Q that adapts to the rectangular size.

That's another problem, so I wouldn't currently consider our QR
reliable for non-square matrices.
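The two conventions contrasted above can be seen side by side with NumPy's QR (again an illustration, not Eigen's implementation): the 'complete' mode returns a square Q with R adapting to the rectangular size, while 'reduced' is the variant where Q adapts instead.

```python
import numpy as np

a = np.random.rand(5, 3)  # a rectangular (tall) matrix

# Convention 1: Q is always square, R is rectangular.
q_full, r_full = np.linalg.qr(a, mode='complete')  # Q: 5x5, R: 5x3

# Convention 2: Q adapts to the rectangular size, R is square.
q_thin, r_thin = np.linalg.qr(a, mode='reduced')   # Q: 5x3, R: 3x3

# Both factorizations reproduce a.
assert np.allclose(q_full @ r_full, a)
assert np.allclose(q_thin @ r_thin, a)
```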

(I'll mark that on my TODO...)

