Re: [eigen] Bug(s) report: SparseQR on tall-thin matrices



This is mainly an issue of "full QR" versus "thin QR", and the current implementation is inconsistent here. If the input matrix is m x n, with m >= n, then:

qr.matrixQ().cols() == n

whereas, once the expression is materialized:

SparseMatrix Q = qr.matrixQ();
Q.cols() == m

To be consistent with the dense world, qr.matrixQ().cols() should equal m by default, together with some mechanism to extract the thin part; we could add a thinQ() method for that. Similarly, for matrixR() we could add a thinR() method for convenience.


On Mon, Jan 9, 2017 at 3:51 PM, Julian Kent <julian.kent@xxxxxxxxxx> wrote:
While trying to use SparseQR on a matrix A with rows > cols, I found two bugs:

1) The size of qr.matrixR() is m x n, instead of n x n as expected. SparseQR.h:305 initialises m_R with size (m,n), and nothing does any resizing. For now I'm just taking the topRows(n), but I'm not entirely sure this is correct, and it certainly isn't the behaviour I expect. Shouldn't there be a non-destructive resize, if the extra rows are really necessary for intermediate processing?

2) qr.matrixQ() claims to be of size m x n, as expected. However, trying to multiply qr.matrixQ() with an n x k dense matrix triggers an assertion failure:
Eigen/src/SparseQR/SparseQR.h:640: void Eigen::SparseQR_QProduct<SparseQRType, Derived>::evalTo(DesType&) const [with DesType = Eigen::Matrix<double, -1, -1>; SparseQRType = Eigen::SparseQR<Eigen::SparseMatrix<double>, Eigen::NaturalOrdering<int> >; Derived = Eigen::Matrix<double, -1, -1>]: Assertion `m_qr.m_Q.rows() == m_other.rows() && "Non conforming object sizes"' failed.
In .solve(...), only matrixQ().transpose() is used, which is probably why this hasn't surfaced earlier.

These bugs may interact to fool any accuracy test based on A*P = Q*R on tall-thin matrices: the extra rows in R make the sizes conform with the oversized Q, so the product still matches even though neither factor has the expected shape.

Let me know if you need example matrices to work with.

Julian Kent
