On Fri, Nov 13, 2009 at 2:25 PM, Márton Danóczy
<marton78@xxxxxxxxx> wrote:
> Hi,
>
> I've encountered some (possibly not so?) weird behaviour when keeping
> references to blocks:
>
> template <typename X>
> void f(const MatrixBase<X>& x)
> {
>     typedef Block<X, ei_traits<X>::RowsAtCompileTime, 1> col_t;
>     for (int i = 0; i < x.cols(); ++i)
>     {
>         // using this line produces strange (undefined) results
>         col_t xi(x, i);
This is because the constructors of Block<X, ...> expect a const X&, and the Matrix class has a non-explicit constructor from any MatrixBase<> object. So if X is Matrix, then x is implicitly converted to a temporary Matrix object, and the Block ends up holding a reference to that dead temporary.

One global solution to detect this mistake at compile time (and abort compilation) would be to make that Matrix constructor explicit. But then if someone writes a function expecting a Matrix, e.g.:
void foo(const MatrixXf& x);
then the following won't work:
foo(a+b);
The user would have to explicitly write either:
foo((a+b).eval());
foo(MatrixXf(a+b));
I'm not saying that's a bad solution, but I'm sure many users would find this behaviour cumbersome.
So the other solution is to generalize what Benoit did for Replicate's constructor to all expressions. The principle is to make the expression constructor generic in the nested expression type, and to explicitly check at compile time that the given type and the expected type are the same. Perhaps we could also allow automatically casting a MatrixBase<X> to X; that would make "col_t xi(x, i);" work as expected.
Gael.
>         // this is the correct way to do it:
>         col_t xi(x.derived(), i);
>         do_something(xi);
>     }
> }
> It was quite hard to hunt down this bug. Apart from the fact that I
> don't understand what's happening here: is there a way to prevent the
> user from making this mistake?
>
> Thanks,
> Marton