On Mon, Aug 16, 2010 at 5:43 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx
> 2010/8/16 Carlos Becker <carlosbecker@xxxxxxxxx
>> On Mon, Aug 16, 2010 at 2:34 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx
>>> 2010/8/16 Carlos Becker <carlosbecker@xxxxxxxxx
>>> > Hi everyone again. Lately I've been using Eigen with complex numbers,
>>> > for things such as computing the maximum of the real part or the
>>> > imaginary part of a vector. I suppose that taking the real part and
>>> > then using maxCoeff() or minCoeff() would create a possibly
>>> > unnecessary temporary.
>>> No, it wouldn't :-)
>> Ok good; maybe 'temporary' is not exactly what I meant to say. What I
>> mean is that there may be vectorized ways to do certain operations, and
>> special implementations could be provided for them.
>>> > Another thing is to be able to do something like:
>>> > MatrixXf nn,xx;
>>> > xx = nn.colwise().normalized();
>>> Hm. If we start going that route, won't we end up offering every
>>> possible vector operation as colwise() operation?
>>> The biggest reason not to do it now is that it really isn't necessary
>>> to release Eigen 3.0 .... ;-)
>> Yes, I understand. I think vector-wise operations could be very
>> important, but they would probably require a bit of discussion before
>> implementing them (e.g. should we keep .colwise(), or should we use
>> .vectorwise(), in case some names for reductions and vector-wise
>> operations are the same?). I believe this kind of operation would be
>> very useful, since a set of vectors is often stored in a matrix, so the
>> user could benefit from extra performance this way.
> But the thing is, for this kind of use case, the much better approach
> is to use a matrix-of-arrays:
> Matrix<ArrayXf, 3, 1>
> then all operations are automatically supported and vectorized.
> The problem is that right now there is an inefficiency: we create
> useless temporaries when Scalar=Array. That is the problem most worth
> fixing.