Re: [eigen] Optimization for special complex operations
- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [eigen] Optimization for special complex operations
- From: Gael Guennebaud <gael.guennebaud@xxxxxxxxx>
- Date: Tue, 17 Aug 2010 09:22:04 +0200
On Mon, Aug 16, 2010 at 5:43 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx> wrote:
> 2010/8/16 Carlos Becker <carlosbecker@xxxxxxxxx>:
>> On Mon, Aug 16, 2010 at 2:34 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx>
>>> 2010/8/16 Carlos Becker <carlosbecker@xxxxxxxxx>:
>>> > Hi everyone again. Lately I've been using Eigen with complex numbers to
>>> > compute things such as the maximum of the real or imaginary part of a
>>> > certain vector. I suppose that taking the real part and then using
>>> > maxCoeff() or minCoeff() would create a possibly unnecessary temporary.
>>> No, it wouldn't :-)
>> Ok good, maybe 'temporary' is not exactly what I wanted to say. I mean that
>> there may be vectorized ways to do certain operations, and special
>> implementations could be provided for them.
>>> > Another thing is to be able to do something like:
>>> > MatrixXf nn,xx;
>>> > xx = nn.colwise().normalized();
>>> Hm. If we start going that route, won't we end up offering every
>>> possible vector operation as a colwise() operation?
>>> The biggest reason not to do it now is that it really isn't necessary
>>> to release Eigen 3.0 .... ;-)
>> Yes, I understand. I think vector-wise operations could be very important,
>> but they would probably require a bit of discussion before being implemented
>> (i.e. should we keep .colwise(), or should we use .vectorwise(), in the case
>> where some names for reductions and vector-wise operations are the same). I
>> believe this kind of operation would be very useful, since a set of vectors
>> is often stored in a matrix, so the user could benefit from extra
>> performance.
> But the thing is, for this kind of use case, the much better approach
> is to use a matrix-of-arrays:
> Matrix<ArrayXf, 3, 1>
> then all operations are automatically supported and vectorized.
> The problem is that right now there is an inefficiency: we create
> useless temporaries when Scalar=Array. This is the problem that is
> the most worth fixing.
To Carlos: note that colwise() and rowwise() return a VectorwiseOp
proxy and are not limited to reductions. For instance, there are
functions which are not reductions at all: +, -, cross,
hnormalized (homogeneous normalization), etc.
>> I could try some prototypes to see if I can make it work. Designing
>> something like .vectorwise() could be fine.