- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [eigen] Documentation : it's a sprint!!
- From: Gael Guennebaud <gael.guennebaud@xxxxxxxxx>
- Date: Wed, 7 Jul 2010 11:37:57 +0200
On Wed, Jul 7, 2010 at 11:30 AM, Carlos Becker <carlosbecker@xxxxxxxxx> wrote:
> Ok, finally back to the reductions/visitors tutorial. I was thinking of
> including a nice example at the end, using broadcasting and partial
> reductions to find the nearest neighbour of a vector among the columns
> of a given matrix, with something like:
> VectorXf v;
> MatrixXf m;
> int index;
> (m.colwise() - v).array().square().colwise().sum().minCoeff(&index);
Why not, but here is a simpler version:
MatrixXf::Index index;
(m.colwise() - v).colwise().squaredNorm().minCoeff(&index);
Note that you have to use MatrixXf::Index instead of int.
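
For reference, here is a complete little program doing that search; the
matrix/vector values below are made up just for illustration:

#include <Eigen/Dense>
#include <iostream>
using namespace Eigen;

int main()
{
  MatrixXf m(2,4);          // 4 column vectors of dimension 2 (made-up data)
  VectorXf v(2);
  m << 1, 23, 6, 9,
       3, 11, 7, 2;
  v << 2, 3;

  MatrixXf::Index index;
  // broadcasting (colwise() - v), then a partial reduction (squaredNorm()
  // per column), then a visitor (minCoeff(&index)):
  (m.colwise() - v).colwise().squaredNorm().minCoeff(&index);

  std::cout << "nearest neighbour is column " << index << ":\n"
            << m.col(index) << std::endl;
}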
gael.
> I think it would give a good insight into Eigen and, since this is tutorial
> number 7, the user should already have a good background in Eigen. I could
> explain what is happening at each '.' in the previous expression to make it
> clear and show how powerful Eigen is.
> However, I know that this might be a bit advanced for a beginners' tutorial,
> so I will wait for your thoughts on this.
> Cheers,
> Carlos
>
>
> On Thu, Jul 1, 2010 at 5:12 PM, Carlos Becker <carlosbecker@xxxxxxxxx>
> wrote:
>>
>> Thanks. I just wanted to know what to put in each subsection, since I made
>> one for visitors, one for reductions and one for broadcasting. So now it is
>> clear.
>>
>>
>> On Thu, Jul 1, 2010 at 3:10 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx>
>> wrote:
>>>
>>> 2010/7/1 Carlos Becker <carlosbecker@xxxxxxxxx>:
>>> > I thought that with matrix.colwise().sum(), colwise() is treated as a
>>> > visitor, and with += it is a broadcasting operation. This is because I
>>> > read this in your tutorial proposal:
>>> > 7. Reductions, visitors, and broadcasting
>>> > - for all kinds of matrices/arrays
>>> > - sum() etc...
>>> > - mention partial reductions with .colwise()...
>>> > - mention broadcasting e.g. m.colwise() += vector;
>>> > I thought that partial reductions were done with visitors and
>>> > broadcasting was something similar but with 'write access'. Or maybe
>>> > both are visitors; I guess I am missing many concepts here.
>>>
>>> Computing a sum is a reduction (or partial reduction), not a visitor.
>>>
>>> A visitor is when you want to find, as output, a location inside a
>>> matrix.
>>>
>>> For example, this is a reduction:
>>>
>>> result = matrix.maxCoeff();
>>>
>>> this is a visitor:
>>>
>>> Index i, j;
>>> matrix.maxCoeff(&i, &j);
>>>
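>>> For instance (a minimal sketch, with made-up 2x2 values):
>>>
>>> MatrixXf mat(2,2);
>>> mat << 1, 2,
>>>        3, 4;
>>> float m1 = mat.maxCoeff();        // reduction: only the value, here 4
>>> MatrixXf::Index i, j;
>>> float m2 = mat.maxCoeff(&i, &j);  // visitor: value plus its location (1,1)
>>>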
>>> Anyway, don't bother learning a new area of Eigen just to write docs
>>> about it :-)
>>>
>>> Benoit
>>> >
>>> > On Thu, Jul 1, 2010 at 2:46 PM, Benoit Jacob <jacob.benoit.1@xxxxxxxxx>
>>> > wrote:
>>> >>
>>> >> Broadcasting means e.g.
>>> >>
>>> >> matrix.colwise() += vector;
>>> >>
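>>> >> For instance (just a sketch with made-up sizes), this adds the same
>>> >> vector to every column of the matrix, in place:
>>> >>
>>> >> MatrixXf matrix(3,4);       // made-up sizes, for illustration only
>>> >> VectorXf vector(3);
>>> >> matrix.setRandom();
>>> >> vector.setOnes();
>>> >> matrix.colwise() += vector; // each of the 4 columns gets 'vector' added
>>> >>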
>>> >> Visitors are e.g.
>>> >>
>>> >> Index i, j;
>>> >> matrix.maxCoeff(&i, &j);
>>> >>
>>> >> These are two really different things, no?
>>> >> Benoit
>>> >>
>>> >> 2010/7/1 Carlos Becker <carlosbecker@xxxxxxxxx>:
>>> >> > Just a quick question: what would be the difference between visitors
>>> >> > and broadcasting? It seems to me that broadcasting is able to 'visit'
>>> >> > column- or row-wise, also modifying the data inside the matrix/array
>>> >> > object; am I right? I haven't used broadcasting with Eigen before.
>>> >> >
>>> >> >
>>> >> >
>>> >> > On Wed, Jun 30, 2010 at 1:30 PM, Carlos Becker
>>> >> > <carlosbecker@xxxxxxxxx>
>>> >> > wrote:
>>> >> >>
>>> >> >> Oh, ok, sorry, I got confused since I thought that someone was
>>> >> >> already writing the sparse tutorial.
>>> >> >>
>>> >> >>
>>> >> >> On Wed, Jun 30, 2010 at 1:29 PM, Gael Guennebaud
>>> >> >> <gael.guennebaud@xxxxxxxxx> wrote:
>>> >> >>>
>>> >> >>> Never mind, this C07_TutorialSparse.dox file is an old one...
>>> >> >>>
>>> >> >>> gael
>>> >> >>>
>>> >> >>> On Wed, Jun 30, 2010 at 2:26 PM, Carlos Becker
>>> >> >>> <carlosbecker@xxxxxxxxx>
>>> >> >>> wrote:
>>> >> >>> > I am starting with the reductions/visitors/broadcasting tutorial
>>> >> >>> > and just noticed that the sparse tutorial is named
>>> >> >>> > C07_TutorialSparse.dox. According to the order in
>>> >> >>> > http://eigen.tuxfamily.org/dox-devel/ it should be C08, and C07
>>> >> >>> > should be the one I am doing. This is a silly question, but I
>>> >> >>> > just wanted to make sure we are all following the same
>>> >> >>> > conventions.
>>> >> >>> > Carlos
>>> >> >>> >
>>> >> >>> > On Sun, Jun 27, 2010 at 1:59 PM, Gael Guennebaud
>>> >> >>> > <gael.guennebaud@xxxxxxxxx>
>>> >> >>> > wrote:
>>> >> >>> >>
>>> >> >>> >> On Sun, Jun 27, 2010 at 12:41 PM, Carlos Becker
>>> >> >>> >> <carlosbecker@xxxxxxxxx>
>>> >> >>> >> wrote:
>>> >> >>> >> > Mmm, I am trying to think of a straightforward explanation
>>> >> >>> >> > for this. What do you think about calling them fixed-size and
>>> >> >>> >> > dynamic-size blocks, where the former differs from the latter
>>> >> >>> >> > because its size is known at compile-time?
>>> >> >>> >>
>>> >> >>> >> A slightly more precise variant: "the fixed-size versions are
>>> >> >>> >> optimized versions of the dynamic-size ones, the size being
>>> >> >>> >> known at compile-time." I just added the word "optimized".
>>> >> >>> >>
>>> >> >>> >> You might also have a look at the reference tables, section
>>> >> >>> >> "Sub matrices", to see how they are presented. The old version
>>> >> >>> >> of this section is available online here:
>>> >> >>> >>
>>> >> >>> >> http://eigen.tuxfamily.org/dox-devel/TutorialCore.html#TutorialCoreMatrixBlocks
>>> >> >>> >>
>>> >> >>> >> gael
>>> >> >>> >>
>>> >> >>> >> >
>>> >> >>> >> >
>>> >> >>> >> > On Sun, Jun 27, 2010 at 11:02 AM, Carlos Becker
>>> >> >>> >> > <carlosbecker@xxxxxxxxx>
>>> >> >>> >> > wrote:
>>> >> >>> >> >>
>>> >> >>> >> >> Yes, I got that, and actually it was my mistake since I
>>> >> >>> >> >> assumed it was only for fixed-size matrices, so now I am
>>> >> >>> >> >> changing it. Thanks,
>>> >> >>> >> >>
>>> >> >>> >> >>
>>> >> >>> >> >> 2010/6/27 Björn Piltz <bjornpiltz@xxxxxxxxxxxxxx>
>>> >> >>> >> >>>
>>> >> >>> >> >>> "The following tables show a summary of Eigen's block
>>> >> >>> >> >>> operations
>>> >> >>> >> >>> and
>>> >> >>> >> >>> how
>>> >> >>> >> >>> they are applied to fixed- and dynamic-sized Eigen
>>> >> >>> >> >>> objects."
>>> >> >>> >> >>> This quote and the following table gives the impression
>>> >> >>> >> >>> that
>>> >> >>> >> >>> the
>>> >> >>> >> >>> fixed
>>> >> >>> >> >>> size functions are only available for fixed size matrices..
>>> >> >>> >> >>> But
>>> >> >>> >> >>> using
>>> >> >>> >> >>> fixed
>>> >> >>> >> >>> size vs dynamic size functions acually only determin the
>>> >> >>> >> >>> return
>>> >> >>> >> >>> type.
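>>> >> >>> >> >>> For instance (illustrative values only), both of the
>>> >> >>> >> >>> following work on a dynamic-size MatrixXf; only the type of
>>> >> >>> >> >>> the returned block expression differs:
>>> >> >>> >> >>>
>>> >> >>> >> >>> MatrixXf m = MatrixXf::Random(5,5);
>>> >> >>> >> >>> MatrixXf a = m.block(1, 1, 2, 2); // dynamic-size block
>>> >> >>> >> >>> Matrix2f b = m.block<2,2>(1, 1);  // fixed-size block: the
>>> >> >>> >> >>>                                   // size is known at
>>> >> >>> >> >>>                                   // compile time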
>>> >> >>> >> >>> Björn