Re: [eigen] ISO C++ working groups on Linear Algebra / Machine Learning
On 15/02/2019 12.33, Patrik Huber wrote:
I'm glad that the information was helpful to you and some others.
I am actually firmly in the camp of "operator* as matrix-matrix product",
like in Eigen. I personally don't like numpy's "@" syntax, as I really
can't associate multiplication with that sign, but that's just personal. On
a more rational level, I don't think we would (ever?) get the @-sign in C++
for matrix-matrix multiplication, so we would be stuck with having to use a
mul/dot function - basically the numpy approach but without having "@".
Unless C++2z/3x introduces new operators, we are limited to the existing
overloadable operators.
That really would not be great. I just would not want to write expressions
A = B.mul(C.mul(D)); or (slightly better perhaps, but...) A = mul(B, C, D);.
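To make concrete what such a function-based API would look like, here is a minimal toy sketch (not Eigen code; the 2x2 type and the variadic mul() are made up for illustration):

```cpp
#include <array>
#include <cassert>

// Hypothetical fixed-size 2x2 matrix, just to illustrate a mul()-based API.
struct Mat2 {
    std::array<double, 4> m; // row-major: m[0] m[1] / m[2] m[3]
};

// Binary matrix product.
Mat2 mul(const Mat2& a, const Mat2& b) {
    return {{ a.m[0]*b.m[0] + a.m[1]*b.m[2],
              a.m[0]*b.m[1] + a.m[1]*b.m[3],
              a.m[2]*b.m[0] + a.m[3]*b.m[2],
              a.m[2]*b.m[1] + a.m[3]*b.m[3] }};
}

// Variadic overload so that mul(B, C, D) == mul(B, mul(C, D)),
// avoiding the nested B.mul(C.mul(D)) spelling.
template <typename... Rest>
Mat2 mul(const Mat2& a, const Mat2& b, const Rest&... rest) {
    return mul(a, mul(b, rest...));
}
```

Even with the variadic form, A = mul(B, C, D); stays noticeably less readable than A = B * C * D;, which is the point being made above.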
Like you, I also mainly use matrix-matrix and matrix-vector
multiplications, and on the rare occasions I do need component-wise
operations, using .array() or a special function in those cases results in
much clearer code to me.
Generally, I don't think we need to change our syntax just because some
people prefer a different syntax. To me our Matrix vs Array distinction
makes perfect sense. I would not object to adding some functionality to
make this somewhat shorter:
ArrayXXd A, B; MatrixXd C = A.matrix()*B.matrix();
But that is a feature I personally would rarely need.
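For readers less familiar with the Matrix/Array split being defended here, a toy illustration of the design (deliberately not Eigen itself; the 2x2 types are invented for this sketch): operator* is coefficient-wise in the array world, a matrix product in the matrix world, and .matrix()/.array() switch between the two:

```cpp
#include <array>
#include <cassert>

struct Mat;

// "Array world": operator* is coefficient-wise.
struct Arr {
    std::array<double, 4> v;
    Mat matrix() const; // defined below, once Mat is complete
};

// "Matrix world": operator* is a matrix product.
struct Mat {
    std::array<double, 4> v; // row-major 2x2
    Arr array() const { return {v}; }
};

Mat Arr::matrix() const { return {v}; }

// Coefficient-wise product, as in the array world.
Arr operator*(const Arr& a, const Arr& b) {
    return {{ a.v[0]*b.v[0], a.v[1]*b.v[1], a.v[2]*b.v[2], a.v[3]*b.v[3] }};
}

// Matrix product, as in the matrix world.
Mat operator*(const Mat& a, const Mat& b) {
    return {{ a.v[0]*b.v[0] + a.v[1]*b.v[2],
              a.v[0]*b.v[1] + a.v[1]*b.v[3],
              a.v[2]*b.v[0] + a.v[3]*b.v[2],
              a.v[2]*b.v[1] + a.v[3]*b.v[3] }};
}
```

So A * B and A.matrix() * B.matrix() compute different things, and the type of the operands says which one you get, which is exactly the property the Matrix/Array distinction gives.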
I do like Matlab's approach of ".*" for component-wise multiplication, as I
can associate something related to "multiplication" with it. But I don't
think there's a way this could happen in C++.
`.*` is an operator (pointer-to-member access), but it is not overloadable.
The only real argument that I can see for using * for component-wise, and a
function for matrix-matrix, is perhaps higher-dimensional tensor
multiplications, where the *-sign might be ambiguous or might not make too
much sense, but I am not too familiar with that field.
For higher-dimensional tensors, Einstein notation may be much better.
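As a concrete illustration of what Einstein notation expresses for higher ranks, here is a minimal hand-rolled contraction sketch (plain C++, shapes and names invented for the example; Eigen's unsupported Tensor module spells the same operation as contract() with a list of index pairs):

```cpp
#include <array>
#include <cassert>

// Rank-3 tensors of shape 2x2x2 and a 2x2 result, just for illustration.
using T3 = std::array<std::array<std::array<double, 2>, 2>, 2>;
using M2 = std::array<std::array<double, 2>, 2>;

// The contraction written in Einstein notation as C_il = A_ijk * B_jkl:
// repeated indices j and k are summed over.
M2 contract(const T3& A, const T3& B) {
    M2 C{};
    for (int i = 0; i < 2; ++i)
        for (int l = 0; l < 2; ++l)
            for (int j = 0; j < 2; ++j)
                for (int k = 0; k < 2; ++k)
                    C[i][l] += A[i][j][k] * B[j][k][l];
    return C;
}
```

The index notation makes it unambiguous which axes are contracted, which is exactly what a bare * cannot express once there is more than one candidate pairing of dimensions.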
And I could agree that otherwise Tensors should behave more like
multi-dimensional Arrays. I think we recently had a complaint/suggestion
that naming them `Tensor` is confusing given how they are currently used
(need to find that mail).
It might be a bit of bikeshedding about a technicality, but if more people
feel like it should be "the Eigen way", now would probably be the time to
give input to the SGs and discuss that further. Do you (or anyone else)
think it would make sense to open a thread there? I'd be happy to do so.
I did not subscribe to that SG. I would encourage someone to at least
make public that there are people who are perfectly fine with how Eigen
uses operator* as Matrix-Matrix/Matrix-Vector multiplication. Just in
case they assume that, because they like it, everyone likes it (I don't mind
if other people prefer different syntax).
One argument where I kind-of agree that matrix-matrix products are
"different" is that these are one of the few operations that can easily
alias and where it is not easy to make an in-place operation. This is
also one of the few operations where Eigen silently introduces
temporaries, if one does not pay attention.
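A minimal sketch of that aliasing problem (toy 2x2 code, not Eigen): computing dst = dst * rhs coefficient-by-coefficient reads entries that were already overwritten, which is exactly why Eigen evaluates matrix products into a temporary unless one opts out with .noalias():

```cpp
#include <array>
#include <cassert>

using M2 = std::array<double, 4>; // row-major 2x2

// Naive in-place product dst = dst * rhs, reading dst while writing it.
void mul_in_place_buggy(M2& dst, const M2& rhs) {
    dst[0] = dst[0]*rhs[0] + dst[1]*rhs[2];
    dst[1] = dst[0]*rhs[1] + dst[1]*rhs[3]; // bug: dst[0] already overwritten
    dst[2] = dst[2]*rhs[0] + dst[3]*rhs[2];
    dst[3] = dst[2]*rhs[1] + dst[3]*rhs[3]; // bug: dst[2] already overwritten
}

// Safe version: evaluate into a temporary first, which is what Eigen
// does implicitly for matrix products.
void mul_in_place_safe(M2& dst, const M2& rhs) {
    M2 tmp = { dst[0]*rhs[0] + dst[1]*rhs[2],
               dst[0]*rhs[1] + dst[1]*rhs[3],
               dst[2]*rhs[0] + dst[3]*rhs[2],
               dst[2]*rhs[1] + dst[3]*rhs[3] };
    dst = tmp;
}
```

Component-wise products never have this problem (each output coefficient depends only on the same coefficient of the inputs), which is one sense in which matrix products really are "different".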
On Thu, 14 Feb 2019 at 18:03, Gael Guennebaud <gael.guennebaud@xxxxxxxxx> wrote:
On Wed, Feb 13, 2019 at 6:19 PM Patrik Huber <patrikhuber@xxxxxxxxx> wrote:
I recently found out that the C++ standardisation committee now created a
Special Interest Group (SIG) for Linear Algebra within SG14 (the study
group for game-dev & low-latency), and that there's also a new study group
on machine learning, SG19.
Both groups have been formed within the last few months, and I don't
think there was too much noise around it, so I thought it might be quite
interesting for some people on the Eigen list. I also just joined their mailing lists.
Thanks a lot for this information! I joined both the SG14 and SG19 lists too,
and I didn't recognise any familiar names from the Eigen list so far.
There is Matthieu Brucher, who is a member of this list and has posted here before.
At first glance, I saw that they seem to make a few design decisions
that are different from Eigen (e.g. operator* is only for scalar
multiplications; or there are separate row/col_vector classes currently).
Regarding operator*, from their discussions we can already see clear
disagreements between "linear algebra" people and more general "data
scientists"... Some cannot stand operator* as being a matrix-matrix product
and are basically seeking a NumPy on steroids. Personally, as I mostly
do linear algebra I almost never use the component-wise product and I'd
have a hard time giving up operator* for matrix-matrix products. On the
other hand, I found myself using .array() frequently for scalar addition,
abs, min/max and comparisons... and I have to admit that our
.array()/.matrix() approach is not ideal in this respect.
Nevertheless, following the idea of a "NumPy on steroids", if that's really
what developers want, we might think about making our "array world"
more friendly with the linear-algebra world by:
- adding a prod(,) (or dot(,)) function
- moving more MatrixBase functions to DenseBase (most of them could, except …)
- allowing Array<> as input of decompositions
- enabling "safe" binary operations between Array and Matrix (and
returning an Array)
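As a rough sketch of the first bullet, a free prod(,) in an array-only world could look like this (plain C++ over nested vectors, purely illustrative, not a proposed Eigen signature):

```cpp
#include <cassert>
#include <vector>

// Hypothetical free prod() over plain "array" data: array-only users get
// a matrix product without ever touching a Matrix type or operator*.
using Array2D = std::vector<std::vector<double>>;

Array2D prod(const Array2D& a, const Array2D& b) {
    const std::size_t n = a.size();       // rows of a
    const std::size_t k = b.size();       // inner dimension
    const std::size_t m = b[0].size();    // columns of b
    Array2D c(n, std::vector<double>(m, 0.0));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < m; ++j)
            for (std::size_t l = 0; l < k; ++l)
                c[i][j] += a[i][l] * b[l][j];
    return c;
}
```

With such a function, plus decompositions accepting arrays, the NumPy-style workflow described above would be possible entirely within the array world.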
This way, people who don't want operator* as a matrix product, or with a
strong experience of numpy, could simply forget about
Matrix<>/.matrix()/.array() and exclusively use Array<>. Then time will
tell us if, as with numpy.matrix vs numpy.array, everybody will give up
on Matrix<>... (I strongly doubt it).
I think it would be really great to get some people from Eigen (us!)
aboard that process.
Here are some links:
SG14 mailing list: http://lists.isocpp.org/sg14/
SG19 mailing list:
There are two repos where they started mapping out ideas/code/paper:
Dr. Patrik Huber
Founder & CEO 4dface Ltd
3D face models & personal 3D face avatars for professional applications
Dr.-Ing. Christoph Hertzberg
Robotics Innovation Center, DFKI GmbH
28359 Bremen, Germany
More information: http://www.dfki.de/robotik