[eigen] port of (c)minpack to eigen : status
Some of you may be aware that I'm working on integrating cminpack into Eigen.
* Minpack is a very famous, old, robust and well-known package, written in
Fortran, that implements two algorithms for non-linear optimization:
Levenberg-Marquardt and 'hybrid'. See http://en.wikipedia.org/wiki/MINPACK
* cminpack is the result of running f2c (a Fortran-to-C converter) on it and
applying several 'cleaning' passes over the output. The last of those passes
was my starting point: http://devernay.free.fr/hacks/cminpack.html
So I've ported this to Eigen, and you can find the current code on Bitbucket:
The purpose of this mail is to describe what you'll find in this fork, and to
ask for feedback.
I'm especially interested in two kinds of feedback:
* first, I'm sure several people here use some variation of those algorithms,
either home-made or from third-party packages not integrated in Eigen. I'd
like to know whether they can make use of this code, whether it is easy enough
to use, and whether the results are OK for them (regarding both accuracy and
speed).
* secondly, I'm really not familiar with 'advanced' C++ features such as
templates and functors, and I would like advice about how to integrate better
with Eigen.
What I'm NOT asking for feedback about are the internals: this is work in
progress, and I know a lot of things can still be done. Most of the ugly parts
come from the f2c translation (aka "not my fault"). Typically you'll find some
'goto's left in subroutines. Thanks to the unit tests, I can work on those
later on. Yes, I know the code still uses its own implementation of the QR
factorization, and yes, it is planned to use the one from Eigen instead,
though this is not done yet.
So, what will you find in this fork?
* in unsupported/Eigen/src/NonLinear/ : this is (c)minpack ported to Eigen.
You should only consider using/reading the files LevenbergMarquardt.h and
HybridNonLinearSolver.h, which define the current API. The other files are
subroutines used by those two.
* in unsupported/test/NonLinear.cpp : unit tests for the port. There are two
kinds of tests. Those that come from the examples bundled with cminpack
guarantee we get the same results as the original algorithms (value for 'x',
number of evaluations of the function, and number of evaluations of the
jacobian, if any). The other tests were added by myself at the very beginning
of the process; they check the results for Levenberg-Marquardt against the
reference data on http://www.itl.nist.gov/div898/strd/nls/nls_main.shtml.
Since then I've carefully checked that the same results were obtained when
modifying the code. We don't always get exactly the same results, but this is
OK: they use 128-bit floats, and I do the tests using 'double'. I've run those
tests on several other implementations of Levenberg-Marquardt, and (c)minpack
performs VERY well compared to those, both in accuracy and speed.
As a matter of fact, those tests are also, currently, the best documentation
of how to use the "NonLinear" module.
A few words about the API
All algorithms can use either the jacobian (provided by the user) or compute
an approximation by themselves (using forward differences). The parts of the
API referring to the latter use 'NumericalDiff' in the method name (example:
minimizeNumericalDiff()).
The methods LevenbergMarquardt.lmder1()/lmdif1()/lmstr1() and
HybridNonLinearSolver.hybrj1()/hybrd1() are specific methods from the original
minpack package that you should probably use only if you are porting code that
previously used minpack. They just provide a 'simple' API with default values
for some parameters.
All algorithms are provided through two APIs:
* one where you initialize the algorithm, then call '*OneStep()' as often as
you want: this way the caller has control over the steps;
* one where you just call one method (optimize() or solve()) which basically
does exactly the same: init + loop until a stop condition is met. These are
provided for convenience.
As an example, the method LevenbergMarquardt.minimizeNumericalDiff() is
implemented roughly as follows:

    Status LevenbergMarquardt::minimizeNumericalDiff(
            Matrix< Scalar, Dynamic, 1 > &x, const int mode)
    {
        Status status = minimizeNumericalDiffInit(x, mode);
        while ( /* stop condition not met */ )
            status = minimizeNumericalDiffOneStep(x, mode);
        return status;
    }
* If you can think of a better name than "NonLinear", I'm all for it. I'd like
to avoid using 'minpack', though.
* I will be away (as in "not even access to the internet") until September
7th, but feel free to spam my mailbox and/or the Eigen mailing list; I'll read
everything thoroughly when I'm back.
* To run the tests, follow the documentation on
To run only the "nonlinear" tests, do, from build/unsupported/ : "make &&
* (c)minpack had its own implementation of the "blue norm". I've replaced
those calls by either blueNorm() or stableNorm(). The reason is that in some
places, using blueNorm() raised an assert() about overflow. To reproduce,
replace all stableNorm() calls by blueNorm() in
unsupported/Eigen/src/NonLinear/* and run the tests.
* There is a method called 'dogleg', and while I'm not very familiar with it,
I think I've understood that it has a broader scope than just the hybrid
algorithm, so maybe we should make it available in a more general way in
Eigen. Does anybody know more about this than I do?
Thomas Capricelli <orzel@xxxxxxxxxxxxxxx>