Re: [eigen] port of (c)minpack to eigen : status


On Thursday 24 September 2009 15:50:38, Hauke Heibel wrote:
> 1) I think it might be of interest for users to change the cost function at
> run-time. With the current design this is hardly possible, since it requires
> instantiating a new LevenbergMarquardt class for each cost function.
> Pre-instantiating all possible optimizers is a bit cumbersome here. An
> alternative would be to define a pure virtual cost-function base class.
> Actually, in our framework we have that even for optimizers, since we also
> like to switch between, let's say, Downhill Simplex (or Nelder-Mead) and
> Levenberg-Marquardt. But that is really optional - though in our group we
> utilize it a lot.

Well, I don't really know (yet) about this one.

What is this framework you are speaking about?
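For what it's worth, the pure-virtual base class you suggest could look like the sketch below. All names here (Functor, MyCost) are hypothetical illustrations, not the actual Eigen API; the point is only that an optimizer holding a Functor* can have its cost function swapped at run-time without re-instantiating the optimizer:

```cpp
#include <cassert>
#include <vector>

// Hypothetical pure virtual cost-function base class. An optimizer would
// store a Functor* (or Functor&), so the concrete cost function can be
// changed at run-time.
struct Functor {
    virtual ~Functor() {}
    // Evaluate residuals fvec at parameters x; return 0 on success.
    virtual int operator()(const std::vector<double>& x,
                           std::vector<double>& fvec) const = 0;
};

// One concrete cost function: residual of x^2 = 2.
struct MyCost : Functor {
    int operator()(const std::vector<double>& x,
                   std::vector<double>& fvec) const override {
        fvec[0] = x[0] * x[0] - 2.0;
        return 0;
    }
};
```

The usual trade-off applies: the virtual call adds an indirection per evaluation, where the current template-on-functor design lets the compiler inline everything.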

> 2) A void abort() would be nice in the optimizer's interface. There exist
> optimization procedures that take longer than 5 minutes, and here it
> makes a lot of sense to be able to terminate early. How this might be done:
> a) Introduce a member: Status m_status;
> b) Replace each 'return Running;' with 'return m_status;'
> c) Replace each remaining 'return <status_type>;' with 'm_status =
> <status_type>; return m_status;'
> d) Add 'void abort() { m_status = UserAsked; }'

I did not use an m_status member on purpose, as I wanted the caller to handle
this (in the case of the *Init()/*OneStep() methods). Let me think some more
about it.
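Just so we are talking about the same thing, here is a minimal sketch of the mechanism from steps a)-d). The class and status names mimic the ones discussed, but the body is a stub, not the actual algorithm:

```cpp
#include <cassert>

// Status values as discussed; UserAsked signals an abort() request.
enum Status { NotStarted, Running, UserAsked };

class LevenbergMarquardt {
public:
    LevenbergMarquardt() : m_status(NotStarted) {}

    // One iteration; returns m_status so a pending abort() wins.
    Status minimizeOneStep() {
        if (m_status == UserAsked) return m_status;  // early exit
        // ... one iteration of the actual algorithm would go here ...
        m_status = Running;
        return m_status;
    }

    // d) the proposed abort hook: flip the status from anywhere.
    void abort() { m_status = UserAsked; }

private:
    Status m_status;  // a) the member steps b)/c) route returns through
};
```

With the *Init()/*OneStep() API the caller can of course just break out of its own loop, which is the behaviour I wanted to preserve.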

> 3) It seems as if currently different functions are called for numerical
> differentiation vs. optimal storage vs. analytical differentiation. I am
> just wondering whether these special cases are not candidates for
> templating. These are basically policies, right? Like 'use as much memory as
> you want' vs. 'use minimal memory', and 'differentiate analytically' vs.
> 'differentiate by forward diffs'.

Yes, definitely. I thought about this, but I don't know exactly how to do it;
I really think we should do it, though.
Currently, all three 'variants' are indeed almost the same code. It was this
way in cminpack, and I haven't factored it yet. I did pay attention, though,
to keep the 'diff' between variants as small as possible, to ease factoring
later on. If someone tells me the best way to do that using a template
argument, I can go forward.
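To make the policy idea concrete, a sketch of what a differentiation policy as a template argument could look like. Everything here is an assumed illustration (1-D, toy names), not a proposal for the exact interface:

```cpp
#include <cassert>

// Policy 1: numerical forward differences.
struct ForwardDiff {
    template <typename F>
    static double df(F f, double x, double eps = 1e-6) {
        return (f(x + eps) - f(x)) / eps;  // forward difference quotient
    }
};

// Policy 2: the functor supplies its own derivative (analytical case).
struct AnalyticalDiff {
    template <typename F>
    static double df(F f, double x, double /*unused*/ = 0.0) {
        return f.derivative(x);
    }
};

// The optimizer body is written once; the policy picks the variant.
template <typename DiffPolicy, typename F>
double gradientAt(F f, double x) {
    return DiffPolicy::df(f, x);
}
```

Since all three variants are almost the same code anyway, a shared template body like this plus small policy classes would make the 'diff' between variants disappear entirely.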

> 4) Optional: A callback function would be cool. Basically, some way of
> getting updates of intermediate optimization results. It oftentimes helps a
> lot in the course of debugging/tuning specific optimization problems
> because you can visualize the intermediate results.

There was something like this in minpack: there was an argument 'nprint', and
every 'nprint' steps a callback was called so that the user could do whatever
he wants (display variables, for instance). The return value of the callback
was also used to ask the algorithm to stop. See my commit baaf3f4c39dd, which
removed it.
This is related to question 2: there are two 'ways' of using the optimizers.
One function does it all by itself, and in the *Init()/*OneStep() way the
loop is handled by the caller. If you want to print stuff and debug, you can
use the second way. Just copy the loop from (for example)
minimizeNumericalDiff() (it's four lines). You could also display from the
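In other words, the caller-driven loop looks roughly like the sketch below. The Solver class and its iteration are a self-contained toy (Newton's method for sqrt(2)), standing in for the real *Init()/*OneStep() pair, whose exact signatures may differ:

```cpp
#include <cassert>
#include <cmath>

enum Status { Running, Converged };

// Toy solver exposing the *Init()/*OneStep() pattern: the caller owns the
// loop, so intermediate state can be printed or plotted after each step.
class Solver {
public:
    Status minimizeInit(double x0) { m_x = x0; return Running; }
    Status minimizeOneStep() {
        m_x = 0.5 * (m_x + 2.0 / m_x);  // toy iteration: Newton for sqrt(2)
        return (std::abs(m_x * m_x - 2.0) < 1e-12) ? Converged : Running;
    }
    double x() const { return m_x; }
private:
    double m_x;
};
```

The caller's loop is then just:

```cpp
Solver s;
Status st = s.minimizeInit(1.0);
while (st == Running) {
    st = s.minimizeOneStep();
    // inspect or print s.x() here - this replaces the old nprint callback
}
```

which is why I think the explicit loop subsumes both the callback and the abort() use cases.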

> 5) Finally, I would not expose member variables directly and I am wondering
> whether all of those functions are required in the public interface.

Yes, I agree, this needs cleaning, but it can wait a little longer. I'm more
concerned with the API right now.

Thomas Capricelli <orzel@xxxxxxxxxxxxxxx>
