Re: [eigen] port of (c)minpack to eigen : status
- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [eigen] port of (c)minpack to eigen : status
- From: Tom Vercauteren <tom.vercauteren@xxxxxxxxx>
- Date: Mon, 31 Aug 2009 10:27:54 +0200
Hi Thomas,
I would be very much interested in having such a generic least-squares
implementation.
On Sun, Aug 30, 2009 at 22:30, Thomas Capricelli<orzel@xxxxxxxxxxxxxxx> wrote:
[snip]
> A few words about the API:
> All algorithms can use either the Jacobian (provided by the user) or compute
> an approximation to it themselves (using forward differences). The parts of
> the API referring to the latter use 'NumericalDiff' in the method name
> (example: LevenbergMarquardt.minimizeNumericalDiff()).
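As an aside, for readers not familiar with it, a forward-difference
Jacobian just costs one extra evaluation of the residual function per
parameter. A minimal, untested sketch, assuming a residual functor
mapping an Eigen::VectorXd of parameters to an Eigen::VectorXd of
residuals (this functor signature is only an illustration, not the API
of the port):

#include <Eigen/Core>

// Sketch: estimate column j of the Jacobian from one perturbed
// evaluation of the residual functor per parameter.
template <typename Functor>
void forwardDifferenceJacobian(const Functor& f,       // x -> residuals f(x)
                               const Eigen::VectorXd& x,
                               Eigen::MatrixXd& J,     // m x n output
                               double h = 1e-8)        // step size (assumed)
{
    const Eigen::VectorXd f0 = f(x);
    J.resize(f0.size(), x.size());
    for (int j = 0; j < x.size(); ++j) {
        Eigen::VectorXd xp = x;
        xp[j] += h;                       // perturb one parameter
        J.col(j) = (f(xp) - f0) / h;      // one-sided difference quotient
    }
}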
Would it also be possible to provide only the pseudo-Hessian and the
product of the transposed Jacobian with the residual vector?
This would be very helpful for problems such as image registration,
where the Jacobian matrix is very tall and skinny, typically 10^8
by 12.
The least-squares problem basically boils down to solving
  (J^T J) h = -J^T f
where J is the Jacobian, h is the update and f is the residual vector.
In the image registration use case the Jacobian itself is really big,
but the pseudo-Hessian
  H = J^T J    (12 by 12)
and the product of the transposed Jacobian with the residual vector
  J^T f        (12 by 1)
are tiny.
Storing only these variables makes the memory consumption much lower
and speeds up the algorithm. This, however, comes at the cost of losing
some numerical accuracy: one no longer relies on a QR decomposition of
J but typically on a Cholesky decomposition of H, and forming H = J^T J
squares the condition number of the problem.
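Concretely, I have something like the following in mind. This is only a
sketch: the row-wise callback 'rowFunc' is hypothetical, the 12
parameters are hard-coded for the registration example, and the exact
Cholesky solve signature may differ between Eigen versions:

#include <Eigen/Core>
#include <Eigen/Cholesky>

// Sketch: compute one Gauss-Newton update h = -(J^T J)^{-1} J^T f
// without ever storing the 10^8 x 12 Jacobian. 'rowFunc(i, fi, Ji)' is
// a hypothetical callback returning residual i and the matching row of J.
template <typename RowFunc>
Eigen::Matrix<double, 12, 1> gaussNewtonStep(RowFunc rowFunc, int numPoints)
{
    Eigen::Matrix<double, 12, 12> H = Eigen::Matrix<double, 12, 12>::Zero();
    Eigen::Matrix<double, 12, 1>  g = Eigen::Matrix<double, 12, 1>::Zero();
    for (int i = 0; i < numPoints; ++i) {
        double fi;
        Eigen::Matrix<double, 12, 1> Ji;   // i-th row of J, as a column vector
        rowFunc(i, fi, Ji);
        H += Ji * Ji.transpose();          // rank-one update of J^T J
        g += fi * Ji;                      // accumulate J^T f
    }
    // One Cholesky (LDLT) solve of the tiny 12 x 12 normal equations.
    return H.ldlt().solve(-g);
}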
[snip]
> Final remarks:
[snip]
> * there is a method called 'dogleg', and while I'm not very aware of what it
> is, I think I've understood that it has a broader scope than just the hybrid
> algorithm, so maybe we should make it available in a more general way in
> Eigen. Does anybody know more about this than me?
The dog-leg approach is nice because it requires solving only one
linear system per iteration, whereas Levenberg-Marquardt has to
re-solve its damped system each time the damping parameter is adjusted.
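To give the flavour, here is an untested sketch of a single dog-leg
step, following the pseudo-code of the tutorial cited below; the
trust-radius update logic around it is omitted, and H and g are the
pseudo-Hessian and gradient from above:

#include <Eigen/Core>
#include <Eigen/Cholesky>
#include <cmath>

// Sketch of one Powell dog-leg step inside a trust region of radius delta.
Eigen::VectorXd doglegStep(const Eigen::MatrixXd& H,  // pseudo-Hessian J^T J
                           const Eigen::VectorXd& g,  // gradient J^T f
                           double delta)              // trust-region radius
{
    // Gauss-Newton step: the single linear solve of the iteration.
    Eigen::VectorXd h_gn = H.ldlt().solve(-g);
    if (h_gn.norm() <= delta)
        return h_gn;                        // GN step fits: take it

    // Steepest-descent step with optimal length alpha = ||g||^2 / (g^T H g).
    double alpha = g.squaredNorm() / g.dot(H * g);
    Eigen::VectorXd h_sd = -alpha * g;
    if (h_sd.norm() >= delta)
        return (delta / h_sd.norm()) * h_sd; // clip SD step to the boundary

    // Dog leg: walk from h_sd towards h_gn, stopping on the trust-region
    // boundary, i.e. find beta such that ||h_sd + beta * d|| = delta.
    Eigen::VectorXd d = h_gn - h_sd;
    double c = h_sd.dot(d);
    double disc = std::sqrt(c * c + d.squaredNorm()
                            * (delta * delta - h_sd.squaredNorm()));
    double beta = (c <= 0.0)
        ? (disc - c) / d.squaredNorm()
        : (delta * delta - h_sd.squaredNorm()) / (c + disc);
    return h_sd + beta * d;
}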
Below is a very nice tutorial on non-linear least squares; many
algorithms, including Powell's dog-leg, are given in pseudo-code:
Madsen, K., Nielsen, H. B., Tingleff, O., Methods for Non-Linear Least
Squares Problems (2nd ed.), pp. 60, 2004
http://www2.imm.dtu.dk/pubdb/views/edoc_download.php/3215/pdf/imm3215.pdf
As a companion to this tutorial, there is a Matlab toolbox:
http://www2.imm.dtu.dk/~hbn/immoptibox/
http://www2.imm.dtu.dk/~hbn/immoptibox/lstsq.html#TOP
Hope this helps,
Tom Vercauteren