Re: [eigen] LeastSquares, Pseudo Inverse & Geometry
- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [eigen] LeastSquares, Pseudo Inverse & Geometry
- From: Manuel Yguel <manuel.yguel@xxxxxxxxx>
- Date: Mon, 25 Jan 2010 09:51:24 +0100
On Mon, Jan 25, 2010 at 9:13 AM, Thomas Capricelli
<orzel@xxxxxxxxxxxxxxx> wrote:
>
> Hi all,
>
> I feel uncomfortable with the name of the module "LeastSquares"
> http://eigen.tuxfamily.org/dox-devel/group__LeastSquares__Module.html
>
> It's nothing more than one function (and one variant) computing linear regression, which is, I think, a very special case of least squares. The API has a 'geometry' feel rather than an optimization or statistical one (that's what I'm interested in, as you'll have guessed).
>
> What the (currently still unsupported) NonLinearOptimization module does is also least squares, in the non-linear case.
>
> Though even in the linear case, this is different from http://en.wikipedia.org/wiki/Least_squares, which is also something that I need. I don't really understand how far it
> differs, but at least the computation is different: the current LeastSquares module uses eigenvalues, while that article (and the stuff I do) uses the pseudo-inverse.
Hello, I think that this is because the covariance matrix is symmetric
positive semi-definite; therefore, the eigendecomposition and the
singular value decomposition C = UDV^t look the same, with U = V:
- you can diagonalize the covariance matrix in an orthonormal basis: C = ODO^t,
- by symmetry, this implies that O = U = V.
By the way, I just wanted to know whether this knowledge makes the
computation faster, i.e. whether the eigenvalue/eigenvector computation
is faster than the SVD in this case. I do not know the complexity of
either algorithm, but I would be happy to run some performance
comparisons.
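The equivalence can be checked numerically. A small sketch in numpy (used here only to illustrate the linear algebra, not the Eigen API): for a symmetric positive semi-definite matrix such as a covariance matrix, the singular values equal the eigenvalues, and the SVD factors U and V coincide with the eigenvector basis up to column signs.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 4))
C = A.T @ A  # symmetric positive semi-definite, covariance-like

# Eigendecomposition (eigh returns eigenvalues in ascending order)
eigvals, O = np.linalg.eigh(C)
# SVD (singular values come out in descending order)
U, S, Vt = np.linalg.svd(C)

# Singular values equal the eigenvalues (after matching the ordering)
assert np.allclose(S, eigvals[::-1])
# U and V coincide for a symmetric PSD matrix (up to column signs)
assert np.allclose(np.abs(U), np.abs(Vt.T), atol=1e-8)
# Both orthogonal factors match the eigenvector basis O
assert np.allclose(np.abs(U), np.abs(O[:, ::-1]), atol=1e-8)
```
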
Cheers,
Manuel
>
> Concerning the pseudo-inverse, I think we need an implementation in Eigen, and I'm happy Benoit has this planned.
>
> So my questions are:
> * what do you think of moving those two methods to the Geometry module, and removing the LeastSquares module?
> * should we provide methods for linear least squares (which are really nothing more than computing and applying a pseudo-inverse)?
>
> ++
> Thomas
> --
> Thomas Capricelli <orzel@xxxxxxxxxxxxxxx>
> http://www.freehackers.org/thomas
>
>
>
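On the pseudo-inverse point quoted above, the connection to linear least squares can be sketched in a few lines of numpy (illustrative only; the function name is mine, not an Eigen or proposed API): the least-squares solution x minimizing ||Ax - b|| is x = A^+ b, where the Moore-Penrose pseudo-inverse A^+ = V S^+ U^t is built from the SVD.

```python
import numpy as np

def pinv_via_svd(A, rcond=1e-12):
    """Moore-Penrose pseudo-inverse built from the thin SVD A = U S V^t.

    Hypothetical helper for illustration; numerically, singular values
    below rcond * max(S) are treated as zero rather than inverted.
    """
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    S_inv = np.where(S > rcond * S.max(), 1.0 / S, 0.0)
    return Vt.T @ (S_inv[:, None] * U.T)  # V S^+ U^t

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))  # overdetermined system
b = rng.standard_normal(8)

x = pinv_via_svd(A) @ b               # least squares via pseudo-inverse
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_ref)          # agrees with a direct solver
```
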