Re: [eigen] LeastSquares, Pseudo Inverse & Geometry |
That could be a difference, but I can't find information about using eigenvalue problems to fit hyperplanes...
On Wikipedia, neither the total nor the ordinary least squares page mentions eigenvalue methods, but
http://en.wikipedia.org/wiki/Simple_linear_regression at least says "The fitted line has the slope equal to the correlation between y and x corrected by the ratio of standard deviations of these variables. The intercept of the fitted line is such that it passes through the center of mass (x, y) of the data points.", which looks very similar to what our current "LeastSquares" does.
So this would be the same as what you call OLS, and the same as what the general least squares I mentioned would do...?
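Just to check the Wikipedia claim numerically: here is a small self-contained sketch (plain Python, not Eigen) of the quoted formula, where the OLS slope is the correlation of x and y scaled by the ratio of their standard deviations, and the intercept forces the line through the centre of mass of the data. The data set is made up for illustration.

```python
# Illustration of the Wikipedia formula for simple linear regression (OLS):
#   slope = r * (sy / sx), and the line passes through the centroid (mx, my).
import math

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1, so the fit should be exact
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)

r = cov / (sx * sy)            # correlation coefficient
slope = r * sy / sx            # equivalently cov / sx**2
intercept = my - slope * mx    # forces the line through (mx, my)
print(slope, intercept)        # 2.0 1.0
```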
Benoit, according to the log, you're the one who wrote that... any thoughts? :0)
++
Thomas
--
Thomas Capricelli <orzel@xxxxxxxxxxxxxxx>
http://www.freehackers.org/thomas
On Monday 25 January 2010 11:08:58 Jitse Niesen wrote:
> I seem to remember that the LeastSquares module does total least squares,
> and not ordinary least squares. At least, that is what I claimed in
> http://listengine.tuxfamily.org/lists.tuxfamily.org/eigen/2009/05/msg00158.html
>
> In two dimensions, both ordinary least squares (OLS) and total least
> squares (TLS) fit a line y = ax+b through some data (x_i,y_i). OLS chooses
> the line such that sum_i (ax_i+b-y_i)^2 is minimized; that is the sum of
> the squares of the distance along the y-direction between the data points
> and the line. TLS minimizes the sum of the squares of the perpendicular
> distance from each data point to the line.
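On the eigenvalue connection: one standard way to solve TLS (this is an assumption about what the module does, not a statement about Eigen's actual code) is to center the data and take the eigenvector of the smallest eigenvalue of the 2x2 scatter matrix as the line's normal. A minimal plain-Python sketch comparing the two fits, with made-up data chosen so the OLS and TLS slopes differ:

```python
# Hedged sketch (not Eigen's API): OLS vs TLS line fits in 2D.
# TLS is posed as an eigenvalue problem: the normal of the best line is the
# eigenvector of the smallest eigenvalue of the scatter matrix of the
# centered data. For a symmetric 2x2 matrix the eigenpair has a closed form.
import math

def fit_lines(pts):
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    syy = sum((y - my) ** 2 for _, y in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)

    # OLS: minimize vertical residuals; slope = Sxy / Sxx.
    ols_slope = sxy / sxx
    ols_intercept = my - ols_slope * mx   # line goes through the centroid

    # TLS: minimize perpendicular residuals. Smallest eigenvalue of the
    # scatter matrix [[Sxx, Sxy], [Sxy, Syy]], closed form for 2x2.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam_min = (tr - math.sqrt(tr * tr - 4 * det)) / 2
    # Eigenvector for lam_min is the line's normal (assumes Sxy != 0).
    nx_, ny_ = sxy, lam_min - sxx
    tls_slope = -nx_ / ny_
    tls_intercept = my - tls_slope * mx
    return (ols_slope, ols_intercept), (tls_slope, tls_intercept)

pts = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0), (3.0, 3.0)]
(ols_a, ols_b), (tls_a, tls_b) = fit_lines(pts)
print(ols_a, ols_b)   # 0.8 0.3
print(tls_a, tls_b)   # 1.0 0.0
```

With this symmetric data set (Sxx == Syy) the TLS slope is exactly 1 while OLS gives 0.8, which matches the general fact that the two criteria disagree unless the points are exactly collinear.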