Re: [eigen] Eigenvalues of (lower) Hessenberg matrix


*To*: eigen@xxxxxxxxxxxxxxxxxxx
*Subject*: Re: [eigen] Eigenvalues of (lower) Hessenberg matrix
*From*: Ian Bell <ian.h.bell@xxxxxxxxx>
*Date*: Fri, 7 Apr 2017 18:40:27 -0600

Ok, so I got that mostly working (I can find all the real roots), but it also gives me a number of spurious values in addition to the real roots I am looking for; I believe these arise from the complex eigenvalues. How do I discard the entries that correspond to complex eigenvalues? The code I used is

Eigen::RealSchur<Eigen::MatrixXd> schur;
schur.computeFromHessenberg(Abalanced.transpose(), Eigen::MatrixXd::Zero(Abalanced.rows(), Abalanced.cols()), false);
Eigen::VectorXd eigs = schur.matrixT().diagonal();

I haven't done a speed benchmark yet, but I will do so once I figure out how to keep the roots I want.

Kind Regards,

Ian

On Fri, Apr 7, 2017 at 8:35 AM, Ian Bell <ian.h.bell@xxxxxxxxx> wrote:

Gael,

I'm afraid that a patch is probably not forthcoming from me, sadly, as I don't know the guts of how the eigenvalue solver operates. On the other hand, I could contribute my matrix balancing routine, without which the eigenvalue solver is totally useless on my problem. For computeFromHessenberg, do you have to operate on fixed-size matrices? I gave the example of a 16x16 matrix, but in general they are dynamically sized.

Ian

On Thu, Apr 6, 2017 at 2:00 PM, Gael Guennebaud <gael.guennebaud@xxxxxxxxx> wrote:

Hi,

I really thought EigenSolver had a computeFromHessenberg shortcut, but it looks like it is still missing, though it would be easy to add (similar to SelfAdjointEigenSolver::computeFromTridiagonal, which does exist!). Patch welcome. In the meantime, since you only care about the eigenvalues and they are real, you can use RealSchur::computeFromHessenberg(H, Matrix<double,16,16>::Zero(), false) and then RealSchur::matrixT().diagonal() to get the real eigenvalues. If they are not all real, then implementing EigenSolver::computeFromHessenberg based on EigenSolver::compute() and RealSchur::computeFromHessenberg() would be the easiest.

gael

On Thu, Apr 6, 2017 at 4:16 PM, Ian Bell <ian.h.bell@xxxxxxxxx> wrote:

Nope, using dynamically sized arrays; for 16x16, Eigen is about twice as slow as the LAPACK routine tuned for Hessenberg matrices.

On Wed, Apr 5, 2017 at 1:12 AM, Julian Kent <jkflying@xxxxxxxxx> wrote:

Hi Ian,

Just to check, are you using fixed-size matrices? This might let the compiler do considerably more in terms of loop unrolling and SIMD, and could account for the factor-of-2 difference you see with LAPACK.

Cheers,
Julian

On 5 April 2017 at 00:07, Ian Bell <ian.h.bell@xxxxxxxxx> wrote:

Yixuan,

That's certainly much better - on par at these sizes now with the naive eigenvalues() function in Eigen. I guess if you are focused on larger matrices (as most people seem to be), my matrices are rather tiny, and perhaps less interesting.

For sure I took many replicates - the timings were actually carried out in a python library built with pybind11 on top of your code, Eigen, and some of my own code. The library is for working with Chebyshev expansions of continuous functions and doing rootfinding with them. I'm relatively familiar with the pitfalls of profiling, so I think this is a pretty fair test.

For future reference, how does one select "computeEigenvectors = false" in EigenSolver? For these sizes, there just isn't much more to do in Eigen-based eigenvalue solving. The solver in LAPACK for Hessenberg matrices is (very roughly) about two times faster for 16x16 matrices. So this is not the end, I guess, but it might be the end of my capabilities in the world of linear algebra.

Kind Regards, and thanks for your help,

Ian

On Tue, Apr 4, 2017 at 2:47 PM, Yixuan Qiu <yixuanq@xxxxxxxxx> wrote:

Hi Ian,

Given that your matrices are at a very small scale, I don't expect my class to show a visible improvement. Actually you found it to be slower, which I think may be due to several reasons:

1. My class does some basic scaling of the matrix before factorization.
2. Did you select "computeEigenvectors = false" in EigenSolver? My class assumes computing eigenvectors, so some calculations are unneeded in your case. You can try the reduced version attached.
3. Did you randomize the order of execution? Did you have enough replications of the experiment? Was the difference statistically significant relative to the standard error? Personally I don't fully trust benchmark results measured by elapsed time at the scale of microseconds. I would recommend Callgrind as the profiler for such tasks.

Hope this helps.

Best,
Yixuan

2017-04-04 15:40 GMT-04:00 Ian Bell <ian.h.bell@xxxxxxxxx>:

Sadly, it seems to be consistently slower than the naive eigenvalues method of the MatrixXd class (see attached). Any idea why? Have you done any performance benchmarking with the UpperHessenberg code?

On Mon, Apr 3, 2017 at 8:16 PM, Ian Bell <ian.h.bell@xxxxxxxxx> wrote:

Amazing! I'll try that tomorrow at work. From your testing, how long approximately would it take to find the eigenvalues of a 10x10 Hessenberg matrix?

On Mon, Apr 3, 2017 at 8:00 PM, Yixuan Qiu <yixuanq@xxxxxxxxx> wrote:

Eigen implements the upper Hessenberg eigen solver internally in the EigenSolver class. I have extracted the relevant code and created an independent class in my Spectra library, so probably this is what you want: https://github.com/yixuan/spectra/blob/master/include/LinAlg/UpperHessenbergEigen.h You can drop the namespace declaration if you want. The usage of this class is similar to other eigen solvers in Eigen: https://github.com/yixuan/spectra/blob/master/test/Eigen.cpp#L27-L29

Best,
Yixuan

2017-04-03 20:59 GMT-04:00 Ian Bell <ian.h.bell@xxxxxxxxx>:

I have a matrix that by its construction is known to be Hessenberg (rigorously, lower Hessenberg, but it doesn't matter because the transpose of a matrix has the same eigenvalues as the original, and all I care about is eigenvalues). Is there any magic trick in Eigen that allows for more efficient evaluation of eigenvalues? The standard method eigenvalues() doesn't seem to do anything smart about checking for the Hessenberg-ness of the matrix. LAPACK has the function dhseqr; is there anything similar in Eigen?

Ian


