Re: [eigen] banded matrices in Eigen
- To: eigen@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [eigen] banded matrices in Eigen
- From: Laura Flores <laura.floresanchez@xxxxxxxxx>
- Date: Thu, 13 Feb 2014 12:26:37 +0100
Hi Gael,
Apologies for the confusion. ThreadedConjugateSolver is a custom class of mine, derived from Eigen's ConjugateGradient solver.
Concerning the matrices: the problem is that my off-diagonals are fully populated, and representing them with sparse matrices implies a per-entry index overhead that I would like to avoid.
BandMatrix comes very close to what I am looking for, but the problem, as far as I have understood, is that it stores all the super/sub-diagonals up to a given bandwidth.
In my case that is simply not feasible. As an example, I would have a square matrix of size 9e6 in which I only need to store 4 diagonals: the main one and those with offsets 1, 300 and 90000, for instance.
(In the general case there is always an uppermost diagonal considerably far away from the rest.) Storing all the intermediate diagonals, which would be full of zeros, is therefore out of the question: a bandwidth of 90000 would mean on the order of 9e6 × 90001 stored entries (several terabytes in double precision) instead of 4 × 9e6 (under 300 MB).
If I could create my own class, derived from one of the existing ones (possibly the Diagonal one?), I could add the methods that suit my problem. My question, therefore, is whether there are any suggestions for that approach.
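Concretely, what I have in mind is something like the following minimal sketch (all names are hypothetical, this is not existing Eigen API): a class that stores only a chosen set of diagonals densely and implements the symmetric matrix-vector product that a conjugate-gradient iteration needs.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Sketch of a multi-diagonal storage scheme (not an Eigen class).
// Stores only the selected upper diagonals of an n-by-n symmetric
// matrix, e.g. offsets {0, 1, 300, 90000}; diagonal with offset k
// holds entries A(i, i + k) for i = 0 .. n - k - 1. All offsets are
// assumed to satisfy 0 <= k < n.
struct MultiDiagonalMatrix {
    std::size_t n;
    std::vector<std::size_t> offsets;        // distinct upper offsets
    std::vector<std::vector<double>> diags;  // diags[d][i] = A(i, i + offsets[d])

    MultiDiagonalMatrix(std::size_t n_, std::vector<std::size_t> offs)
        : n(n_), offsets(std::move(offs)) {
        for (std::size_t k : offsets)
            diags.emplace_back(n - k, 0.0);  // one dense vector per stored diagonal
    }

    // y = A * x, exploiting symmetry: each stored upper diagonal also
    // stands for the matching sub-diagonal (the usual CG setting).
    std::vector<double> multiply(const std::vector<double>& x) const {
        std::vector<double> y(n, 0.0);
        for (std::size_t d = 0; d < offsets.size(); ++d) {
            const std::size_t k = offsets[d];
            for (std::size_t i = 0; i + k < n; ++i) {
                y[i] += diags[d][i] * x[i + k];      // upper-triangular part
                if (k != 0)
                    y[i + k] += diags[d][i] * x[i];  // mirrored lower part
            }
        }
        return y;
    }
};
```

If I understand the documentation correctly, a class like this could then be hooked into ConjugateGradient through Eigen's matrix-free solver mechanism (a wrapper deriving from EigenBase that exposes the product), so the solver never needs the full matrix.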
Thank you,
Laura