2015
DOI: 10.1016/j.cam.2015.04.019
Jacobi–Davidson methods for polynomial two-parameter eigenvalue problems

Cited by 11 publications (9 citation statements) · References 27 publications
“…Subspace methods for one-parameter eigenvalue problems are based on generating a series of linear spaces that eventually approximate one of the system's eigenspaces (the linear space of eigenvectors corresponding to a given eigenvalue). The Jacobi–Davidson and Rayleigh–Ritz methods are well-known one-parameter subspace methods, which can be generalized to apply to two-parameter systems [26,32,49–51], though this is not without difficulties [49]. These methods do not invoke the operator determinants, and show potential for generalization both to 𝑁-parameter and to polynomial systems.…”
Section: Alternative Solution Methods
confidence: 99%
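The subspace idea summarized in the excerpt above — grow a linear space and extract approximate eigenpairs from it via projection — can be illustrated for the one-parameter case. The following is a minimal Rayleigh–Ritz sketch with a basic Davidson-type expansion (not the paper's Jacobi–Davidson method; the toy matrix and target are made up for illustration):

```python
import numpy as np

def rayleigh_ritz_subspace(A, target, max_iter=50, tol=1e-10):
    """Grow an orthonormal basis V whose Ritz pairs approximate an
    eigenpair of the symmetric matrix A near `target`."""
    rng = np.random.default_rng(1)
    v = rng.standard_normal(A.shape[0])
    V = (v / np.linalg.norm(v)).reshape(-1, 1)
    for _ in range(max_iter):
        H = V.T @ A @ V                      # Rayleigh-Ritz projection
        thetas, S = np.linalg.eigh(H)
        i = np.argmin(np.abs(thetas - target))
        theta, u = thetas[i], V @ S[:, i]    # Ritz value and Ritz vector
        r = A @ u - theta * u                # residual
        if np.linalg.norm(r) < tol:
            break
        r -= V @ (V.T @ r)                   # orthogonalize the new direction
        nr = np.linalg.norm(r)
        if nr < tol:
            break                            # subspace is numerically invariant
        V = np.column_stack([V, r / nr])
    return theta, u

# Toy usage: diagonal matrix with spectrum 1..50, targeting the top eigenvalue
A = np.diag(np.arange(1.0, 51.0))
theta, u = rayleigh_ritz_subspace(A, target=60.0)
```

Jacobi–Davidson replaces the plain residual expansion here with an (approximately solved) correction equation, which is what makes interior eigenvalues near a target tractable.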
“…We present some numerical examples obtained with Matlab. Several successful experiments with several types of multiparameter eigenvalue problems have been carried out and described in [16,12,14,26,13]. Therefore, we concentrate mostly on the new use for polynomial eigenvalue problems.…”
Section: Comparison With Other Approaches
confidence: 99%
“…6.1(b). The eigenvalues are detected after 10, 12, 14, 16, 78, 96, 105, 118, 133, 147, 165, and 178 iterations. Elegantly, when we sort the eigenvalues with respect to distance to the target, these are eigenvalues number 1 through 12, in this order!…”
Section: Comparison With Other Approaches
confidence: 99%
“…We compare our method with two existing approaches, Mathematica's NSolve [48] and PHCpack [46] in Section 7, and show that our approach is competitive for polynomials up to degree ≲ 10. Let us mention that another advantage of writing the system of bivariate polynomials as a two-parameter eigenvalue problem is that then we can apply iterative subspace numerical methods such as the Jacobi–Davidson method and compute just a small part of zeros close to a given target (x₀, y₀) [18]; we will not pursue this approach in this paper.…”
Section: Introduction
confidence: 99%
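For context, the operator determinants mentioned in the excerpts above can be formed explicitly for a small linear two-parameter problem (A₁ − λB₁ − μC₁)x = 0, (A₂ − λB₂ − μC₂)y = 0: the coupled pencils Δ₁z = λΔ₀z, Δ₂z = μΔ₀z on the tensor-product space recover all eigenvalue pairs. A sketch, with random matrices standing in for a real problem (subspace methods like Jacobi–Davidson avoid forming these n²×n² matrices):

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 2
A1, B1, C1, A2, B2, C2 = (rng.standard_normal((n, n)) for _ in range(6))

# Operator determinants on the n^2-dimensional tensor-product space:
# Delta1 z = lam * Delta0 z and Delta2 z = mu * Delta0 z, with z = x (x) y.
Delta0 = np.kron(B1, C2) - np.kron(C1, B2)
Delta1 = np.kron(A1, C2) - np.kron(C1, A2)
Delta2 = np.kron(B1, A2) - np.kron(A1, B2)

lams, Z = eig(Delta1, Delta0)
residuals = []
for lam, z in zip(lams, Z.T):
    # Generically the two pencils share eigenvectors, so Delta0^{-1} Delta2 z = mu z:
    w = np.linalg.solve(Delta0, Delta2 @ z)
    k = np.argmax(np.abs(z))
    mu = w[k] / z[k]
    residuals.append(max(abs(np.linalg.det(A1 - lam * B1 - mu * C1)),
                         abs(np.linalg.det(A2 - lam * B2 - mu * C2))))
```

Each of the n² = 4 recovered pairs (λ, μ) makes both determinants vanish (to rounding), confirming the linearization; for polynomial two-parameter problems the same idea applies after linearizing in each parameter, at the cost of much larger Δ matrices.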