2006
DOI: 10.1016/j.amc.2006.01.071
An efficient algorithm for the least-squares reflexive solution of the matrix equation A1XB1=C1, A2XB2=C2

Cited by 33 publications (18 citation statements); references 10 publications. Citing publications span 2008–2018.
“…Recently, some finite iterative algorithms have also been developed to solve coupled matrix equations. In [9], an algorithm was constructed to find the solution of the matrix equation (AXB, GXH) = (C, D) that is reflexive with respect to a generalized reflection matrix P. With this algorithm, a solution can be obtained within finitely many iteration steps for any initial reflexive matrix.…”
Section: Introduction (mentioning)
confidence: 99%
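For context, the constraint this statement refers to can be stated precisely. The following restates the standard definitions of a generalized reflection matrix and a reflexive matrix, together with the coupled equation mentioned above; it paraphrases common usage rather than quoting [9]:

```latex
% Standard definitions (not quoted from [9]):
% a generalized reflection matrix P and a P-reflexive matrix X.
\[
  P = P^{T}, \qquad P^{2} = I, \qquad
  X \ \text{is reflexive w.r.t.}\ P \iff X = PXP .
\]
% The coupled equation considered in [9]:
\[
  (AXB,\; GXH) = (C,\; D), \qquad X = PXP .
\]
% Orthogonal projection of an arbitrary Z onto the subspace of P-reflexive
% matrices; applying it to updates keeps a gradient-type iteration inside
% the constraint set at every step:
\[
  \Pi_{P}(Z) = \tfrac{1}{2}\bigl(Z + PZP\bigr).
\]
```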
“…Least-squares solutions of linear matrix equations, or of coupled linear matrix equations, have also been well studied in the literature in the past few years. Conjugate-gradient-based iterative methods are developed in [26,27], respectively, for solving the least-squares symmetric solution of the linear matrix equation AXB = C and the least-squares reflexive solution of the matrix equations…”
Section: Introduction (mentioning)
confidence: 99%
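To make this kind of method concrete, below is a minimal matrix-form conjugate-gradient sketch for the least-squares reflexive solution of the pair A1XB1 = C1, A2XB2 = C2 (the equations in the title), under the assumption that X is square and P is a reflection matrix of matching size. The function name, the zero starting matrix, and the stopping rule are illustrative choices; this is not the specific algorithm of [26], [27], or the paper under review.

```python
import numpy as np

def reflexive_lstsq_cg(A1, B1, C1, A2, B2, C2, P, tol=1e-10, max_iter=500):
    """Sketch: conjugate gradients on the normal equations, restricted to the
    subspace of P-reflexive matrices (X = P X P, with P = P.T and P @ P = I),
    for minimising ||A1 X B1 - C1||_F^2 + ||A2 X B2 - C2||_F^2.
    Illustrative only; not the algorithm of the cited papers."""
    n = P.shape[0]                      # X is assumed square, n x n

    def proj(Z):
        # Orthogonal projection onto the P-reflexive subspace.
        return 0.5 * (Z + P @ Z @ P)

    def normal_op(X):
        # Self-adjoint, positive semidefinite normal-equations operator,
        # restricted to the reflexive subspace.
        return proj(A1.T @ (A1 @ X @ B1) @ B1.T + A2.T @ (A2 @ X @ B2) @ B2.T)

    X = np.zeros((n, n))
    R = proj(A1.T @ C1 @ B1.T + A2.T @ C2 @ B2.T)   # normal-equations residual
    S = R.copy()                                    # search direction
    rho = np.sum(R * R)
    for _ in range(max_iter):
        if np.sqrt(rho) <= tol:
            break
        Q = normal_op(S)
        alpha = rho / np.sum(S * Q)
        X = X + alpha * S
        R = R - alpha * Q
        rho_new = np.sum(R * R)
        S = R + (rho_new / rho) * S
        rho = rho_new
    return X
```

In exact arithmetic, a conjugate-gradient recursion of this form terminates within at most as many steps as the dimension of the reflexive subspace, which is the "finite iteration steps" property the statements above refer to.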
“…He also indicated that if A ∈ C^{m×n} is a generalized reflexive matrix and rank(A) = n, then its least-squares problem can be reduced to two independent least-squares problems for matrices of smaller dimensions. This dimension-reduction idea was applied to the singular value decomposition [2], the inverse eigenvalue problem [3], and the least-squares problem [4].…”
Section: Introduction (mentioning)
confidence: 99%
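The dimension-reduction argument mentioned here can be sketched in a few lines. Assuming A = PAQ for generalized reflection matrices P and Q (the usual definition of a generalized reflexive matrix), and writing the ±1 eigenspaces of P and Q with orthonormal columns U1, U2 and V1, V2, the standard decoupling reads:

```latex
% Assumption: A = PAQ with P = P^{*} = P^{-1} and Q = Q^{*} = Q^{-1}.
\[
  P = U_1U_1^{*} - U_2U_2^{*}, \qquad Q = V_1V_1^{*} - V_2V_2^{*},
  \qquad [U_1\ U_2],\ [V_1\ V_2]\ \text{unitary}.
\]
% A = PAQ forces the off-diagonal blocks of A in these bases to vanish:
\[
  U_1^{*}AV_2 = 0, \qquad U_2^{*}AV_1 = 0 .
\]
% Writing x = V_1 y_1 + V_2 y_2, the least-squares problem splits into two
% independent problems of smaller dimensions:
\[
  \min_x \|Ax-b\|_2^{2}
  \;=\; \min_{y_1}\|(U_1^{*}AV_1)\,y_1 - U_1^{*}b\|_2^{2}
  \;+\; \min_{y_2}\|(U_2^{*}AV_2)\,y_2 - U_2^{*}b\|_2^{2}.
\]
```

The rank(A) = n assumption in the quoted statement guarantees that each of the two reduced coefficient matrices has full column rank, so each smaller least-squares problem has a unique solution.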