2018
DOI: 10.1109/trpms.2018.2843803

QR-Factorization Algorithm for Computed Tomography (CT): Comparison With FDK and Conjugate Gradient (CG) Algorithms

Abstract: Even though QR-factorization of the system matrix for tomographic devices has already been used for medical imaging, to date, no satisfactory solution has been found for solving large linear systems, such as those used in Computed Tomography (CT) (on the order of 10^6 equations). In computed tomography, the Feldkamp, Davis and Kress back-projection algorithm (FDK) and iterative methods like conjugate gradient (CG) are the standard methods used for image reconstruction. As the image reconstruction problem can b…
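
A minimal sketch (my own toy two-view parallel-beam geometry, not the authors' scanner model) of the linear formulation the abstract refers to: each row of the system matrix W records how much every pixel contributes to one measured ray, so reconstruction amounts to solving p = W x for the flattened image x.

```python
# Toy illustration of posing CT reconstruction as a linear system p = W x.
# Real systems reach ~10^6 unknowns; here the image is only 4 x 4.
import numpy as np

N = 4                                   # N x N toy image
x = np.arange(N * N, dtype=float)       # flattened "image"

rows = []
for i in range(N):                      # view 1: horizontal rays (sum over each row)
    w = np.zeros((N, N)); w[i, :] = 1.0
    rows.append(w.ravel())
for j in range(N):                      # view 2: vertical rays (sum over each column)
    w = np.zeros((N, N)); w[:, j] = 1.0
    rows.append(w.ravel())

W = np.vstack(rows)                     # (2N) x (N^2) system matrix
p = W @ x                               # simulated projection data
print(W.shape, p)
```
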

Cited by 6 publications (4 citation statements) · References 42 publications

“…This package gives us a lot of basic sparse matrices algebraic algorithms and also provides the possibility of using BLAS threads as well as Intel Threading Building Block (TBB) that can parallelize the factorization code and allows us to exploit our hardware resources to the maximum. In works like (Rodríguez-Alvarez et al, 2018) they perform the QR factorization to reconstruct small CT images but as they mention, the method is not parallelized or optimized to reduce the fill-in of the matrix, which the implementation in this packages does. In our previous work (Chillarón et al, 2018), we did a preliminary study in which we applied the multifrontal sparse QR (SPQR) method to reconstruct phantom images, in order to determine its feasibility.…”
Section: Introduction (mentioning)
confidence: 99%
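
The fill-in this statement mentions is the creation of new nonzeros in the triangular factor R during factorization. A small hedged illustration (plain NumPy of my own, not the SuiteSparseQR/SPQR code the citing authors use) of why a fill-reducing column ordering, of the kind that package applies automatically, matters:

```python
# One dense column placed first forces R to fill in almost completely;
# moving that column last keeps R sparse. Dense QR is used only to count fill.
import numpy as np

m, n = 100, 50
A = np.zeros((m, n))
A[:, 0] = 1.0                        # one dense column, coupled to every row
for j in range(1, n):                # remaining columns touch two rows each
    A[2 * j - 2, j] = 1.0
    A[2 * j - 1, j] = 2.0

def nnz_of_R(M, tol=1e-12):
    R = np.linalg.qr(M, mode="r")
    return int(np.sum(np.abs(R) > tol))

perm = list(range(1, n)) + [0]       # fill-reducing order: dense column last
print("nnz(R), natural order :", nnz_of_R(A))
print("nnz(R), dense col last:", nnz_of_R(A[:, perm]))
```
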
“…Unlike direct methods, iterative methods often generate patchy or blocky artifacts in the reconstructed images due to overregularization [16], [20]. Therefore, direct algebraic methods such as the QR factorization [21], [22] have been explored recently. Although they usually require a greater number of views than the iterative ones (as was shown in a previous work [23]), they are much more accurate when the rank of the weights matrix is complete.…”
Section: Introduction (mentioning)
confidence: 99%
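
A hedged toy comparison of the two solver families contrasted above (SciPy stand-ins, not the cited implementations): a direct least-squares solve of a full-column-rank system versus a deliberately truncated conjugate-gradient run on the normal equations.

```python
# With a full-column-rank weights matrix, a direct solve recovers the image
# essentially to machine precision, while a short CG run can leave a visible
# residual error. Toy random data, not the papers' experiments.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

rng = np.random.default_rng(1)
n_pixels, n_rays = 400, 1200
W = sp.random(n_rays, n_pixels, density=0.05, format="csr", random_state=1)
x_true = rng.random(n_pixels)
p = W @ x_true

# Direct solve (dense lstsq stands in here for a QR-based direct method).
x_direct, *_ = np.linalg.lstsq(W.toarray(), p, rcond=None)

# Iterative solve: a deliberately short CG run on W^T W x = W^T p.
x_cg, _ = cg(W.T @ W, W.T @ p, maxiter=30)

err = lambda x: np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("direct rel. error:", err(x_direct), " CG(30 its) rel. error:", err(x_cg))
```
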
“…Therefore, direct algebraic methods such as the QR factorization [21], [22] have been explored recently. Although they usually require a greater number of views than the iterative ones (as was shown in a previous work [23]), they are much more accurate when the rank of the weights matrix is complete.…”
Section: Introduction (mentioning)
confidence: 99%
“…Using iterative methods, we can reduce the number of views or projections that we use to a really small number if we make a good selection of the projection angles [9,10,6]. However, with the direct methods [11,7], we have reached the conclusion that it is necessary that the matrix of the CT system has full rank, which will determine the number of projections required according to the image resolution that you want to achieve. Regardless, both approaches need a lower number of X-ray projections than other methods.…”
mentioning
confidence: 99%
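
To make the full-rank condition above concrete, here is a rough sketch using a nearest-bin toy projector of my own (not the system model of the cited works): it reports how the rank of the system matrix grows as views are added, toward the N*N unknowns a direct method needs for full rank.

```python
# Rank of a toy CT system matrix versus number of projection views.
import numpy as np

def system_matrix(N, angles_deg):
    """One row per (view, detector bin); pixel centers binned along each view."""
    ii, jj = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    xc, yc = jj - (N - 1) / 2.0, (N - 1) / 2.0 - ii      # pixel-center coordinates
    n_bins = int(np.ceil(np.sqrt(2) * N))                 # detector bins per view
    rows = []
    for theta in np.deg2rad(angles_deg):
        s = xc * np.cos(theta) + yc * np.sin(theta)       # projected coordinate
        b = np.clip(np.round(s + n_bins / 2).astype(int), 0, n_bins - 1)
        for k in range(n_bins):
            rows.append((b == k).ravel().astype(float))
    return np.vstack(rows)

N = 8
for n_views in (2, 4, 8, 16, 32):
    W = system_matrix(N, np.linspace(0.0, 180.0, n_views, endpoint=False))
    print(f"{n_views:3d} views -> rank {np.linalg.matrix_rank(W)} of {N * N}")
```
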