2012
DOI: 10.1002/mrm.24577

Accelerating MR parameter mapping using sparsity‐promoting regularization in parametric dimension

Abstract: MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared with conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, the inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work…


Cited by 119 publications (132 citation statements)
References 45 publications
“…1 The nuclear norm is applied to each matrix with regularization parameter λ and the result is summed. LLR regularization exploits spatial correlations in the temporal image coefficients, providing substantial dimensionality reduction beyond the capabilities of joint wavelet regularization or finite differences (21, 32, 44, 46). In effect, LLR constrains local image patches into a smaller space within the subspace Φ_K, which compresses the representation to fewer than K coefficients per voxel.…”
Section: Theory
Mentioning, confidence: 99%
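To make the locally low-rank (LLR) construction in this excerpt concrete, here is a minimal sketch of the patch-wise singular-value soft-thresholding that realizes the summed nuclear-norm penalty. The function name, patch size, threshold value, and the (nx, ny, K) coefficient-array layout are illustrative assumptions, not details from the cited papers.

```python
# Sketch: locally low-rank (LLR) proximal step. Each local patch of the
# subspace coefficient images is reshaped into a (voxels x K) matrix,
# its singular values are soft-thresholded by lam (the nuclear-norm
# regularization parameter), and the patch is rebuilt.
import numpy as np

def llr_prox(coeffs, patch=8, lam=0.05):
    """coeffs: (nx, ny, K) temporal subspace coefficient images."""
    nx, ny, K = coeffs.shape
    out = np.empty_like(coeffs)
    for x in range(0, nx, patch):
        for y in range(0, ny, patch):
            block = coeffs[x:x + patch, y:y + patch, :]
            mat = block.reshape(-1, K)
            U, s, Vh = np.linalg.svd(mat, full_matrices=False)
            s = np.maximum(s - lam, 0.0)  # soft-threshold singular values
            out[x:x + patch, y:y + patch, :] = ((U * s) @ Vh).reshape(block.shape)
    return out
```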
“…Furthermore, the effects of different trade-offs in bias and variance for different approaches to exploiting the tensor structure need to be investigated for particular applications, which is beyond the scope of this paper. Comparison of low-rank tensor imaging to alternative high-dimensional imaging methods should also be investigated on an application-by-application basis (e.g., comparison to other multiparameter mapping methods [61], [62]).…”
Section: Discussion
Mentioning, confidence: 99%
“…Prior information, mostly image sparsity [7] and spatiotemporal partial separability [8], has been exploited to constrain the solution space of the desired image function recovered from undersampled data. Based on similar assumptions (i.e., sparsity and partial separability), a number of sparse reconstruction methods [9–15] have been developed, with variations in the image model, sparsifying transform, regularization, etc. Specifically, the authors of [9] proposed learning an overcomplete dictionary to sparsify the signal.…”
Section: Introduction
Mentioning, confidence: 99%
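For context, a minimal sketch of the kind of sparsity-constrained reconstruction these citations refer to: one proximal-gradient (ISTA) iteration that enforces consistency with undersampled k-space and then applies an l1 shrinkage step. The 2D FFT sampling model, the step size, and thresholding directly in the image domain (in place of a wavelet or learned-dictionary transform) are simplifying assumptions.

```python
# Sketch: one ISTA iteration for sparsity-regularized reconstruction of
# an undersampled image series. Thresholding is applied in the image
# domain for brevity; a sparsifying transform would normally come first.
import numpy as np

def ista_step(img, kspace, mask, lam=0.01, step=1.0):
    """img: current estimate (nx, ny, T); kspace, mask: same shape."""
    # Gradient of the data-consistency term ||M F x - y||^2; with the
    # orthonormal FFT, the adjoint of F is the inverse FFT.
    resid = mask * (np.fft.fft2(img, axes=(0, 1), norm="ortho") - kspace)
    x = img - step * np.fft.ifft2(resid, axes=(0, 1), norm="ortho")
    # Complex soft-thresholding (l1 proximal map)
    mag = np.abs(x)
    return x * (np.maximum(mag - step * lam, 0.0) / np.maximum(mag, 1e-12))
```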
“…The approach was verified in T1 and T2 mapping in the brain with highly reduced data. The study in [10] used the smoothness of the signal evolution in the parametric dimension to accelerate variable flip angle T1 mapping. A similar idea was developed in [13] to enable fast T1 mapping of the mouse heart.…”
Section: Introduction
Mentioning, confidence: 99%
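To illustrate the smoothness-in-the-parametric-dimension prior mentioned in this excerpt, a hypothetical sketch of a quadratic second-difference penalty along the flip-angle axis, with its gradient for use in an iterative solver. The weighting and the real-valued image series are assumptions made for illustration only.

```python
# Sketch: quadratic smoothness penalty on the signal evolution along
# the parametric (flip angle) dimension, of the kind used to regularize
# variable flip angle T1 mapping. Assumes a real-valued image series.
import numpy as np

def parametric_smoothness(series, weight=1.0):
    """series: (nx, ny, n_flip_angles). Returns penalty and gradient."""
    # Second-order finite differences along the last (parametric) axis
    d2 = series[..., :-2] - 2 * series[..., 1:-1] + series[..., 2:]
    penalty = weight * np.sum(d2 ** 2)
    # Gradient of weight * ||D2 s||^2, i.e., 2 * weight * D2^T (D2 s)
    grad = np.zeros_like(series)
    grad[..., :-2] += 2 * weight * d2
    grad[..., 1:-1] -= 4 * weight * d2
    grad[..., 2:] += 2 * weight * d2
    return penalty, grad
```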