2022
DOI: 10.1002/mrm.29547
An untrained deep learning method for reconstructing dynamic MR images from accelerated model‐based data

Abstract: Purpose: To implement physics‐based regularization as a stopping condition in tuning an untrained deep neural network for reconstructing MR images from accelerated data. Methods: The ConvDecoder (CD) neural network was trained with a physics‐based regularization term incorporating the spoiled gradient echo equation that describes variable‐flip angle data. Fully‐sampled variable‐flip angle k‐space data were retrospectively accelerated by factors of R = {8, 12, 18, 36} and reconstructed with CD, CD with the propos…
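The spoiled gradient echo equation the abstract refers to is the standard variable‐flip‐angle SPGR steady‐state signal model. A minimal sketch (function and parameter names are illustrative, not taken from the paper):

```python
import numpy as np

def spgr_signal(flip_angle_rad, M0, T1_ms, TR_ms):
    """Spoiled gradient echo (SPGR) steady-state signal:
    S = M0 * sin(a) * (1 - E1) / (1 - E1 * cos(a)),  with E1 = exp(-TR / T1).
    """
    E1 = np.exp(-TR_ms / T1_ms)
    a = flip_angle_rad
    return M0 * np.sin(a) * (1.0 - E1) / (1.0 - E1 * np.cos(a))

# Simulated variable-flip-angle series for one tissue (T1 = 1000 ms, TR = 10 ms).
angles = np.deg2rad([2, 5, 10, 15, 20])
signals = spgr_signal(angles, M0=1.0, T1_ms=1000.0, TR_ms=10.0)
```

A physics‐based regularizer of the kind the paper describes penalizes deviation of the reconstructed image series from this signal model across flip angles.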

Cited by 4 publications (7 citation statements)
References 74 publications
“…When to stop the iterative process of the UNN on blade reconstruction needs more study. The recent work (Slavkova et al 2023) combining the UNN with a physics-based regularization loss may be helpful in determining the optimal stopping point for each blade reconstruction and achieving the optimal PROPELLER UNN image quality.…”
Section: Discussion
confidence: 99%
“…Additionally, our UNN can be integrated with model-based reconstruction methods. This unique feature allows us to determine the optimal training stopping point without relying on fully-sampled ground-truth data (Slavkova et al 2023).…”
Section: Introduction
confidence: 99%
“…∥·∥₁ is a sparsity-promoting regularizer. Instead of relying on the sparsity assumption of CS, the UNN has proven effective by parameterizing the image through a carefully designed deep neural network (Lempitsky et al 2018, Darestani and Heckel 2021, Slavkova et al 2023). Then, equation (7) can be reformulated in k-space as follows…”
Section: Latent Image Characterization
confidence: 99%
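The k-space formulation the citing authors allude to fits the generic untrained-network objective min_θ ‖M F G_θ(z) − y‖², where M is the sampling mask, F the Fourier transform, and G_θ an untrained generator with fixed input z. A minimal sketch of the data-consistency part (for brevity the image x is optimized directly; in the UNN setting x would be G_θ(z) and descent would be over θ; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth 1-D "image" and a random undersampling mask in k-space.
n = 64
x_true = np.zeros(n)
x_true[20:40] = 1.0
mask = rng.random(n) < 0.5               # keep roughly half the k-space samples
y = mask * np.fft.fft(x_true)            # accelerated (masked) measurements

# Data-consistency objective f(x) = ||M F x - y||^2, minimized by gradient descent.
x = np.zeros(n)
lr = 0.5
for _ in range(200):
    resid = mask * np.fft.fft(x) - y
    # Descent direction Re(F^H M^H resid); constant factors absorbed into lr.
    x = x - lr * np.real(np.fft.ifft(mask * resid))
```

Without a regularizer this converges to a data-consistent solution that carries no information at unsampled frequencies; the implicit prior of the untrained network (or the paper's physics-based term) is what fills that gap.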
“…Unsupervised generative convolutional neural networks (UNNs), which require no training data, have made promising strides in approximating low-level statistical priors, relying on the neural network structure itself for sparse regularization (Heckel and Hand 2018, Lempitsky et al 2018, Slavkova et al 2023). UNNs have made significant progress in effectively addressing the challenging inverse problem of MR image reconstruction (Darestani and Heckel 2021, Liu et al 2023).…”
Section: Introduction
confidence: 99%