2021
DOI: 10.1016/j.neuroimage.2020.117366
Uncertainty modelling in deep learning for safer neuroimage enhancement: Demonstration in diffusion MRI

Cited by 89 publications (83 citation statements)
References 53 publications
“…Alternatives or complements to the fully-connected DNN architecture should also be explored. Promising avenues include the use of dropout (Gal and Ghahramani, 2016;Tanno et al, 2021) or deep ensemble strategies (Lakshminarayanan et al, 2016;Qin et al, 2021) as a means to derive uncertainty metrics, the use of network structures inspired by non-learning-based iterative fitting frameworks (Ye, 2017), or use denoising networks (Fadnavis et al, 2020;Wang et al, 2019) to minimize the amount of noise present in the data that is supplied to the function-fitting DNN. However, the sensitivity analysis presented here should be applied to any new fitting strategy to test specificity to change in single model parameters.…”
Section: Discussion
confidence: 99%
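The citing passage above names Monte Carlo dropout (Gal and Ghahramani, 2016; Tanno et al., 2021) as one way to derive uncertainty metrics. As a minimal NumPy sketch of that idea only, not the cited papers' implementations: dropout is kept active at inference time, the network is run many times, and the spread across stochastic forward passes is read as a model-uncertainty estimate. The toy two-layer network and all weights here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": one hidden layer with fixed random weights. Monte Carlo
# dropout keeps the dropout mask active at inference, so every forward
# pass through the same input is stochastic.
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p_drop=0.5):
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)        # inverted-dropout rescaling
    return h @ W2

x = rng.normal(size=(1, 8))
samples = np.stack([forward(x) for _ in range(100)])  # T stochastic passes

prediction = samples.mean(axis=0)   # predictive mean over passes
uncertainty = samples.std(axis=0)   # spread across passes ~ model uncertainty
```

Inputs where the per-pass predictions disagree strongly get a large `uncertainty`, which is the signal such methods use to flag potentially unsafe enhancements.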
“…Furthermore, our findings point out the importance of computing uncertainty, cf. [41], in MLbased estimation, particularly when ML is used to compensate for lower quality data.…”
Section: Summary Of Results
confidence: 99%
“…However, blindly trusting the result (output) of such a model risks different undetected failures (e.g., removal of structures and most important features) [25]. As expressed by Tanno et al [26] these predictive failures have two main reasons:…”
Section: Uncertainty Quantification In Deep Learning
confidence: 99%
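The first citing passage also mentions deep ensembles (Lakshminarayanan et al., 2016; Qin et al., 2021) as an alternative route to uncertainty. A minimal sketch of the ensemble principle, under the assumption of toy linear models fitted to bootstrap resamples rather than independently trained deep networks: several models are fitted separately, and their disagreement on a new input serves as the uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: noisy linear relationship y = X w_true + noise.
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

# Ensemble idea in miniature: fit M independent members (least-squares
# fits on bootstrap resamples) and treat their disagreement as uncertainty.
M = 10
members = []
for _ in range(M):
    idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    members.append(w)

x_new = rng.normal(size=(1, 3))
preds = np.array([x_new @ w for w in members])  # one prediction per member
mean = preds.mean()    # ensemble prediction
spread = preds.std()   # member disagreement ~ uncertainty
```

Deep ensembles apply the same recipe with independently initialized and trained networks, which is why they need no change to the network architecture itself.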