2020
DOI: 10.1002/mrm.28470

Extracting diffusion tensor fractional anisotropy and mean diffusivity from 3‐direction DWI scans using deep learning

Abstract: Purpose: To develop and evaluate machine‐learning methods that reconstruct fractional anisotropy (FA) values and mean diffusivities (MD) from 3‐direction diffusion MRI (dMRI) acquisitions. Methods: Two machine‐learning models were implemented to map undersampled dMRI signals to high‐quality FA and MD maps that were reconstructed from fully sampled DTI scans. The first model was a previously described multilayer perceptron (MLP), which maps signals to FA/MD values from a single voxel. The second was a convolut…
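The FA and MD targets described in the abstract are conventionally derived in closed form from the eigenvalues of a voxel's fitted diffusion tensor. As a point of reference for what the networks learn to predict, here is a minimal sketch of those standard definitions (the function name and eigenvalue inputs are illustrative, not from the paper):

```python
import numpy as np

def fa_md_from_eigenvalues(evals):
    """Compute fractional anisotropy (FA) and mean diffusivity (MD)
    from the three eigenvalues of a diffusion tensor."""
    l1, l2, l3 = evals
    md = (l1 + l2 + l3) / 3.0                      # mean diffusivity
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    # FA is the normalized standard deviation of the eigenvalues
    fa = np.sqrt(1.5 * num / den) if den > 0 else 0.0
    return fa, md
```

For an isotropic tensor (equal eigenvalues) FA is 0; for a maximally anisotropic tensor with a single nonzero eigenvalue FA is 1.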

Cited by 16 publications (12 citation statements). References 20 publications (41 reference statements).
“…Recent works that leverage supervised ML for model parameter estimation in qMRI typically employ one of two training data distributions: (1) parameter combinations obtained from traditional model fitting and the corresponding measured qMRI signals, 4,6,9,11,[14][15][16][17] or (2) parameters sampled uniformly from the entire plausible parameter space with simulated qMRI signals. 5,[18][19][20][21][22][23][24] While (1) uses parameter combinations directly estimated from the data so likely quantifies the model parameters with higher accuracy and precision for a given specific dataset, (2) supports choice of training data distribution, which may help improve generalizability and avoid problems arising from imbalance.…”
Section: Introduction (confidence: 99%)
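The second training-data strategy in the quoted passage, sampling parameters uniformly over the plausible range and simulating the corresponding signals, can be sketched for a toy mono-exponential dMRI model. The b-value, diffusivity range, and axis-aligned gradient directions below are illustrative assumptions, not taken from any of the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_training_pairs(n, bval=1000.0):
    """Strategy (2): draw diffusivities uniformly over a plausible range
    (here 0 to 3e-3 mm^2/s, an assumed range) and simulate noiseless
    mono-exponential DWI signals along three orthogonal axes."""
    evals = rng.uniform(0.0, 3e-3, size=(n, 3))   # per-axis diffusivities
    signals = np.exp(-bval * evals)               # S/S0 along each axis
    return signals, evals
```

The resulting (signal, parameter) pairs are uniformly distributed over parameter space by construction, which is the balance property the quoted passage attributes to this strategy.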
“…6 In dMRI, ML has been used, for example, to bridge the gap between data-hungry imaging techniques and clinically feasible scans, for example by reconstructing super-resolved maps from low spatial resolution data, 7,8 or by estimating advanced diffusion-based metrics from sparse q-space acquisitions. [9][10][11] Most ML methods used in qMRI are "supervised," i.e., rely on learning patterns from large training data sets of known corresponding inputs and outputs. A key issue with supervised ML is that in the absence of balanced training data, ML models may learn biased mappings.…”
Section: Introduction (confidence: 99%)
“…124 In the field of diffusion analysis, DNNs were used as a function that received under-sampled q-space data as the input, and produced neurite orientation dispersion and density imaging (NODDI) and generalized fractional anisotropy (GFA) parameters as the outputs; 125 and a function that received diffusion-weighted images as the inputs, and produced fractional anisotropy (FA) and mean diffusivities (MD) parameters as the outputs. 126…”
Section: Parameter Mapping (confidence: 99%)
“…The most popular application of such datasets has been within a supervised learning framework [1,3,9,12,[14][15][16]. This approach trains DNNs to predict groundtruth generative parameters from noisy qMRI signals.…”
Section: Introduction (confidence: 99%)
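The supervised setup described in this last excerpt, training a network to predict ground-truth generative parameters from noisy simulated qMRI signals, can be illustrated with a deliberately simple stand-in in which an ordinary least-squares fit plays the role of the DNN. The b-values, noise level, and diffusivity range here are assumptions made purely for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated training set: known generative parameters and noisy signals.
n = 2000
bvals = np.array([0.0, 500.0, 1000.0, 2000.0])        # assumed b-values
d_true = rng.uniform(0.5e-3, 3e-3, size=n)            # ground-truth diffusivities
signals = np.exp(-np.outer(d_true, bvals))            # noiseless decay curves
signals += rng.normal(0.0, 0.02, size=signals.shape)  # additive Gaussian noise

# "Training": fit a linear map from noisy signals to the true parameter.
w, *_ = np.linalg.lstsq(signals, d_true, rcond=None)

# "Inference": predict the parameter from signals alone.
d_pred = signals @ w
```

A real implementation would replace the linear map with a DNN, but the data-flow is the same: the loss is computed against the ground-truth generative parameters, not against a conventional fit.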