2019
DOI: 10.1007/978-3-030-32695-1_7

Deep Transfer Learning for Whole-Brain FMRI Analyses

Abstract: The application of deep learning (DL) models to the decoding of cognitive states from whole-brain functional Magnetic Resonance Imaging (fMRI) data is often hindered by the small sample size and high dimensionality of these datasets. This is especially true in clinical settings, where patient data are scarce. In this work, we demonstrate that transfer learning represents a solution to this problem. In particular, we show that a DL model, which has been previously trained on a large openly available fMRI dataset of the Huma…


Cited by 19 publications (18 citation statements); references 14 publications.
“…The second most popular strategy for applying transfer learning was fine-tuning certain parameters in a pretrained CNN [34, 127-146]. The remaining approaches first optimized a feature extractor (typically a CNN or an SVM) and then trained a separate model (SVMs [30, 45, 147-149], long short-term memory networks [150, 151], clustering methods [148, 152], random forests [70, 153], multilayer perceptrons [154], logistic regression [148], elastic net [155], CNNs [156]). Additionally, Yang et al. [157] ensembled CNNs and fine-tuned their individual contributions.…”
Section: Results (mentioning)
confidence: 99%
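The "feature extractor plus separate model" strategy described above can be sketched numerically. Everything below is illustrative and not from the paper: the frozen "pretrained" body is mocked as a fixed random projection rather than a real CNN, the data are synthetic, and the separate head is a hand-rolled logistic regression.

```python
import numpy as np

# Hypothetical sketch of the "feature extractor + separate model" strategy:
# a frozen, pretrained network body (mocked here as a fixed random projection
# with a ReLU) produces features, and a small logistic-regression head is the
# only part trained on the scarce target data.

rng = np.random.default_rng(0)

# Mock "pretrained" feature extractor: frozen weights, never updated.
W_pre = rng.normal(size=(64, 16))          # stands in for learned CNN filters

def extract_features(x):
    return np.maximum(x @ W_pre, 0.0)      # frozen forward pass (ReLU)

# Small labeled target dataset (e.g. a handful of clinical subjects).
X = rng.normal(size=(40, 64))
y = (X[:, 0] > 0).astype(float)            # toy binary "cognitive state"

F = extract_features(X)                    # features from the frozen body

# Train only the separate head: logistic regression via gradient descent.
w = np.zeros(F.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w)))     # predicted probabilities
    w -= 0.1 * F.T @ (p - y) / len(y)      # gradient of cross-entropy loss

acc = np.mean(((F @ w) > 0).astype(float) == y)   # training accuracy
```

Because only the 16-dimensional head is fit, the number of trained parameters stays far below the 64-dimensional input, which is the point of this strategy on small samples.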
“…Likewise, the training set size in the target domain could be task-dependent. Only a few studies investigated the application of transfer learning with different training set sizes [99, 101, 128-130, 135, 147, 150]. Among these, most articles reported that, with a sufficiently large training set, models trained from scratch achieved similar or even better results [99, 135] than those using transfer learning.…”
Section: Discussion (mentioning)
confidence: 99%
“…However, in the field of neuroimaging, collecting massive amounts of homogeneous data is infeasible, constraining researchers to work with small datasets. In such cases, transfer learning [30, 31, 55-57] is practically helpful to enable learning directly from data. Self-supervised learning has made significant progress in computer vision classification tasks [22, 58-61] and is equally applicable to deep convolutional and recurrent networks.…”
Section: (mentioning)
confidence: 99%
“…Transfer learning is a DL method that transfers knowledge from one domain (the source domain) to another (the target domain), so that better learning results can be achieved in the target domain. Since researchers in brain imaging deal with small datasets, transfer learning might be an approach that improves results [70-72].…”
Section: Transfer Learning (mentioning)
confidence: 99%
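The source-to-target knowledge transfer described above can be illustrated with a minimal warm-start sketch. This assumes, purely for illustration, a shared linear structure between the two domains; the dimensions, noise levels, and learning rates are arbitrary and not from any cited study.

```python
import numpy as np

# Minimal sketch of source -> target transfer: "pretrain" a linear model on
# abundant source data, then use those weights to warm-start (fine-tune)
# training on a much smaller, related target dataset.

rng = np.random.default_rng(1)
w_true = rng.normal(size=8)

# Source domain: plenty of data.
Xs = rng.normal(size=(1000, 8))
ys = Xs @ w_true + 0.1 * rng.normal(size=1000)

# Target domain: related task (slightly shifted weights), very little data.
Xt = rng.normal(size=(10, 8))
yt = Xt @ (w_true + 0.05 * rng.normal(size=8)) + 0.1 * rng.normal(size=10)

def fit(X, y, w_init, steps=200, lr=0.01):
    """Gradient descent on mean squared error, starting from w_init."""
    w = w_init.copy()
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y) / len(y)   # gradient of squared error
    return w

w_source = fit(Xs, ys, np.zeros(8))            # "pretraining" on source data
w_transfer = fit(Xt, yt, w_source)             # fine-tune on small target set
w_scratch = fit(Xt, yt, np.zeros(8))           # baseline: train from scratch

def mse(w):
    """Target-domain training error."""
    return np.mean((Xt @ w - yt) ** 2)
```

With only 10 target samples, the warm-started run begins near a good solution, while the from-scratch baseline must recover all 8 weights from the small set alone; this mirrors, in miniature, why transfer can help when target data are scarce.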