2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2017.7952326
A semi-supervised method for multi-subject FMRI functional alignment

Cited by 11 publications (10 citation statements). References 19 publications.
“…Hyperalignment (DHA) [27]; Semi-Supervised SRM (SS-SRM) [31]; and Local Discriminant Hyperalignment (LDHA) [10].…”
Section: Method
confidence: 99%
“…Like the original paper [27], we consider 3 hidden layers (C = 5) for DHA, the number of units in the intermediate layers is set to L × V, and the deep network is trained with a learning rate of η = 10⁻⁴. For SS-SRM, we also considered γ = 1.0 and α = 0.5 [31]. It is worth noting that this paper employs the BrainIAK library for running SRM and SS-SRM.…”
Section: Method
confidence: 99%
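
The hyperparameters quoted above (γ = 1.0, α = 0.5) correspond to the semi-supervised SRM implementation in the BrainIAK library that the citing paper mentions. The sketch below is a minimal illustration, assuming the SSSRM class in brainiak.funcalign.sssrm exposes gamma and alpha constructor arguments and a fit(X, y, Z) / predict(Z) interface; the subject count, array shapes, and labels are synthetic placeholders, not data from the cited papers.

    # Minimal sketch: SS-SRM with the quoted settings (gamma = 1.0, alpha = 0.5).
    # Assumes BrainIAK is installed; all data below are random placeholders.
    import numpy as np
    from brainiak.funcalign.sssrm import SSSRM

    n_subjects, n_voxels, n_align_trs, n_labeled = 5, 1000, 300, 120
    rng = np.random.RandomState(0)

    # X: unlabeled alignment data, one (voxels x timepoints) array per subject.
    X = [rng.randn(n_voxels, n_align_trs) for _ in range(n_subjects)]
    # Z: labeled data per subject; y: the corresponding category labels.
    Z = [rng.randn(n_voxels, n_labeled) for _ in range(n_subjects)]
    y = [rng.randint(0, 4, size=n_labeled) for _ in range(n_subjects)]

    model = SSSRM(n_iter=10, features=50, gamma=1.0, alpha=0.5)
    model.fit(X, y, Z)            # jointly learns the alignment and the classifier
    predicted = model.predict(Z)  # per-subject predicted labels
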
“…For example, a fast SRM implementation has been introduced for rapidly analyzing large datasets with reduced memory demands [43]. The robust SRM algorithm tolerates subject-specific outlying response elements [44], and the semi-supervised SRM capitalizes on categorical stimulus labels when available [45]. Finally, estimating the SRM from functional connectivity data rather than response time series circumvents the need for a single shared stimulus across subjects; connectivity SRM allows us to derive a single shared response space across different stimuli with a shared connectivity profile [46].…”
Section: Shared Response Model (SRM)
confidence: 99%
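
The SRM variants surveyed in this statement all extend the same baseline workflow: learn a per-subject basis from responses to a shared stimulus, then project data into a low-dimensional shared space. As a point of reference, a minimal sketch of that baseline with BrainIAK's srm.SRM class follows, assuming its standard fit/transform interface; the synthetic arrays stand in for a real shared-stimulus dataset.

    # Sketch of the baseline SRM workflow the variants above extend:
    # fit per-subject bases W_i on a shared stimulus, then map held-out
    # data into the k-dimensional shared response space.
    import numpy as np
    from brainiak.funcalign.srm import SRM

    n_subjects, n_voxels, n_trs, k = 5, 1000, 300, 50
    rng = np.random.RandomState(1)
    train = [rng.randn(n_voxels, n_trs) for _ in range(n_subjects)]  # shared stimulus
    test = [rng.randn(n_voxels, 100) for _ in range(n_subjects)]     # held-out runs

    srm = SRM(n_iter=10, features=k)
    srm.fit(train)                     # learns bases (srm.w_) and shared response (srm.s_)
    shared_test = srm.transform(test)  # per-subject data mapped into the shared space
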