2016
DOI: 10.1002/hbm.23268
Representing object categories by connections: Evidence from a multivariate connectivity pattern classification approach

Abstract: The representation of object categories is a classical question in cognitive neuroscience, and compelling evidence has identified specific brain regions showing preferential activation to categories of evolutionary significance. However, the potential contributions to category processing made by tuning of connectivity patterns are largely unknown. Adopting a continuous multicategory paradigm, we obtained whole-brain functional connectivity (FC) patterns for each of four categories (faces, scenes, animals and tools) …

Cited by 24 publications (35 citation statements)
References 52 publications (82 reference statements)
“…In our study, we found that the linear SVM classifier achieved the best performance in all classification processes, which suggested that the linear SVM was indeed a good classifier for fMRI data. This finding was consistent with the majority of previous studies (Craddock et al, 2009; Wang et al, 2016; Saccà et al, 2017). For example, using the linear SVM classifier, patients with depression were successfully distinguished from healthy volunteers (Craddock et al, 2009). One recent study achieved high accuracy in an object-category classification task using functional connections from task-related functional neuroimaging as features and an SVM as the classifier.…”
Section: Discussion (supporting)
confidence: 94%
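The linear-SVM-on-connectivity approach the quoted passage describes can be sketched as follows. This is a minimal illustration with synthetic stand-in data: the subject counts, feature dimension, and injected class signal are placeholders, not values from any of the cited studies.

```python
# Sketch: linear SVM classification of functional-connectivity features,
# as in the FC-based decoding studies quoted above. Data are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_features = 40, 100            # e.g. 100 FC edges per subject
X = rng.standard_normal((n_subjects, n_features))
y = np.repeat([0, 1], n_subjects // 2)      # two groups (e.g. patients vs controls)
X[y == 1, :10] += 1.0                       # inject a weak class-dependent signal

clf = SVC(kernel="linear", C=1.0)           # linear kernel, as in the quoted work
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
print(round(scores.mean(), 2))
```

Cross-validated accuracy, rather than a single train/test split, is the standard way such studies report classifier performance.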
“…Using multivariate pattern analysis and machine learning, we found that emotion information could be successfully decoded from the whole-brain FC patterns. Our results add to the recent FC-based decoding studies, which found that functional connectivity patterns could be used to discriminate different populations [14], task or mental states, and different object categories [13], and further highlight the role of whole-brain functional connectivity patterns in emotion perception. Overall, our results provide new evidence that large-scale functional connectivity patterns also contain rich emotion information and effectively contribute to the recognition of emotions.…”
Section: Discussion (supporting)
confidence: 64%
“…Then a component-based (CompCor) strategy was used to remove the non-neural confounders. The data were temporally filtered with a 0.01–0.1 Hz band-pass filter previously used for task-related connectivity analysis [13]. We conducted ROI-to-ROI analysis to assess pairwise correlations between ROIs. Finally, we obtained 3 connection matrices (112 × 112) for each participant, one per emotion.…”
Section: Functional Connectivity Estimation (mentioning)
confidence: 99%
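The ROI-to-ROI step described above can be sketched as follows: pairwise Pearson correlations between ROI time series yield a 112 × 112 connectivity matrix, whose upper triangle is the usual feature vector for classification. The time-series length and random data here are illustrative assumptions, not the study's.

```python
# Sketch: build a ROI-to-ROI functional connectivity matrix and
# vectorize its unique edges. Synthetic time series stand in for
# preprocessed (CompCor-denoised, band-passed) fMRI data.
import numpy as np

n_rois, n_timepoints = 112, 200
rng = np.random.default_rng(1)
ts = rng.standard_normal((n_timepoints, n_rois))  # one time series per ROI

fc = np.corrcoef(ts, rowvar=False)    # 112 x 112 pairwise Pearson correlations
iu = np.triu_indices(n_rois, k=1)     # indices of unique pairs above the diagonal
features = fc[iu]                     # 112*111/2 = 6216 edge values per matrix

print(fc.shape, features.shape)
```

Taking only the upper triangle avoids duplicating the symmetric matrix's edges and drops the uninformative unit diagonal.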
“…CAFPS was established in a similar way to the International Affective Picture System, except that the models were all Chinese. CAFPS has been widely used in previous research (Duan et al, 2010; Luo et al, 2010; Zhang et al, 2015; Wang et al, 2016). There are four male and four female faces in each category.…”
Section: Methods (mentioning)
confidence: 99%