2019
DOI: 10.1109/jbhi.2018.2882885
Identifying Brain Networks at Multiple Time Scales via Deep Recurrent Neural Network

Cited by 20 publications (15 citation statements)
References 71 publications
“…In [32], an RNN-based brain state recognition model is proposed to effectively separate different brain states. In [33], we proposed a deep recurrent neural network (DRNN) model to identify group-wise task-related functional brain networks and brain response patterns in tfMRI data. Although this group-wise activation detection framework is limited in characterizing individual-specific brain networks, the identified brain response patterns are diverse and meaningful [33].…”
Section: Introduction
confidence: 99%
“…In [33], we proposed a deep recurrent neural network (DRNN) model to identify group-wise task-related functional brain networks and brain response patterns in tfMRI data. Although this group-wise activation detection framework is limited in characterizing individual-specific brain networks, the identified brain response patterns are diverse and meaningful [33]. These response patterns naturally account for the temporal dependencies of the input stimulus and reflect possible variations of the brain's response to it.…”
Section: Introduction
confidence: 99%
“…Recurrent neural networks (RNNs) have been extensively employed for time series modeling in recent years, owing to their ability to carry information from arbitrarily long contexts, to transfer information selectively across time steps, and to scale affordably compared with stochastic models and feed-forward networks [41]-[44]. RNNs are efficient at modeling temporal contexts in time series data and have recently been used for many biomedical prediction tasks of varying complexity [45]-[48]; nevertheless, training RNNs on raw signals is extremely hard to optimize because error signals must propagate through a huge number of time steps [49], [50]. To overcome this, convolutional neural networks (CNNs) have been used to perceive short contexts and produce more abstract features before feeding them into RNNs, which capture the longer temporal contexts [49].…”
confidence: 99%
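The CNN-before-RNN idea in the statement above can be illustrated with a minimal sketch: a 1-D convolution with a stride compresses short local contexts into fewer, more abstract features, and a simple recurrent cell then carries information across those features over a much shorter sequence. This is a toy illustration in plain Python with hand-picked weights, not the cited authors' architecture; the kernel, stride, and scalar RNN weights here are arbitrary assumptions for demonstration.

```python
import math

def conv1d(signal, kernel, stride=1):
    """Valid 1-D convolution (cross-correlation): each output summarizes
    a short local context of the raw signal; stride > 1 shortens the sequence."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(0, len(signal) - k + 1, stride)]

def rnn_forward(inputs, w_in, w_rec, b):
    """Scalar tanh RNN over the conv features: the hidden state h carries
    context across the (now shorter) sequence of abstracted features."""
    h = 0.0
    states = []
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h + b)
        states.append(h)
    return states

# Toy raw 1-D time series (32 samples).
signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5] * 4

# CNN stage: smoothing kernel with stride 2 halves the sequence length,
# so the RNN sees 15 abstract features instead of 32 raw samples.
features = conv1d(signal, kernel=[0.25, 0.5, 0.25], stride=2)

# RNN stage: long-range temporal context over the compressed features.
states = rnn_forward(features, w_in=1.0, w_rec=0.5, b=0.0)

print(len(signal), len(features), len(states))
```

Because error signals in training would only need to flow through the 15 recurrent steps rather than all 32 raw time steps, this composition eases the optimization difficulty the statement describes; in practice the same structure appears as stacked `Conv1d` layers followed by an LSTM/GRU.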