2013 IEEE International Symposium on Circuits and Systems (ISCAS2013) 2013
DOI: 10.1109/iscas.2013.6571983
Reconfigurable biological signal co-processor for feature extraction dedicated to implantable biomedical microsystems

Cited by 4 publications (3 citation statements). References 10 publications.
“…Notably, the structures of the fast-path layer and the slow-path layer mainly comprise four components: the LSTM module, the self-attention module, the dense layer module, and the feature extraction module. In particular, the circuit implementation of the dense layer module and the feature extraction module is relatively simple and easy to achieve [19,20]. The specific implementation scheme can be found in [19,20], and we will not repeat the details here.…”
Section: Memristor-based Hierarchical Attention Network For Affective...mentioning
confidence: 99%
“…In particular, the circuit implementation of the dense layer module and the feature extraction module is relatively simple and easy to achieve [19,20]. The specific implementation scheme can be found in [19,20], and we will not repeat the details here. This section mainly focuses on the circuit design of the LSTM module and the self-attention module, since the related research is insufficient and challenging.…”
Section: Memristor-based Hierarchical Attention Network For Affective...mentioning
confidence: 99%
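The excerpts above describe the dense layer module as the simplest component to realize in circuitry; in memristor-based designs such a layer is typically a matrix–vector multiply, with weights encoded as crossbar conductances. The following is a minimal numerical sketch of the underlying computation only (not the analog circuit, and not the cited papers' actual implementation); the function name and toy values are illustrative assumptions.

```python
import numpy as np

def dense_layer(x, W, b):
    """Dense (fully connected) layer: y = W @ x + b.
    In a memristor crossbar, W would be stored as conductances and
    the multiply-accumulate performed by Ohm's and Kirchhoff's laws."""
    return W @ x + b

# Toy example with a 2-input, 2-output layer
x = np.array([1.0, 2.0])            # input activations (e.g., extracted features)
W = np.array([[0.5, -0.5],
              [1.0,  0.0]])         # weight matrix
b = np.array([0.1, 0.2])            # bias vector
y = dense_layer(x, W, b)            # -> array([-0.4, 1.2])
```

This software view makes clear why the module is considered easy to map to hardware: the whole layer reduces to one multiply-accumulate pass plus a bias addition.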