2016
DOI: 10.1007/978-3-319-46681-1_48
A Deep Neural Network Architecture Using Dimensionality Reduction with Sparse Matrices

Cited by 4 publications (1 citation statement)
References 10 publications
“…One challenge regards how to implement such algorithms in a fast online form. In this respect, we note that the mathematical functions we used here are suitable to be implemented in programmable hardware and can be even approximated by using artificial neural networks (Hinton and Salakhutdinov, 2006; Matsumoto et al., 2016), suggesting that they could be in principle implementable in adaptive and low-power consumption devices such as neuromorphic circuits (Chicca et al., 2014; Qiao et al., 2015; Boi et al., 2016). Another technological challenge regards how to track down online the temporal dynamics of pre-stimulus activity, which we showed to be useful to increase state-dependent information.…”
Section: Discussion
Confidence: 99%
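The citation statement's claim that such mathematical functions "can be even approximated by using artificial neural networks" can be illustrated with a minimal sketch. This is not the method of the cited works; it is a generic example, and every hyperparameter (hidden width, learning rate, iteration count, the target function sin(x)) is an illustrative assumption: a one-hidden-layer tanh network fitted by full-batch gradient descent.

```python
# Illustrative sketch only: a one-hidden-layer tanh network approximating
# sin(x) via full-batch gradient descent. All hyperparameters (hidden size,
# learning rate, step count) are assumptions, not taken from any cited work.
import numpy as np

rng = np.random.default_rng(0)

# Training data: the target function sampled on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

# Network parameters: 1 input -> 16 hidden (tanh) -> 1 output.
W1 = rng.normal(0.0, 0.5, (1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1))
b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    # Backward pass for mean-squared-error loss.
    d_yhat = 2.0 * (y_hat - y) / len(X)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    dh = d_yhat @ W2.T
    dz = dh * (1.0 - h ** 2)          # derivative of tanh
    dW1 = X.T @ dz
    db1 = dz.sum(axis=0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

final_mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {final_mse:.4f}")
```

After training, the network's output tracks sin(x) closely over the sampled interval; the same construction extends to other smooth functions, which is the universal-approximation property the quoted passage appeals to.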