2015
DOI: 10.1016/j.procs.2015.07.313
Robust Discovery of Temporal Structure in Multi-neuron Recordings Using Hopfield Networks

Abstract: We present here a novel method for the classical task of extracting reoccurring spatiotemporal patterns from spiking activity of large populations of neurons. In contrast to previous studies that mainly focus on synchrony detection or exactly recurring binary patterns, we perform the search in an approximate way that clusters together nearby, noisy network states in the data. Our approach is to use minimum probability flow (MPF) parameter estimation to deterministically fit very large Hopfield networks on wind…

Cited by 4 publications (7 citation statements)
References 40 publications (21 reference statements)
“…While approaches like frequent itemset mining and related methods [48–51] can find more patterns than the number of neurons and provide a rigorous statistical framework, they require that exact matches of the same pattern occur, which becomes less and less probable as the number of neurons grows or as the time bins become smaller (problem of combinatorial explosion). To address this problem, [52, 53] proposed another promising unsupervised method based on spin glass Ising models that allows for approximate pattern matching while not being linearly limited in the number of patterns; this method however requires binning, and rather provides a method for classifying the binary network state vector in a small temporal neighborhood, while not dissociating rate patterns from temporal patterns.…”
Section: Introduction
Mentioning confidence: 99%
“…While approaches like frequent itemset mining and related methods (Grun, Diesmann, and Aertsen, 2002; Picado-Muino et al., 2013; Pipa et al., 2008; Torre et al., 2013) can find more patterns than the number of neurons and provide a rigorous statistical framework, they require that exact matches of the same pattern occur, which becomes less and less probable as the number of neurons grows or as the time bins become smaller (problem of combinatorial explosion). To address this problem, Effenberger and Hillar (2015) and Hillar and Effenberger (2015) proposed another promising unsupervised method based on spin glass Ising models that allows for approximate pattern matching while not being linearly limited in the number of patterns; this method however requires binning, and rather provides a method for classifying the binary network state vector in a small temporal neighbourhood, while not dissociating rate patterns from temporal patterns.…”
Section: Introduction
Mentioning confidence: 99%
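The combinatorial-explosion point in the statements above can be illustrated with a small numerical sketch (a hypothetical toy, not data or code from the cited papers): with exact-match counting, every distinct binary word over the neurons is its own pattern, so repeats become rare as the population grows, whereas matching within a Hamming ball groups noisy variants of one underlying state together.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: binned spiking of n_neurons neurons as binary words.
n_neurons, n_bins = 50, 2000
rates = rng.uniform(0.01, 0.1, size=n_neurons)      # per-bin firing probabilities
spikes = (rng.random((n_bins, n_neurons)) < rates).astype(np.uint8)

# Exact-match counting (the frequent-itemset regime): each distinct binary
# word counts as a separate pattern, so exact repeats get scarce as n grows.
words, counts = np.unique(spikes, axis=0, return_counts=True)
print("distinct exact patterns:", len(words), "of", n_bins, "windows")

# Approximate matching instead: pool all words within a small Hamming
# distance of a reference word, clustering nearby noisy network states.
def hamming_ball_count(words, counts, center, radius=2):
    d = (words != center).sum(axis=1)               # Hamming distance per word
    return counts[d <= radius].sum()

print("windows within Hamming radius 2 of the first word:",
      hamming_ball_count(words, counts, words[0]))
```

The approximate count is always at least the exact count for the same reference word, which is the sense in which clustering nearby states recovers patterns that exact matching misses.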
“…With the appropriate discretization, a natural signal ensemble can be studied by learning a Hopfield network; for instance, in the pursuit of image compression [29,30], perceptual metrics [31], or rate-distortion analyses [32,33]. These networks and their memories can also be used to understand data from neuroscience experiments [34,35]. In particular, it is possible to uncover reoccurring spatiotemporal activity patterns in spontaneous neural activity.…”
Section: Natural Signal Modeling
Mentioning confidence: 99%
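To make the attractor idea behind these applications concrete, here is a minimal classical Hopfield sketch (a textbook Hebbian toy under assumed sizes and seeds, not the paper's MPF-fitted model): a few ±1 patterns are stored, and a noisy cue relaxes back to the stored memory.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
patterns = rng.choice([-1, 1], size=(3, n))   # three stored +/-1 patterns

W = (patterns.T @ patterns) / n               # Hebbian weights
np.fill_diagonal(W, 0)                        # no self-connections

def recall(x, steps=20):
    # Synchronous sign updates; ties (exact zeros) are broken toward +1.
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1
    return x

cue = patterns[0].copy()
flip = rng.choice(n, size=5, replace=False)
cue[flip] *= -1                               # corrupt 5 of 64 units
recovered = recall(cue)
print("bits matching stored pattern:", int((recovered == patterns[0]).sum()))
```

With only 3 patterns in 64 units the network is far below the classical Hebbian capacity, so the corrupted cue reliably falls back into the stored attractor; the methods discussed here use the same dynamics but learn the weights from data rather than from a Hebb rule.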
“…There are several ways to motivate minimizing energy flow (7) to fit networks. Aside from several experimental [21, 30–35] and theoretical [19] works detailing its utility and properties, a direct explanation is that provably making EF small forces X to be attractors of the network (if they can be). It should be somewhat surprising that minimizing (7) forces nonlinear identities (4) of the dynamics.…”
Section: Definition 1 (Energy Flow)
Mentioning confidence: 99%
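A sketch of one plausible reading of such an energy-flow objective for a Hopfield network (the symbols, the {0,1} coding, and the Hebbian comparison weights are my assumptions, not the cited equation (7) verbatim): with energy E(x) = −½ xᵀWx + θᵀx over binary states, the flow sums exp((E(x) − E(x′))/2) over all single-bit-flip neighbors x′ of each data state x, so a small value means each data state sits in its own energy well, i.e. is an attractor.

```python
import numpy as np

def energy(W, theta, x):
    # Hopfield energy over binary states x in {0,1}^n (assumed convention).
    return -0.5 * x @ W @ x + theta @ x

def energy_flow(W, theta, X):
    # Sum exp((E(x) - E(x')) / 2) over all single-bit-flip neighbors x'
    # of every data state x; small flow <=> data states are local minima.
    total = 0.0
    for x in X:
        for i in range(len(x)):
            y = x.copy()
            y[i] = 1 - y[i]
            total += np.exp((energy(W, theta, x) - energy(W, theta, y)) / 2)
    return total

rng = np.random.default_rng(2)
n = 32
X = rng.integers(0, 2, size=(3, n)).astype(float)

theta0 = np.zeros(n)
W0 = np.zeros((n, n))                 # flat landscape: every flip term is 1

# A crude Hebbian-style guess at W, re-centered so the {0,1} states behave
# like +/-1 spins (hypothetical comparison weights, not a fitted model).
S = 2 * X - 1
W1 = (S.T @ S) / n
np.fill_diagonal(W1, 0)
theta1 = 0.5 * W1.sum(axis=1)

print("flow, zero weights:   ", energy_flow(W0, theta0, X))
print("flow, Hebbian weights:", energy_flow(W1, theta1, X))
```

With zero weights every neighbor has equal energy and each of the |X|·n terms contributes exactly 1; weights that carve wells around the data states push every term below 1, which is the sense in which "making EF small forces X to be attractors."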