2021
DOI: 10.1038/s42003-021-02437-y

Neocortical inhibitory interneuron subtypes are differentially attuned to synchrony- and rate-coded information

Abstract: Neurons can carry information with both the synchrony and rate of their spikes. However, it is unknown whether distinct subtypes of neurons are more sensitive to information carried by synchrony versus rate, or vice versa. Here, we address this question using patterned optical stimulation in slices of somatosensory cortex from mouse lines labelling fast-spiking (FS) and regular-spiking (RS) interneurons. We used optical stimulation in layer 2/3 to encode a 1-bit signal using either the synchrony or rate of act…

Cited by 5 publications (11 citation statements)
References 58 publications
“…This pattern is evident in both seed-based correlation analyses and graph theory metrics, where SOM cells show a higher CPL and transitivity, as well as a lower global efficiency, indicating less cortex-wide communication and more segregation, than SLC, PV or VIP cells. A distinguishing phenotype of different cell populations is their spike frequency profile (Prince et al., 2021). Although the characteristic high-frequency spiking of interneurons (30–50 Hz) cannot be directly captured with our 10 Hz sampling rate, it has been suggested that lower frequency bands may still reflect some high-frequency contributions (Ali and Kwan, 2019).…”
Section: Discussion (mentioning)
confidence: 99%
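The graph-theory metrics named in the statement above (characteristic path length, transitivity, global efficiency) can be illustrated on a toy undirected graph. This is a hedged sketch in plain Python, using breadth-first search for shortest paths; the node labels and edges are invented for illustration and are not data from the cited work.

```python
from collections import deque
from itertools import combinations

# Toy undirected graph; nodes stand in for recorded regions.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (3, 4), (4, 5), (2, 5)]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def shortest_path_lengths(src):
    """Breadth-first search: hop counts from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

pairs = list(combinations(nodes, 2))
dists = [shortest_path_lengths(u)[v] for u, v in pairs]

cpl = sum(dists) / len(dists)                  # characteristic path length (CPL)
geff = sum(1 / d for d in dists) / len(dists)  # global efficiency

# Transitivity: 3 * (number of triangles) / (number of connected triples).
triangles = sum(1 for u, v, w in combinations(nodes, 3)
                if v in adj[u] and w in adj[u] and w in adj[v])
triples = sum(len(adj[n]) * (len(adj[n]) - 1) // 2 for n in nodes)
trans = 3 * triangles / triples if triples else 0.0

print(f"CPL={cpl:.3f} transitivity={trans:.3f} efficiency={geff:.3f}")
```

Higher CPL and lower global efficiency correspond to the "less communication, more segregation" reading quoted above: paths between regions are longer, so inverse path lengths (efficiency) shrink.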
“…This pattern is evident in both seed-based correlation analyses and graph theory metrics, where SOM cells show a higher CPL and transitivity, as well as a lower global efficiency, than SLC, PV or VIP cells. A distinguishing phenotype of different cell populations is their spike frequency profile [73]. Although the characteristic high-frequency spiking of interneurons (30–50 Hz) cannot be directly captured with our 10 Hz sampling rate, it has been suggested that lower frequency bands may still reflect some high-frequency contributions [74].…”
Section: Discussion (mentioning)
confidence: 99%
“…Mutual information can also be understood as the amount of information provided by one signal about another, a formulation that is frequently applied to the quantification and modeling of coherence between neural signals, from unit spiking to EEG. [3][4][5][6] The basic mathematical details of entropy, joint entropy, and mutual information are reviewed in the appendix, accompanied by a valuable and extensible graphical analogy.…”
Section: Entropy Calculations and Derived Metrics (mentioning)
confidence: 99%
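The formulation quoted above, mutual information as the amount of information one signal provides about another, can be sketched from first principles via the identity I(X;Y) = H(X) + H(Y) − H(X,Y). The joint distribution below is a toy example of two correlated binary signals, invented for illustration; it is not data from the cited studies.

```python
from math import log2

# Toy joint distribution p(x, y) of two correlated binary signals.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# I(X;Y) = H(X) + H(Y) - H(X,Y): shared information between the signals.
mi = entropy(px) + entropy(py) - entropy(joint)
print(f"I(X;Y) = {mi:.3f} bits")
```

With independent signals the joint entropy would equal H(X) + H(Y) and the mutual information would be zero; the correlation in the toy distribution is what makes I(X;Y) positive.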
“…1 Shannon established the system-agnostic paradigm for analyzing the transmission of information that now underpins all modern digital communications and machine learning 2 and that has since been applied to investigate biologic information encoding in neuroscience and anesthesia. [3][4][5][6][7][8] Shannon's conceptualization of "entropy" as a measure of the overall information content of a given signal has been particularly useful in the investigation of communication within nervous systems. Entropy measures the information content of an individual signal, but further metrics derive naturally to quantify the information transfer between two signals.…”
(mentioning)
confidence: 99%
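The quoted description of entropy as "the information content of an individual signal" can be made concrete for the simplest case, a binary signal. This is an illustrative sketch of the binary entropy function H(p), not an analysis from the cited work.

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): entropy in bits of a binary
    signal that takes one of its two values with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a fully predictable signal carries no information
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Entropy peaks at p = 0.5 (a fair coin, 1 bit) and vanishes at p = 0 or 1.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  H = {binary_entropy(p):.3f} bits")
```

This is the sense in which entropy quantifies a signal's information content: the less predictable each sample, the more bits each sample conveys.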