Separating normosmic and anosmic patients based on entropy evaluation of olfactory event-related potentials
2019 · DOI: 10.1016/j.brainres.2018.12.012

Cited by 17 publications (10 citation statements) · References 34 publications
“…Because of its general applicability, information theory, in particular, has been widely used in neuroscience [812]–[817]. For instance, these mathematical tools have been used extensively for analyzing data from electroencephalography (EEG), magnetoencephalography (MEG), and functional magnetic resonance imaging (fMRI) [818]–[820]. These analyses, in turn, quantify phenomena such as encoding (e.g., how much information a neuron provides [821], [822]), complex encoding relationships [823]–[825], [817], neural connectivity [826]–[829], and sensory encoding [821], [830]–[832].…”
Section: New Aspects in Information Processing
confidence: 99%
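The encoding analyses mentioned in this excerpt typically rest on estimating mutual information between stimulus and response. As a minimal sketch of such an estimate (the function name and toy data are illustrative assumptions, not taken from the cited works), a plug-in estimator over paired discrete samples looks like this:

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in estimate of I(S;R) in bits from paired discrete samples,
    quantifying how much information a response carries about a stimulus."""
    s_vals, s_idx = np.unique(stim, return_inverse=True)
    r_vals, r_idx = np.unique(resp, return_inverse=True)
    # Build the joint histogram of (stimulus, response) pairs.
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)  # marginal P(S)
    pr = joint.sum(axis=0, keepdims=True)  # marginal P(R)
    nz = joint > 0                         # skip zero cells (0·log 0 = 0)
    return np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz]))

# Deterministic coding: the response equals the stimulus, so I = H(S) = 1 bit.
stim = np.array([0, 1, 0, 1, 0, 1, 0, 1])
print(mutual_information(stim, stim))  # → 1.0
```

Note that this plug-in estimator is biased upward for small sample sizes, which is one reason the cited literature develops more careful estimation techniques.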
“…The key issue would be to achieve a proper signal-to-noise ratio with a reduced number of trials. The literature suggests ERP analysis techniques based on single-epoch analysis [63, 65, 99], improvement of the grand average with a priori knowledge [62, 66], time–frequency analysis (i.e., wavelet) techniques [100, 101], and Shannon entropy [102].…”
Section: Challenges and Future Goals
confidence: 99%
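The Shannon-entropy approach cited above can be sketched for a single epoch: bin the amplitude values and compute the entropy of the resulting histogram. This is a minimal sketch, not the cited authors' pipeline; the bin count and the toy epoch are illustrative assumptions:

```python
import numpy as np

def shannon_entropy(signal, n_bins=32):
    """Shannon entropy (bits) of the amplitude histogram of one epoch."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
epoch = rng.standard_normal(500)  # hypothetical single EEG epoch
print(round(shannon_entropy(epoch), 3))
```

A flat (constant) epoch yields zero entropy, while a maximally spread epoch approaches log2(n_bins) bits, so the measure indexes how structured the single-trial response is.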
“…w = r_min^max · normrnd() + σ · randm()   (16)
where r_min^max represents the maximum and minimum of the randomly selected weights, normrnd() is the random number, and σ is the bias. The random weight obeys a Gaussian distribution, namely w ∼ N(θ, σ).…”
Section: Improved CS
confidence: 99%
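Read this way, the weight draw amounts to scaling a standard-normal sample and adding a bias-scaled perturbation, so the result stays Gaussian. A minimal sketch under that interpretation (the scalar `r_range` standing in for r_min^max, and the function name, are assumptions, not the cited authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_weight(r_range, sigma, n=1):
    """Hypothetical reading of the weight rule: a range-scaled
    standard-normal draw plus a bias-scaled standard-normal draw.
    The sum of two independent Gaussians is itself Gaussian."""
    return r_range * rng.standard_normal(n) + sigma * rng.standard_normal(n)

ws = random_weight(r_range=1.0, sigma=0.1, n=100_000)
print(ws.mean(), ws.std())  # mean near 0, std near sqrt(1 + 0.01)
```

Because both terms are zero-mean Gaussians, the sample mean hovers near zero and the standard deviation near sqrt(r_range² + σ²), consistent with the stated w ∼ N(θ, σ) form.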
“…First of all, the entropy weight method is used to determine the index weights, so as to quantify the estimation and reduce the deviation caused by human factors [16]. SPA is able to describe the internal correlation via the connection degree [17].…”
Section: Introduction
confidence: 99%
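The entropy weight method cited in [16] derives each index weight objectively from the dispersion of its column in the decision matrix: columns whose values diverge more across alternatives carry lower entropy and therefore receive larger weights. A minimal sketch, assuming a positive, benefit-type decision matrix (the toy matrix is illustrative):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: objective index weights from a decision
    matrix X (rows = alternatives, columns = indices), assuming all
    entries are positive."""
    P = X / X.sum(axis=0)  # column-wise proportions
    n = X.shape[0]
    # Normalized information entropy per index (0 * log 0 treated as 0).
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -np.nansum(P * np.log(P), axis=0) / np.log(n)
    d = 1 - E              # degree of divergence of each index
    return d / d.sum()     # weights, normalized to sum to 1

X = np.array([[0.8, 30.0, 5.0],
              [0.6, 45.0, 7.0],
              [0.9, 25.0, 6.0]])
w = entropy_weights(X)
print(w, w.sum())
```

Since the weights come only from the data's dispersion, no expert scoring enters the calculation, which is what "reducing the deviation caused by human factors" refers to.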