DeepReI: Deep learning-based gas chromatographic retention index predictor (2021)
DOI: 10.1016/j.aca.2020.12.043

Cited by 29 publications (27 citation statements); references 23 publications.
“…For classification-oriented deep neural networks, a softmax layer and a classification layer must follow the last FCL. Also, batch normalization (Vrzal et al., 2021) layers are embedded as assisting layers.…”
Section: Human Visual System and Attention Mechanism
confidence: 99%
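As a concrete illustration of the softmax layer this statement refers to, here is a minimal NumPy sketch; the logit values are hypothetical, standing in for the output of the last fully connected layer (FCL):

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical scores produced by the last FCL for three classes.
probs = softmax(np.array([2.0, 1.0, 0.1]))
# probs sums to 1 and the largest logit keeps the largest probability.
```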
“…The ℓ2-norm pooling (L2P), average pooling (AP) [14], and max pooling (MP) [15] produce the ℓ2-norm, average, and maximum values within the block B_{m1,m2}, respectively. Their formula can be written as below:…”
Section: Stochastic Pooling
confidence: 99%
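The three pooling operators quoted above can be sketched in NumPy on a single (hypothetical) 2×2 block of activations:

```python
import numpy as np

# Hypothetical pooling block B_{m1,m2} of non-negative activations.
block = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

l2p = np.sqrt(np.sum(block ** 2))  # L2P: l2-norm of the block
ap = np.mean(block)                # AP: average value of the block
mp = np.max(block)                 # MP: maximum value of the block
```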
“…In contrast, MP stores the greatest value but deteriorates the overfitting obstacle. In order to solve the above concerns, stochastic pooling (SP) [15] is introduced to provide a resolution to the drawbacks of AP and MP. SP is a four-step process.…”
Section: Stochastic Pooling
confidence: 99%
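The quote names stochastic pooling (SP) as a four-step process without listing the steps; a common formulation (normalize the block's activations to probabilities, then sample one) can be sketched as follows. The function name and the example block are illustrative, not from the cited work:

```python
import numpy as np

def stochastic_pool(block, rng):
    """One stochastic-pooling step over a single block:
    (1) keep the non-negative activations,
    (2) normalize them into a probability distribution,
    (3) sample an index from that distribution,
    (4) return the sampled activation."""
    a = np.maximum(block.ravel(), 0.0)       # step 1
    total = a.sum()
    if total == 0.0:                         # all-zero block: pool to 0
        return 0.0
    p = a / total                            # step 2
    idx = rng.choice(a.size, p=p)            # step 3
    return a[idx]                            # step 4

rng = np.random.default_rng(0)
out = stochastic_pool(np.array([[1.0, 2.0], [3.0, 4.0]]), rng)
```

Unlike AP, which blurs strong activations, and MP, which always keeps the maximum (and can overfit), SP keeps large activations likely but not certain to be selected.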
“…An important task is the development of versatile RI prediction models that are applicable to almost arbitrary structures. There are several works (e.g., the most recent works [17][18][19]) that are devoted to RI prediction for diverse compounds and use data sets ranging in size from hundreds to tens of thousands of compounds. Most of such works, except for the most recent ones, are extensively reviewed in our previous work [17].…”
Section: Introduction
confidence: 99%