2022
DOI: 10.3389/fphar.2022.1046524
Application of SMILES-based molecular generative model in new drug design

Cited by 3 publications (4 citation statements)
References 19 publications
“…SMILES strings have a wide range of applications in computer programs and databases, where they can be used for large-scale molecular analysis and processing, allowing researchers to process and analyze molecular data more efficiently. SMILES strings can also be used for visualization and editing of molecular structures, providing researchers with tools to intuitively understand and analyze molecular structures. …”
Section: Methods
Mentioning confidence: 99%
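The large-scale processing of SMILES strings mentioned above usually begins by splitting each string into chemically meaningful tokens. A minimal sketch of such a tokenizer is below; the regex and the `tokenize_smiles` helper are illustrative assumptions, not code from the cited paper, and cover only a common subset of SMILES symbols.

```python
import re

# Hypothetical minimal tokenizer: splits a SMILES string into atom, bond,
# ring-closure, and branch tokens -- a typical preprocessing step before
# feeding strings to a generative model. Covers a common subset of SMILES.
SMILES_TOKEN = re.compile(
    r"Cl|Br|\[[^\]]+\]"      # two-letter organic atoms, bracket atoms
    r"|[BCNOSPFI]|[bcnos]"   # single-letter aliphatic / aromatic atoms
    r"|[=#\-+()/\\]"         # bonds, charges, branches, stereo bonds
    r"|%\d{2}|\d"            # two-digit and single-digit ring closures
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into chemically meaningful tokens."""
    tokens = SMILES_TOKEN.findall(smiles)
    # Sanity check: reassembled tokens must reproduce the input exactly.
    assert "".join(tokens) == smiles, f"untokenizable characters in {smiles!r}"
    return tokens

print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin
```

Treating `Cl`, `Br`, and bracket atoms like `[NH+]` as single tokens (rather than individual characters) keeps the vocabulary chemically consistent, which is why character-level splitting alone is usually avoided.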
“…This capacity enables RNNs to capture dynamic temporal behaviors crucial for analyzing time-series data, genomic sequences, protein structures, and Simplified Molecular Input Line Entry System (SMILES) strings. [148] However, traditional RNNs struggle to learn long-term dependencies within data sequences because of the vanishing gradient problem, which complicates the retention of information over extended sequences. [117] To overcome this limitation, advancements in RNN architecture have been introduced, including Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRUs).…”
Section: Recurrent Neural Network
Mentioning confidence: 99%
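The vanishing gradient problem the statement refers to can be made concrete with a toy calculation (an illustrative sketch, not from the cited work): the gradient flowing back T time steps in a vanilla RNN is a product of T per-step Jacobian factors, so when those factors are below 1 in magnitude the gradient decays exponentially.

```python
# Illustrative sketch: gradient magnitude after backpropagating through T
# steps of a vanilla RNN, modeled as a product of T identical per-step
# factors. Factors < 1 make the product shrink exponentially -- the
# vanishing gradient problem that LSTMs and GRUs were designed to mitigate.

def backprop_factor(per_step: float, steps: int) -> float:
    """Magnitude of a gradient after flowing back `steps` time steps."""
    grad = 1.0
    for _ in range(steps):
        grad *= per_step  # each step multiplies in one Jacobian factor
    return grad

for T in (5, 50, 500):
    print(f"T={T:4d}  gradient magnitude ~ {backprop_factor(0.9, T):.3e}")
```

With a per-step factor of 0.9 the gradient is already negligible after a few hundred steps, which is roughly the length of many SMILES strings and genomic subsequences; factors above 1 produce the mirror-image exploding gradient problem.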
“…Unlike traditional ANNs, RNNs are equipped with loops in their architecture, allowing the network to retain information from one step and pass it to the next, facilitating the processing of variable‐length input sequences. This capacity enables RNNs to capture dynamic temporal behaviors crucial for analyzing time‐series data, genomic sequences, protein structures, and Simplified Molecular Input Line Entry System (SMILES) strings [148] …”
Section: Artificial Neural Network
Mentioning confidence: 99%
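The recurrent loop described above can be sketched in a few lines. This is a toy scalar model with hand-picked weights (assumptions for illustration, not a trained network): the hidden state `h` is updated from the previous `h` and the current input, so information from earlier steps is carried forward and variable-length sequences are handled naturally.

```python
import math

# Toy scalar RNN: one input weight, one recurrent weight, one bias.
# These values are arbitrary illustrations, not learned parameters.
W_X, W_H, BIAS = 0.5, 0.8, 0.1

def rnn_forward(sequence: list[float]) -> list[float]:
    """Return the hidden state after each step of the input sequence."""
    h = 0.0                      # initial hidden state
    states = []
    for x in sequence:
        # h_t depends on both the current input x_t and the previous h_{t-1},
        # which is the "loop" that lets the network retain past information.
        h = math.tanh(W_X * x + W_H * h + BIAS)
        states.append(h)
    return states

print(rnn_forward([1.0, 0.0, -1.0]))
```

Note that the same three weights are reused at every step; this weight sharing is what lets the loop accept sequences of any length without changing the architecture.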
“…[3] However, RNN-based models have certain drawbacks: owing to structural issues in the model itself, they are prone to vanishing and exploding gradients when facing large datasets, and the final output is highly correlated with the last few inputs. [4] We therefore focused on VAE-based models. VAE-based models reduce dimensionality to learn latent representations from the input data; they can therefore learn the probability distribution of the dataset and reproduce the input data as faithfully as possible.…”
Section: Introduction
Mentioning confidence: 99%
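The encode-sample-decode cycle a VAE uses to learn latent representations can be sketched in pure Python. Everything here is a toy stand-in (the `encode`, `reparameterize`, and `decode` helpers are hypothetical, with a 1-D latent and fixed variance); a real VAE replaces these with learned encoder and decoder networks trained to balance reconstruction against a KL regularizer.

```python
import math
import random

# Toy sketch of the VAE pipeline: encoder -> latent Gaussian -> sample via
# the reparameterization trick -> decoder reconstructs the input.

def encode(x: list[float]) -> tuple[float, float]:
    """Toy 'encoder': summarize the input as a 1-D latent mean/log-variance."""
    mu = sum(x) / len(x)   # latent mean: here just the input average
    log_var = -2.0         # fixed small variance, for illustration only
    return mu, log_var

def reparameterize(mu: float, log_var: float, rng: random.Random) -> float:
    """Sample z = mu + sigma * eps with eps ~ N(0, 1)."""
    eps = rng.gauss(0.0, 1.0)
    return mu + math.exp(0.5 * log_var) * eps

def decode(z: float, length: int) -> list[float]:
    """Toy 'decoder': expand the latent code back to the input's shape."""
    return [z] * length

rng = random.Random(0)
x = [0.9, 1.0, 1.1]
mu, log_var = encode(x)
z = reparameterize(mu, log_var, rng)
print(decode(z, len(x)))  # approximate reconstruction of x
```

Because sampling happens through the deterministic function of `mu`, `log_var`, and an external noise term `eps`, gradients can flow through `mu` and `log_var` during training; sampling z directly from the Gaussian would block them, which is the point of the reparameterization trick.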