2019 · DOI: 10.1007/s10822-019-00242-8
Focused Library Generator: case of Mdmx inhibitors

Abstract: We present a Focused Library Generator that is able to create new molecules with desired properties from scratch. After training the Generator on the ChEMBL database, transfer learning was used to switch the Generator to producing new Mdmx inhibitors, a promising class of anticancer drugs. Lilly medicinal chemistry filters, molecular docking, and a QSAR IC50 model were used to refine the output of the Generator. Pharmacophore screening and molecular dynamics (MD) simulations were then used to further …

Cited by 7 publications (8 citation statements)
References 63 publications (61 reference statements)
“…To validate the model, we sampled 500,000 ChEMBL-like SMILES (only 8,617 (1.7%) of them were canonical) from a generator [56] and checked how accurately the model can restore canonical SMILES for these molecules. We intentionally selected the generated SMILES keeping in mind possible applications of the proposed method in artificial-intelligence-driven pipelines of de novo drug development.…”
Section: SMILES Canonicalization Model
confidence: 99%
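The validation step quoted above, measuring what fraction of sampled SMILES strings are already in canonical form, can be sketched as follows. This is a generic illustration, not the citing authors' code: the `canonicalize` argument is a placeholder for a real toolkit round-trip, for example RDKit's `Chem.MolToSmiles(Chem.MolFromSmiles(s))`.

```python
def fraction_canonical(smiles_list, canonicalize):
    """Fraction of SMILES that are unchanged by canonicalization.

    `canonicalize` is any callable mapping a SMILES string to its
    canonical form (in practice a cheminformatics toolkit round-trip).
    A string counts as canonical if canonicalization leaves it as-is.
    """
    if not smiles_list:
        raise ValueError("empty SMILES list")
    already = sum(1 for s in smiles_list if canonicalize(s) == s)
    return already / len(smiles_list)
```

Applied to the 500,000 generated SMILES with a real canonicalizer, this would reproduce the reported 1.7% figure.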
“…To have a fast, robust, and explainable tool for its prediction and interpretation is highly desirable for both academia and industry. The Transformer-CNN model built on 1,311 compounds had the following statistics: q2 = 0.92 and RMSEp = 0.57 [64]. To demonstrate its interpretability, we chose haloperidol, a well-known antipsychotic drug with 14 mg/l water solubility [65].…”
Section: Aqueous Solubility
confidence: 99%
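For readers unfamiliar with the reported statistics, q2 and RMSE can be computed from observed versus predicted values as below. This is a sketch of the standard definitions (q2 as 1 − SSres/SStot over cross-validated predictions), not the cited Transformer-CNN implementation; the function name is illustrative.

```python
import math

def q2_and_rmse(y_true, y_pred):
    """Determination coefficient q2 and root-mean-square error.

    q2   = 1 - sum((y - yhat)^2) / sum((y - mean(y))^2)
    RMSE = sqrt(mean((y - yhat)^2))
    """
    n = len(y_true)
    mean_y = sum(y_true) / n
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)
```

With cross-validated predictions passed as `y_pred`, this yields q2 in the sense used by the citing authors; a perfect model gives q2 = 1 and RMSE = 0.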
“…Since 2016, SMILES-based machine-learned methods have been used to produce de novo molecules. These methods include Variational AutoEncoders (VAEs) [12], Recurrent Neural Networks (RNNs) [6, 13–15], Generative Adversarial Networks (GANs) [16], and reinforcement learning (RL) [17], or generate molecules based on a molecular graph representation [18], among many other approaches reviewed in [19]. Contrary to these earlier reports, we demonstrate herein that text learning on SMILES is highly efficient at exploring the training space with a high degree of novelty.…”
Section: Introduction
confidence: 78%
“…The proposed approach can also be used via transfer learning to generate compounds for specific scaffolds (see e.g. [14–17]).…”
Section: Results
confidence: 99%
“…More recently, applications involving Deep Learning models have gained attention due to their ability to extract important features from raw data and handle complex tasks [17], such as drug design. For instance, the Focused Library Generator designed by Xia et al. [18] was able to design new inhibitor molecules with desired properties from scratch. Stokes et al. [19] implemented Deep Learning for antibiotic prediction, which led to the discovery of a structurally distant antibacterial molecule.…”
Section: Introduction
confidence: 99%