2023
DOI: 10.1039/d3sc03613h

Data-driven discovery of innate immunomodulators via machine learning-guided high throughput screening

Yifeng Tang,
Jeremiah Y. Kim,
Carman K. M. Ip
et al.

Abstract: The innate immune response is vital for the success of prophylactic vaccines and immunotherapies. Control of signaling in innate immune pathways can improve prophylactic vaccines by inhibiting unfavorable systemic inflammation...

Cited by 3 publications (5 citation statements). References 68 publications (129 reference statements).
“…The training process first requires a unique representation of the molecules using SELFIES and optimization of a loss function to learn a continuous representation of these molecules within a low-dimensional latent space constituting the information bottleneck layer between the encoder and the decoder. This latent space provides a smooth and low-dimensional representation of the molecular design space that is well suited to the construction of surrogate structure–property models and enables a Bayesian-guided traversal and optimization of molecules within the design space. We provide a brief description of the VAE model construction and training in the Supporting Information; full details are available in Tang et al…”
Section: Methods
confidence: 99%
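The first step the excerpt describes, turning each molecule into a unique token sequence before it reaches the VAE encoder, can be sketched as below. This is an illustrative stand-in, not the paper's code: the tokenizer only approximates the SELFIES grammar by splitting on bracketed tokens, and the toy corpus and vocabulary are invented for the example.

```python
import re

def tokenize_selfies(s):
    """Split a SELFIES string into its bracketed tokens, e.g. '[C][O]' -> ['[C]', '[O]']."""
    tokens = re.findall(r"\[[^\]]*\]", s)
    # Sanity check: the tokens must reconstruct the input exactly.
    assert "".join(tokens) == s, f"malformed SELFIES string: {s!r}"
    return tokens

def build_vocab(strings):
    """Map every token seen in the corpus to an integer index (the encoder's input alphabet)."""
    vocab = {}
    for s in strings:
        for tok in tokenize_selfies(s):
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(s, vocab):
    """Integer-encode one molecule for the VAE encoder's embedding layer."""
    return [vocab[tok] for tok in tokenize_selfies(s)]

# Toy corpus: ethanol and acetic acid written as SELFIES.
corpus = ["[C][C][O]", "[C][C][=Branch1][C][=O][O]"]
vocab = build_vocab(corpus)
print(encode("[C][C][O]", vocab))  # -> [0, 0, 1]
```

In practice the `selfies` package provides the canonical encoder/decoder and guarantees that every token sequence decodes to a valid molecule, which is what makes the representation attractive for generative models.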
“…The variational autoencoder (VAE) model utilized in this study was previously trained on more than 1.2 M small molecules contained within the ZINC data set plus a number of common chemical screening libraries. It was our anticipation that this pretrained VAE over a large class of small molecules would provide good representations of the 3850 linear probe molecules considered in this work as a rich but interpretable featurization appropriate for our active learning search.…”
Section: Methods
confidence: 99%
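The active-learning loop these excerpts describe, using latent-space features to decide which molecule to measure next, can be illustrated with a minimal upper-confidence-bound (UCB) acquisition step. The surrogate below is a crude stand-in (nearest-labeled-neighbor mean, with uncertainty proportional to distance) rather than the Gaussian-process or neural surrogate such a pipeline would actually use; all vectors and values are invented for the example.

```python
import math

def ucb_select(candidates, labeled, beta=2.0):
    """Pick the candidate latent point with the highest UCB score.

    candidates: list of latent vectors (lists of floats) not yet measured.
    labeled:    list of (latent_vector, measured_property) pairs.
    Predicted mean = property of the nearest labeled point; 'uncertainty'
    grows with distance to it, a crude stand-in for a real surrogate model.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best, best_score = None, -math.inf
    for z in candidates:
        d, y = min((dist(z, zl), yl) for zl, yl in labeled)
        score = y + beta * d  # exploit (predicted y) + explore (uncertainty ~ d)
        if score > best_score:
            best, best_score = z, score
    return best

# Two labeled points in a 2-D latent space; three unmeasured candidates.
labeled = [([0.0, 0.0], 1.0), ([1.0, 0.0], 3.0)]
candidates = [[0.1, 0.0], [0.9, 0.1], [3.0, 3.0]]
print(ucb_select(candidates, labeled))  # -> [3.0, 3.0] (exploration wins)
```

With a large `beta` the rule favors poorly characterized regions of the latent space; shrinking `beta` toward zero makes it greedily pick near the best measured molecule, which is the usual exploration/exploitation dial in Bayesian-guided screening.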
“…26,27 In response to these challenges, various generator architectures have garnered substantial interest. [28][29][30] Early work by Aspuru-Guzik et al. on a SMILES-based variational autoencoder (VAE) opened avenues for optimized compound searches, albeit limited to small molecules. 31,32 Moreover, generators have been explored in ML-assisted material design as well, but they concentrate either on arbitrary design or on theoretical properties.…”
confidence: 99%