2024
DOI: 10.1021/acs.jcim.3c01685

Do Chemformers Dream of Organic Matter? Evaluating a Transformer Model for Multistep Retrosynthesis

Annie M. Westerlund, Siva Manohar Koki, Supriya Kancharla, et al.

Abstract: Synthesis planning of new pharmaceutical compounds is a well-known bottleneck in modern drug design. Template-free methods, such as transformers, have recently been proposed as an alternative to template-based methods for single-step retrosynthetic predictions. Here, we trained and evaluated a transformer model, called the Chemformer, for retrosynthesis predictions within drug discovery. The proprietary data set used for training comprised ∼18 M reactions from literature, patents, and electronic lab notebooks. …
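As context for the template-free, single-step setup described in the abstract, the sketch below shows how a sequence-to-sequence transformer could be queried for retrosynthesis: the product SMILES is encoded and candidate reactant SMILES are decoded with beam search. This is not the Chemformer code base or API; the Hugging Face interface and the checkpoint name are assumptions for illustration only.

```python
# Hedged sketch: querying a generic seq2seq model for single-step retrosynthesis.
# The checkpoint name is hypothetical; the real Chemformer ships its own code
# base and tokenizer, so treat this only as an illustration of the idea.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

CHECKPOINT = "example-org/retrosynthesis-seq2seq"  # hypothetical checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSeq2SeqLM.from_pretrained(CHECKPOINT)

def predict_reactants(product_smiles: str, n_candidates: int = 5) -> list[str]:
    """Return top-ranked reactant-set SMILES for a product, via beam search."""
    inputs = tokenizer(product_smiles, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        num_beams=10,                      # beam search over SMILES tokens
        num_return_sequences=n_candidates,
        max_length=256,
    )
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

# Example: ask for precursor candidates of aspirin.
print(predict_reactants("CC(=O)Oc1ccccc1C(=O)O"))
```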

Cited by 1 publication (2 citation statements)
References: 53 publications

Citation statements:
“…This feature offers the possibility of applying the most suitable contemporary single-step retrosynthesis model complementing the multi-step retrosynthetic process. The ModelZoo currently supports models such as the Chemformer [ 12 , 13 ], MHNreact [ 45 ] and LocalRetro [ 46 ]. Furthermore, we have implemented the functionality of incorporating multiple expansion strategies simultaneously.…”
Section: Introduction
Mentioning confidence: 99%
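The multi-expansion idea quoted above can be pictured with a small sketch: each single-step model implements a common expansion interface, and a combined strategy merges their proposals. The names below are illustrative assumptions, not the actual ModelZoo/AiZynthFinder API.

```python
# Hedged sketch of combining several single-step expansion strategies.
# Interface and function names are illustrative; this is not the real
# AiZynthFinder/ModelZoo API.
from typing import Callable, List

# An expansion strategy maps a target molecule (SMILES) to candidate
# disconnections, each given as a list of reactant SMILES.
ExpansionStrategy = Callable[[str], List[List[str]]]

def combine_strategies(strategies: List[ExpansionStrategy]) -> ExpansionStrategy:
    """Merge proposals from several single-step models, dropping duplicates."""
    def combined(target_smiles: str) -> List[List[str]]:
        seen = set()
        merged: List[List[str]] = []
        for strategy in strategies:
            for reactants in strategy(target_smiles):
                key = tuple(sorted(reactants))
                if key not in seen:
                    seen.add(key)
                    merged.append(reactants)
        return merged
    return combined

# Usage: plug in any models exposing the same callable interface, e.g.
# combined = combine_strategies([chemformer_expand, localretro_expand])
# candidates = combined("CC(=O)Oc1ccccc1C(=O)O")
```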
“…These improvements include a mechanism to prevent the formation of cycles when expanding the search tree. We have also implemented features that do not change the underlying algorithm but make the utilization of expensive models such as Chemformer more effective, including sibling node-expansion and model caching [ 12 ]. Moreover, we have expanded the search capabilities by incorporating additional search algorithms like the Breadth-First Search, Depth First Proof Number Search [ 53 ], and Retro* [ 50 ], within the search sub-package.…”
Section: Introduction
Mentioning confidence: 99%
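The caching and cycle-prevention features mentioned in the statement above can be illustrated with a short sketch: single-step calls are memoized by target SMILES so an expensive model is queried at most once per molecule, and molecules already on the current route are not re-expanded. The expand/is_purchasable interfaces are assumptions, and the depth-limited AND/OR search shown here stands in for, rather than reproduces, the algorithms named in the quote (Breadth-First Search, DFPN, Retro*).

```python
# Hedged sketch: memoizing expensive single-step model calls and preventing
# cycles in a depth-limited multi-step search. Not the AiZynthFinder code.
from functools import lru_cache
from typing import Callable, FrozenSet, List

def route_exists(
    target: str,
    expand: Callable[[str], List[List[str]]],
    is_purchasable: Callable[[str], bool],
    max_depth: int = 5,
) -> bool:
    """AND/OR search: a molecule is solved if it is purchasable, or if some
    disconnection yields reactants that are all solved one level deeper."""

    @lru_cache(maxsize=None)                  # model caching: query each
    def cached_expand(smiles: str) -> tuple:  # molecule at most once
        return tuple(tuple(r) for r in expand(smiles))

    def solved(mol: str, depth: int, route: FrozenSet[str]) -> bool:
        if is_purchasable(mol):
            return True
        if depth == 0 or mol in route:        # depth limit / cycle prevention
            return False
        return any(
            all(solved(r, depth - 1, route | {mol}) for r in reactants)
            for reactants in cached_expand(mol)
        )

    return solved(target, max_depth, frozenset())
```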