2023
DOI: 10.1002/ddr.22121
Artificial intelligence utility for drug development: ChatGPT and beyond

David Gurwitz, Noam Shomron

Abstract: Generative pretrained transformer (GPT) tools, most notably ChatGPT, are making headlines as the next revolution in artificial intelligence (AI). They affect diverse research fields, from biology and medicine to the exact sciences, economics, engineering, and other knowledge-based and technology-driven fields. Novel AI applications, along with challenges, are foreseen in the fields of biomedical …

Cited by 2 publications (1 citation statement)
References 27 publications (23 reference statements)
“…More recently, generative AI language models, particularly Chat Generative Pre-trained Transformer (ChatGPT) [16][17][18], have garnered substantial interest with applications including performance of autonomous research [19], aiding drug discovery [20][21][22], bioinformatic analysis [23,24], among others [25][26][27]. Unlike AlphaFold2 and RoseTTAFold, ChatGPT is trained on natural language datasets and its neural network architecture is tailored towards understanding and generating natural language text rather than structural modelling.…”
Section: Introduction (mentioning, confidence: 99%)