Published: 2020
DOI: 10.1021/acs.chemrestox.0c00264

The Tox21 10K Compound Library: Collaborative Chemistry Advancing Toxicology

Abstract: Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxi…

Cited by 178 publications (161 citation statements)
References: 53 publications
“…With more high-quality standardised data available, the (potential) impact of ML methods in regulatory toxicology is growing [4]. The collection of available toxicity data is increasing, thanks in part to high-throughput screening programs such as ToxCast [5] and Tox21 [6,7], but also with public-private partnerships such as the eTOX and eTRANSAFE projects, which focus on the sharing of (confidential) toxicity data and ML models across companies [8,9]. In any case, no matter which underlying data and ML method is used, it is essential to know or assess if the ML model can be reliably used to make predictions on a new dataset.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
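The closing point of this excerpt, knowing whether a model can reliably be applied to a new dataset, is commonly operationalised as an applicability-domain check. The sketch below is not from the cited paper; it illustrates one simple variant, nearest-neighbour Tanimoto similarity to the training set using RDKit, where the fingerprint settings and the cutoff value are illustrative assumptions rather than validated choices.

```python
# Minimal applicability-domain sketch: flag a query compound as "out of domain"
# when its nearest training-set neighbour (by Morgan-fingerprint Tanimoto
# similarity) falls below an illustrative threshold.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem


def max_tanimoto_to_training(query_smiles, training_smiles, radius=2, n_bits=2048):
    """Highest Tanimoto similarity between the query and any training compound."""
    query_fp = AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(query_smiles), radius, nBits=n_bits
    )
    train_fps = [
        AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), radius, nBits=n_bits)
        for s in training_smiles
    ]
    return max(DataStructs.TanimotoSimilarity(query_fp, fp) for fp in train_fps)


# Toy training set and query; 0.3 is an illustrative cutoff, not a recommended value.
training = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
similarity = max_tanimoto_to_training("c1ccccc1N", training)
print(f"max similarity = {similarity:.2f}, in domain: {similarity >= 0.3}")
```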
“…The interaction between endocrine-disrupting chemicals and nuclear–receptor family proteins can affect the endocrine system in the AOP [4]. The Toxicology in the 21st Century (Tox21) program is a US federal research collaboration between the US Environmental Protection Agency, the National Toxicology Program, the National Center for Advancing Translational Sciences, and the Food and Drug Administration that aims to develop toxicity assessment methods for commercial chemicals, pesticides, food additives/contaminants, and medical products using quantitative high-throughput screening [5,6,7]. The Tox21 10K library consists of approximately 10,000 (10K) chemicals, including nearly 100 million data points obtained by in vitro quantitative high-throughput screening, indicating the toxicological risk of chemical compounds obtained using an in silico approach [8,9].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…The data must be well organised and structured in databases (e.g. Integrated Chemical Environment (ICE) [44], Tox21 [79], ToxCast [80], ChEMBL [70], PubChem [71]) and made publicly available [81]. In addition, all of the scripts used to process or model the data should be available.…”
Section: Integrative Knowledge-driven Experimental Design for Reducing Animal Testing (citation type: mentioning)
confidence: 99%
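As an illustration of the public-availability point in this excerpt, the sketch below shows one programmatic route to a commonly used Tox21 benchmark. It assumes DeepChem is installed; note that `dc.molnet.load_tox21()` retrieves the Tox21 Data Challenge subset distributed through MoleculeNet, not the full 10K library hosted by EPA, NCATS, and NTP.

```python
# Minimal sketch: load a pre-featurized Tox21 benchmark subset via DeepChem's
# MoleculeNet loader, which returns the assay endpoints, the dataset splits,
# and the transformers applied to the data.
import deepchem as dc

tasks, datasets, transformers = dc.molnet.load_tox21(featurizer="ECFP")
train, valid, test = datasets

print(tasks)          # 12 nuclear-receptor and stress-response assay endpoints
print(train.X.shape)  # circular-fingerprint feature matrix for the training split
```

For the full 10K library and the underlying screening data, the partner portals named in the abstract (EPA, NCATS, NTP) remain the primary sources; the loader above is only a convenient benchmark entry point.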