The U.S. Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS), and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. This chemical prioritization research program, entitled "ToxCast," is being initiated with the purpose of developing the ability to forecast toxicity based on bioactivity profiling. The proof-of-concept phase of ToxCast will focus upon chemicals with an existing, rich toxicological database in order to provide an interpretive context for the ToxCast data. This set of several hundred reference chemicals will represent numerous structural classes and phenotypic outcomes, including tumorigens, developmental and reproductive toxicants, neurotoxicants, and immunotoxicants. The ToxCast program will evaluate chemical properties and bioactivity profiles across a broad spectrum of data domains: physical-chemical, predicted biological activities based on existing structure-activity models, biochemical properties based on HTS assays, cell-based phenotypic assays, and genomic and metabolomic analyses of cells. These data will be generated through a series of external contracts, along with collaborations across EPA, with the National Toxicology Program, and with the National Institutes of Health Chemical Genomics Center. The resulting multidimensional data set provides an informatics challenge requiring appropriate computational methods for integrating various chemical, biological, and toxicological data into profiles and models predicting toxicity.
Background: In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency's National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health Chemical Genomics Center entered into an agreement on "high throughput screening, toxicity pathway profiling, and biological interpretation of findings." In 2010, the U.S. Food and Drug Administration (FDA) joined the collaboration, known informally as Tox21.
Objectives: The Tox21 partners agreed to develop a vision and devise an implementation strategy to shift the assessment of chemical hazards away from traditional experimental animal toxicology studies to one based on target-specific, mechanism-based biological observations largely obtained using in vitro assays.
Discussion: Here we outline the efforts of the Tox21 partners up to the time the FDA joined the collaboration, describe the approaches taken to develop the science and technologies that are currently being used, assess the current status, and identify problems that could impede further progress, as well as suggest approaches to address those problems.
Conclusion: Tox21 faces some very difficult issues. However, we are making progress in integrating data from diverse technologies and end points into what is effectively a systems-biology approach to toxicology. This can be accomplished only when comprehensive knowledge is obtained with broad coverage of chemical and biological/toxicological space. The efforts thus far reflect the initial stage of an exceedingly complicated program, one that will likely take decades to fully achieve its goals. However, even at this stage, the information obtained has attracted the attention of the international scientific community, and we believe these efforts foretell the future of toxicology.
Background: Chemical toxicity testing is being transformed by advances in biology and computer modeling, concerns over animal use, and the thousands of environmental chemicals lacking toxicity data. The U.S. Environmental Protection Agency's ToxCast program aims to address these concerns by screening and prioritizing chemicals for potential human toxicity using in vitro assays and in silico approaches.
Objectives: This project aims to evaluate the use of in vitro assays for understanding the types of molecular and pathway perturbations caused by environmental chemicals and to build initial prioritization models of in vivo toxicity.
Methods: We tested 309 chemicals, mostly pesticide actives, in 467 assays across nine technologies, including high-throughput cell-free assays and cell-based assays, in multiple human primary cells and cell lines plus rat primary hepatocytes. Both individual and composite scores for effects on genes and pathways were analyzed.
Results: Chemicals displayed a broad spectrum of activity at the molecular and pathway levels. We saw many expected interactions, including endocrine and xenobiotic metabolism enzyme activity. Chemicals ranged in promiscuity across pathways, from no activity to affecting dozens of pathways. We found a statistically significant inverse association between the number of pathways perturbed by a chemical at low in vitro concentrations and the lowest in vivo dose at which a chemical causes toxicity. We also found associations between a small set of in vitro assays and rodent liver lesion formation.
Conclusions: This approach promises to provide meaningful data on the thousands of untested environmental chemicals and to guide targeted testing of environmental contaminants.
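The inverse association reported in the Results (more pathways perturbed at low in vitro concentrations tracking with lower in vivo toxic doses) is the kind of monotonic relationship a rank correlation can quantify. The sketch below is a generic, pure-Python Spearman correlation applied to made-up numbers, not the study's actual statistical analysis or data:

```python
# Illustrative Spearman rank correlation (stdlib only).
# All data values below are hypothetical, not results from the study.

def ranks(values):
    """Assign 1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1          # average of tied positions
        for k in range(i, j + 1):
            out[order[k]] = avg_rank
        i = j + 1
    return out

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical pairs: (pathways perturbed, lowest in vivo effect dose).
pathway_counts = [2, 5, 8, 12, 20, 30]
lowest_doses = [500, 300, 120, 80, 40, 10]   # mg/kg/day, invented
rho = spearman(pathway_counts, lowest_doses)  # strongly negative here
```

A negative rho on real data would correspond to the inverse association described in the abstract; the published analysis may have used a different statistic.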
The hypothesis has been put forward that humans and wildlife species have suffered adverse health effects after exposure to endocrine-disrupting chemicals. Reported adverse effects include declines in populations, increases in cancers, and reduced reproductive function. The U.S. Environmental Protection Agency sponsored a workshop in April 1995 to bring together interested parties in an effort to identify research gaps related to this hypothesis and to establish priorities for future research activities. Approximately 90 invited participants were organized into work groups developed around the principal reported health effects (carcinogenesis, reproductive toxicity, neurotoxicity, and immunotoxicity) as well as along the risk assessment paradigm (hazard identification, dose-response assessment, exposure assessment, and risk characterization). Attention focused on both ecological and human health effects. In general, the work groups felt that the hypothesis warranted a concerted research effort to evaluate its validity and that research should focus primarily on effects on the development of reproductive capability, on improved exposure assessment, and on the effects of mixtures. This report summarizes the discussions of the work groups and details the recommendations for additional research.
Background: A recent review by the International Agency for Research on Cancer (IARC) updated the assessments of the > 100 agents classified as Group 1, carcinogenic to humans (IARC Monographs Volume 100, parts A–F). This exercise was complicated by the absence of a broadly accepted, systematic method for evaluating mechanistic data to support conclusions regarding human hazard from exposure to carcinogens.
Objectives and Methods: IARC therefore convened two workshops in which an international Working Group of experts identified 10 key characteristics, one or more of which are commonly exhibited by established human carcinogens.
Discussion: These characteristics provide the basis for an objective approach to identifying and organizing results from pertinent mechanistic studies. The 10 characteristics are the abilities of an agent to 1) act as an electrophile either directly or after metabolic activation; 2) be genotoxic; 3) alter DNA repair or cause genomic instability; 4) induce epigenetic alterations; 5) induce oxidative stress; 6) induce chronic inflammation; 7) be immunosuppressive; 8) modulate receptor-mediated effects; 9) cause immortalization; and 10) alter cell proliferation, cell death, or nutrient supply.
Conclusion: We describe the use of the 10 key characteristics to conduct a systematic literature search focused on relevant end points and construct a graphical representation of the identified mechanistic information. Next, we use benzene and polychlorinated biphenyls as examples to illustrate how this approach may work in practice. The approach described is similar in many respects to those currently being implemented by the U.S. EPA's Integrated Risk Information System Program and the U.S. National Toxicology Program.
Citation: Smith MT, Guyton KZ, Gibbons CF, Fritz JM, Portier CJ, Rusyn I, DeMarini DM, Caldwell JC, Kavlock RJ, Lambert P, Hecht SS, Bucher JR, Stewart BW, Baan R, Cogliano VJ, Straif K. 2016. Key characteristics of carcinogens as a basis for organizing data on mechanisms of carcinogenesis. Environ Health Perspect 124:713–721; http://dx.doi.org/10.1289/ehp.1509912
High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, clearance, and exposure. Hepatic metabolic clearance and plasma protein binding were experimentally measured for 239 ToxCast Phase I chemicals. The experimental data were used in a population-based in vitro-to-in vivo extrapolation model to estimate the daily human oral dose, called the oral equivalent dose, necessary to produce steady-state in vivo blood concentrations equivalent to in vitro AC50 (concentration at 50% of maximum activity) or lowest effective concentration values across more than 500 in vitro assays. The estimated steady-state oral equivalent doses associated with the in vitro assays were compared with chronic aggregate human oral exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. A total of 18 (9.9%) chemicals for which human oral exposure estimates were available had oral equivalent doses at levels equal to or less than the highest estimated U.S. population exposures. Ranking the chemicals by nominal assay concentrations would have resulted in different chemicals being prioritized. The in vitro assay endpoints with oral equivalent doses lower than the human exposure estimates included cell growth kinetics, cytokine and cytochrome P450 expression, and cytochrome P450 inhibition. The incorporation of dosimetry and exposure provides necessary context for the interpretation of in vitro toxicity screening data and is an important consideration in determining chemical testing priorities.
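The reverse-dosimetry step described above can be sketched in code. The following is a minimal illustration, not the published population-based model: the well-stirred liver formula, renal filtration of unbound chemical as the only other clearance route, and the parameter values (hepatic blood flow, glomerular filtration rate) are all simplifying assumptions for demonstration, with units deliberately loose.

```python
# Sketch of reverse dosimetry (in vitro-to-in vivo extrapolation).
# Assumes a simple steady-state model in which clearance occurs only
# via hepatic metabolism (well-stirred liver) and renal filtration of
# the unbound fraction. Parameter values are illustrative assumptions,
# not measured data from the study.

HEPATIC_BLOOD_FLOW = 90.0   # L/h, assumed adult value
GFR = 6.7                   # L/h, assumed glomerular filtration rate

def steady_state_css(cl_int, f_ub, dose_rate=1.0):
    """Steady-state blood concentration for a constant oral dose rate
    (default 1 mg/kg/day), given intrinsic hepatic clearance cl_int
    (L/h) and fraction unbound in plasma f_ub."""
    hepatic_cl = (HEPATIC_BLOOD_FLOW * f_ub * cl_int) / (
        HEPATIC_BLOOD_FLOW + f_ub * cl_int)
    renal_cl = GFR * f_ub
    return dose_rate / (hepatic_cl + renal_cl)

def oral_equivalent_dose(ac50, cl_int, f_ub):
    """Daily oral dose (mg/kg/day) whose steady-state concentration
    equals the in vitro AC50 (same units as steady_state_css output)."""
    return ac50 / steady_state_css(cl_int, f_ub)
```

Comparing such oral equivalent doses against exposure estimates, as the abstract describes, is what flags chemicals whose bioactive concentrations may be reached at real-world exposure levels; higher clearance or lower protein binding pushes the oral equivalent dose upward.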
The field of toxicology is on the cusp of a major transformation in how the safety and hazard of chemicals are evaluated for potential effects on human health and the environment. Brought on by the recognition of the limitations of the current paradigm in terms of cost, time, and throughput, combined with the ever-increasing power of modern biological tools to probe mechanisms of chemical-biological interactions at finer and finer resolutions, 21st century toxicology is rapidly taking shape. A key element of the new approach is a focus on the molecular and cellular pathways that are the targets of chemical interactions. By understanding toxicity in this manner, we begin to learn how chemicals cause toxicity, as opposed to merely what diseases or health effects they might cause. This deeper understanding leads to increasing confidence in identifying which populations might be at risk, significant susceptibility factors, and key influences on the shape of the dose-response curve. The U.S. Environmental Protection Agency (EPA) initiated the ToxCast, or "toxicity forecaster", program 5 years ago to gain understanding of the strengths and limitations of the new approach by starting to test relatively large numbers (hundreds) of chemicals against an equally large number of biological assays. Using computational approaches, the EPA is building decision support tools based on ToxCast in vitro screening results to help prioritize chemicals for further investigation, as well as developing predictive models for a number of health outcomes. This perspective provides a summary of the initial, proof-of-concept Phase I of ToxCast that has laid the groundwork for the next phases and future directions of the program.
Objective: Thousands of chemicals are in common use, but only a portion of them have undergone significant toxicologic evaluation, leading to the need to prioritize the remainder for targeted testing. To address this issue, the U.S. Environmental Protection Agency (EPA) and other organizations are developing chemical screening and prioritization programs. As part of these efforts, it is important to catalog, from widely dispersed sources, the toxicology information that is available. The main objective of this analysis is to define a list of environmental chemicals that are candidates for the U.S. EPA screening and prioritization process, and to catalog the available toxicology information.
Data sources: We are developing ACToR (Aggregated Computational Toxicology Resource), which combines information for hundreds of thousands of chemicals from > 200 public sources, including the U.S. EPA, National Institutes of Health, Food and Drug Administration, corresponding agencies in Canada, Europe, and Japan, and academic sources.
Data extraction: ACToR contains chemical structure information; physical–chemical properties; in vitro assay data; tabular in vivo data; summary toxicology calls (e.g., a statement that a chemical is considered to be a human carcinogen); and links to online toxicology summaries. Here, we use data from ACToR to assess the toxicity data landscape for environmental chemicals.
Data synthesis: We show results for a set of 9,912 environmental chemicals being considered for analysis as part of the U.S. EPA ToxCast screening and prioritization program. These include high- and medium-production-volume chemicals, pesticide active and inert ingredients, and drinking water contaminants.
Conclusions: Approximately two-thirds of these chemicals have at least limited toxicity summaries available. About one-quarter have been assessed in at least one highly curated toxicology evaluation database such as the U.S. EPA Toxicology Reference Database, U.S. EPA Integrated Risk Information System, and the National Toxicology Program.