Advances in computer science and hardware, combined with equally significant developments in molecular biology and chemistry, are providing toxicology with a powerful new tool box. This tool box of computational models promises to increase the efficiency and the effectiveness by which the hazards and risks of environmental chemicals are determined. Computational toxicology focuses on applying these tools across many scales, including vastly increasing the numbers of chemicals and the types of biological interactions that can be evaluated. In addition, knowledge of toxicity pathways gathered within the tool box will be directly applicable to the study of biological responses across a range of dose levels, including those more likely to be representative of exposures to the human population. Progress in this field will facilitate the transformative shift called for in the recent report on toxicology in the 21st century by the National Research Council. This review surveys the state of the art in many areas of computational toxicology and points to several hurdles that will be important to overcome as the field moves forward. Proof-of-concept studies need to clearly demonstrate the additional predictive power gained from these tools. More researchers need to become comfortable working with both the data-generating tools and the computational modeling capabilities, and regulatory authorities must show a willingness to embrace new approaches as they gain scientific acceptance. The next few years should witness the early fruits of these efforts, but as the National Research Council indicates, the paradigm shift will take a long-term investment and commitment to reach full potential.
Lifetime cancer or unit risk estimates for TRI have been calculated by the EPA on the basis of metabolized dose-tumor incidence relationships. Previously, it was common practice to directly extrapolate exposure dose-tumor incidence data from laboratory animal studies to predict cancer risks in humans. Such direct species-to-species extrapolations, however, do not take into account potentially important species differences in systemic uptake, tissue distribution, metabolism, deposition at the site(s) of action, and elimination. The consideration and use of pharmacokinetic and metabolic data can significantly reduce, though not eliminate, uncertainties inherent in species-to-species, route-to-route, and high- to low-dose extrapolations. The total amount of TRI metabolized was considered in the most recent EPA Health Assessment Document for Trichloroethylene to be the effective dose (EFD) producing tumors. Exposure dose-metabolism relationships were determined from direct measurement data in inhalation and oral dosing studies in mice and rats. The magnitude of TRI metabolism in these two species closely approximated body surface area. Thus, it was assumed that the amount of TRI metabolized per square meter of surface area was equivalent among species when calculating human equivalent doses from the animal data. Direct measurement data from an inhalation study in humans were used to calculate the amount of TRI metabolized and the unit risk estimate when a person inhales 1 microgram TRI per cubic meter continuously for 24 h. The EPA Cancer Assessment Group (CAG) elected to use this risk estimate for TRI in air, since it was calculated on the basis of a human metabolized dose rather than unit risk estimates based on animal studies. The current survey of literature and ongoing research uncovered no new animal or human studies in which TRI metabolites were directly measured, which would be any more suitable for use in estimating the total metabolized dose of TRI. 
On the basis of information now available, it is appropriate to continue to use the total amount of TRI metabolized as the EFD producing tumors in the liver. Use of the total amount metabolized represents an important "step in the right direction" in reducing uncertainties in interspecies extrapolations of data on a chemical such as TRI. TRI is believed to be metabolically activated to a reactive intermediate(s), although the identity of the intermediate(s) is unclear. There is evidence that formation of reactive intermediate(s) and TRI hepatotoxicity are directly proportional to the overall extent of TRI metabolism. (ABSTRACT TRUNCATED AT 400 WORDS)
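The surface-area equivalence assumption described above can be sketched numerically. If the metabolized dose per square meter of body surface area is taken to be equivalent across species, and surface area scales roughly as body weight to the 2/3 power, then a mg/kg dose scales across species by the cube root of the body-weight ratio. The body weights and dose below are illustrative values chosen for the example, not figures from the EPA assessment:

```python
def human_equivalent_dose(animal_dose_mg_per_kg, animal_bw_kg, human_bw_kg=70.0):
    """Scale a mg/kg dose across species assuming an equal dose per unit of
    body surface area. Since BSA ~ BW**(2/3), the mg/kg dose scales by the
    cube root of the body-weight ratio."""
    return animal_dose_mg_per_kg * (animal_bw_kg / human_bw_kg) ** (1.0 / 3.0)

# Illustrative: a 100 mg/kg metabolized dose in a 0.03 kg mouse maps to a
# much smaller mg/kg human equivalent dose (~7.5 mg/kg for a 70 kg adult).
hed = human_equivalent_dose(100.0, 0.03)
```

This is the standard allometric shortcut for "equal dose per unit surface area"; the actual assessment worked from measured exposure dose–metabolism relationships rather than a single scaling factor.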
For more than three decades chronic studies in rodents have been the benchmark for assessing the potential long-term toxicity, and particularly the carcinogenicity, of chemicals. With doses typically administered for about 2 years (18 months to lifetime), the rodent bioassay has been an integral component of testing protocols for food additives, pesticides, pharmaceuticals, industrial chemicals, and all manner of byproducts and environmental contaminants. Over time, the data from these studies have been used to address an increasing diversity of questions related to the assessment of human health risks, adding complexity to study design and interpretation. An earlier ILSI RSI working group developed a set of principles for the selection of doses for chronic rodent studies (ILSI, 1997). The present report builds on that work, examining some of the issues that arise and offering new perspectives and approaches for putting the principles into practice. Dose selection is considered both from the prospective viewpoint of the choosing of dose levels for a study and from the retrospective interpretation of study results in light of the doses used. A main theme of this report is that the purposes and objectives of chronic rodent studies vary and should be clearly defined in advance. Dose placement, then, should be optimized to achieve study objectives. For practical reasons, most chronic studies today must be designed to address multiple objectives, often requiring trade-offs and innovative approaches in study design. A systematic approach to dose selection should begin with recognition that the design of chronic studies occurs in the context of a careful assessment of the accumulated scientific information on the test substance, the relevant risk management questions, priorities and mandates, and the practical limitations and constraints on available resources. A stepwise process is described. 
The aim is to increase insofar as possible the utility of an expensive and time-consuming experiment. The kinds of data that are most commonly needed for dose selection and for understanding the dose-related results of chronic rodent studies, particularly carcinogenicity studies, are discussed as "design/interpretation factors." They comprise both the inherent characteristics of the test substance and indicators of biological damage, perturbation or stress among the experimental animals. They may be primary toxicity endpoints, predictors or indicators of appropriate dose selection, or indicators of conditions to be avoided in dose selection. The application and interpretation of design/interpretation factors is conditioned by the study objectives-what is considered desirable will depend on the strategy for choice of doses that is being followed. The challenge is to select doses that accommodate all of the issues raised by the relevant design/interpretation factors. Three case studies are presented here that illustrate the interplay between study objectives and the design and selection of doses for chronic rodent studies. These exa...
Physiologically based pharmacokinetic (PBPK) modeling is a well-established toxicological tool designed to relate exposure to a target tissue dose. The emergence of federal and state programs for environmental health tracking and the availability of exposure monitoring through biomarkers create the opportunity to apply PBPK models to estimate exposures to environmental contaminants from urine, blood, and tissue samples. However, reconstructing exposures for large populations is complicated by often having too few biomarker samples, large uncertainties about exposures, and large interindividual variability. In this paper, we use an illustrative case study to identify some of these difficulties and describe a process for confronting them by reconstructing population-scale exposures using Bayesian inference. The application consists of interpreting biomarker data from eight adult males with controlled exposures to trichloroethylene (TCE) as if the biomarkers were random samples from a large population with unknown exposure conditions. The TCE concentrations in blood from the individuals fell into two distinctly different groups, even though the individuals were simultaneously in a single exposure chamber. We successfully reconstructed the exposure scenarios for both subgroups, although the reconstruction of one subgroup differs from what is believed to be the true experimental conditions. We were, however, unable to predict with high certainty the concentration of TCE in air.
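The Bayesian exposure-reconstruction idea can be illustrated with a minimal sketch. Here a one-parameter proportional model stands in for a full PBPK model, a handful of simulated blood biomarker values play the role of the population samples, and a Metropolis sampler recovers the posterior over the unknown air concentration. All numerical values (the uptake factor `K_MEAN`, its lognormal spread, the true air level) are assumptions for the example, not parameters from the study:

```python
import math
import random

random.seed(0)

# Toy forward model: blood TCE concentration proportional to air concentration.
# K_MEAN and K_SD are assumed values standing in for PBPK-derived kinetics.
K_MEAN, K_SD = 10.0, 0.3  # uptake factor and its lognormal (log-scale) spread

def simulate_blood(c_air, k):
    return c_air * k

# "Observed" biomarkers: eight blood samples generated at a true air level of 1.0,
# with lognormal inter-individual variability in the uptake factor.
TRUE_C_AIR = 1.0
observations = [
    simulate_blood(TRUE_C_AIR, K_MEAN * math.exp(random.gauss(0.0, K_SD)))
    for _ in range(8)
]

def log_likelihood(c_air):
    """Lognormal likelihood of the blood samples given an air concentration,
    folding inter-individual variability into the residual spread."""
    if c_air <= 0:
        return -math.inf
    ll = 0.0
    for y in observations:
        resid = math.log(y) - math.log(c_air * K_MEAN)
        ll += -0.5 * (resid / K_SD) ** 2 - math.log(K_SD * y * math.sqrt(2 * math.pi))
    return ll

def posterior_samples(n=5000, step=0.2):
    """Metropolis sampler over log(c_air) with a vague lognormal prior."""
    x = 0.0  # log(c_air), starting at c_air = 1.0
    lp = log_likelihood(math.exp(x)) - 0.5 * (x / 2.0) ** 2
    samples = []
    for _ in range(n):
        x_new = x + random.gauss(0.0, step)
        lp_new = log_likelihood(math.exp(x_new)) - 0.5 * (x_new / 2.0) ** 2
        if math.log(random.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(math.exp(x))
    return samples

post = posterior_samples()
estimate = sorted(post)[len(post) // 2]  # posterior median of c_air
```

With only eight samples and sizeable inter-individual variability, the posterior remains wide, which mirrors the paper's finding that the air concentration could not be predicted with high certainty even when the exposure scenario was broadly recovered.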