Advances in computer science and hardware, combined with equally significant developments in molecular biology and chemistry, are providing toxicology with a powerful new tool box. This tool box of computational models promises to increase the efficiency and the effectiveness by which the hazards and risks of environmental chemicals are determined. Computational toxicology focuses on applying these tools across many scales, including vastly increasing the numbers of chemicals and the types of biological interactions that can be evaluated. In addition, knowledge of toxicity pathways gathered within the tool box will be directly applicable to the study of biological responses across a range of dose levels, including those more likely to be representative of exposures to the human population. Progress in this field will facilitate the transformative shift called for in the recent report on toxicology in the 21st century by the National Research Council. This review surveys the state of the art in many areas of computational toxicology and points to several hurdles that will be important to overcome as the field moves forward. Proof-of-concept studies need to clearly demonstrate the additional predictive power gained from these tools. More researchers need to become comfortable working with both the data-generating tools and the computational modeling capabilities, and regulatory authorities must show a willingness to embrace new approaches as they gain scientific acceptance. The next few years should witness the early fruits of these efforts, but as the National Research Council indicates, the paradigm shift will require a long-term investment and commitment to reach its full potential.
Lifetime cancer or unit risk estimates for TRI have been calculated by the EPA on the basis of metabolized dose-tumor incidence relationships. Previously, it was common practice to extrapolate exposure dose-tumor incidence data directly from laboratory animal studies to predict cancer risks in humans. Such direct species-to-species extrapolations, however, do not take into account potentially important species differences in systemic uptake, tissue distribution, metabolism, deposition at the site(s) of action, and elimination. The consideration and use of pharmacokinetic and metabolic data can significantly reduce, though not eliminate, uncertainties inherent in species-to-species, route-to-route, and high- to low-dose extrapolations. The total amount of TRI metabolized was considered in the most recent EPA Health Assessment Document for Trichloroethylene to be the effective dose (EFD) producing tumors. Exposure dose-metabolism relationships were determined from direct measurement data in inhalation and oral dosing studies in mice and rats. The magnitude of TRI metabolism in these two species scaled closely with body surface area. Thus, it was assumed that the amount of TRI metabolized per square meter of surface area was equivalent among species when calculating human equivalent doses from the animal data. Direct measurement data from an inhalation study in humans were used to calculate the amount of TRI metabolized and the unit risk estimate when a person inhales 1 microgram TRI per cubic meter continuously for 24 h. The EPA Cancer Assessment Group (CAG) elected to use this risk estimate for TRI in air, since it was calculated on the basis of a human metabolized dose rather than unit risk estimates based on animal studies. The current survey of the literature and ongoing research uncovered no new animal or human studies in which TRI metabolites were directly measured that would be more suitable for use in estimating the total metabolized dose of TRI.
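The surface-area equivalence assumption described above can be sketched in a few lines. This is an illustrative simplification, not the EPA's actual calculation: the allometric constant k and the body weights are assumed values, and the common approximation BSA ≈ k·BW^(2/3) stands in for measured surface areas.

```python
# Illustrative sketch of body-surface-area dose scaling (assumed values,
# not the EPA assessment's numbers).

def surface_area_m2(body_weight_kg, k=0.11):
    """Approximate body surface area via the allometric relation
    BSA ~= k * BW^(2/3); k ~= 0.11 is a commonly used rough constant."""
    return k * body_weight_kg ** (2.0 / 3.0)

def human_equivalent_dose(animal_dose_mg, animal_bw_kg, human_bw_kg=70.0):
    """Scale a metabolized dose so that mg per m^2 of surface area is
    equal across species, per the equivalence assumption above."""
    dose_per_m2 = animal_dose_mg / surface_area_m2(animal_bw_kg)
    return dose_per_m2 * surface_area_m2(human_bw_kg)

# Example: 1 mg metabolized in a 25 g mouse, scaled to a 70 kg human.
hed = human_equivalent_dose(1.0, 0.025)
```

By construction, the human-equivalent dose divided by human surface area equals the animal dose divided by animal surface area; the scaling ratio is (BW_human/BW_animal)^(2/3).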
On the basis of information now available, it is appropriate to continue to use the total amount of TRI metabolized as the EFD producing tumors in the liver. Use of the total amount metabolized represents an important "step in the right direction" in reducing uncertainties in interspecies extrapolations of data on a chemical such as TRI. TRI is believed to be metabolically activated to a reactive intermediate(s), although the identity of the intermediate(s) is unclear. There is evidence that formation of reactive intermediate(s) and TRI hepatotoxicity are directly proportional to the overall extent of TRI metabolism.
Physiologically based pharmacokinetic (PBPK) modeling is a well-established toxicological tool designed to relate exposure to a target tissue dose. The emergence of federal and state programs for environmental health tracking and the availability of exposure monitoring through biomarkers create the opportunity to apply PBPK models to estimate exposures to environmental contaminants from urine, blood, and tissue samples. However, reconstructing exposures for large populations is complicated by often having too few biomarker samples, large uncertainties about exposures, and large interindividual variability. In this paper, we use an illustrative case study to identify some of these difficulties and propose a process for confronting them by reconstructing population-scale exposures using Bayesian inference. The application consists of interpreting biomarker data from eight adult males with controlled exposures to trichloroethylene (TCE) as if the biomarkers were random samples from a large population with unknown exposure conditions. The TCE concentrations in blood from the individuals fell into two distinctly different groups even though the individuals were simultaneously in a single exposure chamber. We successfully reconstructed the exposure scenarios for both subgroups, although the reconstruction of one subgroup differs from what is believed to be the true experimental conditions. We were, however, unable to predict with high certainty the concentration of TCE in air.
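The exposure-reconstruction idea above can be illustrated with a deliberately tiny model. This sketch is not the paper's PBPK model: the one-compartment steady-state relation, the blood:air partition coefficient, the extraction fraction, and the lognormal error are all assumed for illustration. It infers an unknown air concentration from a handful of blood biomarker values by Bayesian grid inference with a uniform prior.

```python
import math

# Toy Bayesian reconstruction of an air concentration from blood
# biomarkers (all model forms and parameter values assumed).

def predicted_blood(c_air, pb=9.0, extraction=0.6):
    """Steady-state blood level for a toy linear model: uptake scaled
    by a blood:air partition coefficient minus metabolic extraction."""
    return c_air * pb * (1.0 - extraction)

def log_likelihood(c_air, observations, sigma=0.2):
    """Gaussian likelihood on log-concentration (lognormal error)."""
    mu = math.log(predicted_blood(c_air))
    return sum(-0.5 * ((math.log(y) - mu) / sigma) ** 2 for y in observations)

def posterior_mode(observations, grid):
    """With a uniform prior over the grid, the posterior mode is the
    maximum-likelihood grid point."""
    return max(grid, key=lambda c: log_likelihood(c, observations))

# Simulated biomarkers from a "true" exposure of 50 ppm, with spread:
true_air = 50.0
obs = [predicted_blood(true_air) * f for f in (0.9, 1.05, 1.1)]
grid = [c / 10.0 for c in range(10, 2001)]  # candidate 1.0 .. 200.0 ppm
estimate = posterior_mode(obs, grid)
```

The recovered mode lands near the true 50 ppm; with real data, a full posterior (not just its mode) would carry the uncertainty that the paper reports for the air concentration.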
For more than three decades chronic studies in rodents have been the benchmark for assessing the potential long-term toxicity, and particularly the carcinogenicity, of chemicals. With doses typically administered for about 2 years (18 months to lifetime), the rodent bioassay has been an integral component of testing protocols for food additives, pesticides, pharmaceuticals, industrial chemicals, and all manner of byproducts and environmental contaminants. Over time, the data from these studies have been used to address an increasing diversity of questions related to the assessment of human health risks, adding complexity to study design and interpretation. An earlier ILSI RSI working group developed a set of principles for the selection of doses for chronic rodent studies (ILSI, 1997). The present report builds on that work, examining some of the issues that arise and offering new perspectives and approaches for putting the principles into practice. Dose selection is considered both from the prospective viewpoint of choosing dose levels for a study and from the retrospective interpretation of study results in light of the doses used. A main theme of this report is that the purposes and objectives of chronic rodent studies vary and should be clearly defined in advance. Dose placement, then, should be optimized to achieve study objectives. For practical reasons, most chronic studies today must be designed to address multiple objectives, often requiring trade-offs and innovative approaches in study design. A systematic approach to dose selection should begin with recognition that the design of chronic studies occurs in the context of a careful assessment of the accumulated scientific information on the test substance, the relevant risk management questions, priorities and mandates, and the practical limitations and constraints on available resources. A stepwise process is described.
The aim is to increase insofar as possible the utility of an expensive and time-consuming experiment. The kinds of data that are most commonly needed for dose selection and for understanding the dose-related results of chronic rodent studies, particularly carcinogenicity studies, are discussed as "design/interpretation factors." They comprise both the inherent characteristics of the test substance and indicators of biological damage, perturbation, or stress among the experimental animals. They may be primary toxicity endpoints, predictors or indicators of appropriate dose selection, or indicators of conditions to be avoided in dose selection. The application and interpretation of design/interpretation factors is conditioned by the study objectives: what is considered desirable will depend on the strategy for choice of doses that is being followed. The challenge is to select doses that accommodate all of the issues raised by the relevant design/interpretation factors. Three case studies are presented here that illustrate the interplay between study objectives and the design and selection of doses for chronic rodent studies. These exa...
A search of the scientific literature was carried out for physicochemical and biological data [i.e., IC50, LD50, Kp (cm/h) for percutaneous absorption, skin/water and tissue/blood partition coefficients, inhibition constants (ki), and metabolic parameters such as Vmax and Km] on 31 organophosphorus pesticides (OPs) to support the development of predictive quantitative structure-activity relationship (QSAR) and physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) models for human risk assessment. Except for work on parathion, chlorpyrifos, and isofenphos, very few modeling data were found on the 31 OPs of interest. The available percutaneous absorption data, partition coefficients, and metabolic parameters were insufficient in number to develop predictive QSAR models. Metabolic kinetic parameters (Vmax, Km) varied according to enzyme source and the manner in which the enzymes were characterized. The metabolic activity of microsomes should be based on the kinetic activity of purified or cDNA-expressed cytochrome P450s (CYPs) and the specific content of each active CYP in tissue microsomes. Similar requirements are needed to assess the activity of tissue A- and B-esterases metabolizing OPs. A limited amount of acetylcholinesterase (AChE), butyrylcholinesterase (BChE), and carboxylesterase (CaE) inhibition and recovery data were found in the literature on the 31 OPs. A program is needed to require the development of physicochemical and biological data to support risk assessment methodologies involving QSAR and PBPK/PD models.
In this review we have examined the status of parameters required by pyrethroid QSAR-PBPK/PD models for assessing health risks. Given the volume of chemical, biological, biochemical, and toxicological information developed on the pyrethroids since 1968, finding suitable parameters for QSAR and PBPK/PD model development was a monumental task. The most useful information came from rat toxicokinetic studies (i.e., absorption, distribution, and excretion), metabolism studies with 14C-cyclopropane- and alcohol-labeled pyrethroids, and the use of known chiral isomers in the metabolism studies and their relation to commercial products. In this review we identify the individual chiral isomers that have been used in published studies and the chiral HPLC columns available for separating them. Chiral HPLC columns are necessary for isomer identification and for developing kinetic values (Vmax and Km) for pyrethroid hydroxylation. Early investigators synthesized analytical standards for key pyrethroid metabolites, and these were used to confirm the identity of urinary metabolites by using TLC. These analytical standards no longer exist and must be resynthesized if further studies on the kinetics of the metabolism of pyrethroids are to be undertaken. In an attempt to circumvent the availability of analytical standards, several CYP450 studies were carried out using the substrate depletion method. This approach does not provide information on the products formed downstream, and may be of limited use in developing human environmental exposure PBPK/PD models that require extensive urinary metabolite data. Hydrolytic standards (i.e., alcohols and acids) were available to investigators who studied the carboxylesterase-catalyzed hydrolysis of several pyrethroid insecticides.
The data generated in these studies are suitable for use in developing human exposure PBPK/PD models. Tissue:blood partition coefficients were developed for the parent pyrethroids and their metabolites by using a published mechanistic model introduced by Poulin and Thiele (2002a,b) and log D pH 7.4 values. The estimated coefficients, especially those of adipose tissue, were too high and had to be corrected by using a procedure in which the proportion of parent or metabolite residues that are unbound to plasma albumin is considered, as described in the GastroPlus model (Simulations Plus, Inc., Lancaster, CA). The literature suggested that Km values be adjusted by multiplying Km by the fraction (decimal amount) of substrate that is unbound to microsomal or CYP protein. Mirfazaelian et al. (2006) used flow- and diffusion-limited compartments in their deltamethrin model. The addition of permeability-area (PA) terms imposing diffusion limits on compartments such as fat and slowly perfused tissue enabled the investigators to bring model predictions in line with in vivo data. There appear to be large differences in the manner and rate of absorption of the pyrethroids from the gastrointestinal tract, implying that advanced compartmental absorption and transit (ACAT) models of the GI tract need to be included in PBPK mo...
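The adjustments described above reduce to simple multiplicative and rate-law forms. The sketch below states them explicitly; the functional forms are assumed simplifications of the general ideas (unbound-fraction scaling and a permeability-area flux term), not the exact published or proprietary GastroPlus procedures.

```python
# Hedged sketch of three parameter adjustments discussed above
# (assumed simplified forms, illustrative values only).

def km_unbound(km_apparent, fu_mic):
    """Adjust an apparent Km by the fraction of substrate unbound in
    the microsomal incubation (fu_mic, between 0 and 1)."""
    return km_apparent * fu_mic

def kp_albumin_corrected(kp_predicted, fu_plasma):
    """Scale a mechanistically predicted tissue:blood partition
    coefficient by the fraction unbound to plasma albumin, lowering
    overestimated coefficients such as those for adipose tissue."""
    return kp_predicted * fu_plasma

def diffusion_limited_uptake(pa, c_blood, c_tissue, p_tissue_blood):
    """Net uptake rate into a diffusion-limited compartment with
    permeability-area product pa; flow-limited models omit this term.
    The flux vanishes when tissue and blood are at equilibrium."""
    return pa * (c_blood - c_tissue / p_tissue_blood)
```

For example, an apparent Km of 10 uM with 30% of substrate unbound gives an unbound Km of 3 uM, and the diffusion-limited flux is zero once the tissue concentration equals the partition coefficient times the blood concentration.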
Our interest in providing parameters for the development of quantitative structure physiologically based pharmacokinetic/pharmacodynamic (QSPBPK/PD) models for assessing health risks of carbamates (USEPA 2005) comes from earlier work with organophosphorus (OP) insecticides (Knaak et al. 2004). Parameters specific to each carbamate are needed in the construction of PBPK/PD models along with their metabolic pathways. Parameters may be obtained by (1) development of QSAR models, (2) collecting pharmacokinetic data, and (3) determining pharmacokinetic parameters by fitting to experimental data. The biological parameters are given in Table 1, Biological Parameters Required for Carbamate Pesticide Physiologically Based Pharmacokinetic/Pharmacodynamic (PBPK/PD) Models (Blancato et al. 2000).
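Option (3) above, determining parameters by fitting to experimental data, can be sketched minimally. The data points, the first-order elimination model, and the grid-search fit below are all hypothetical illustrations, not values from any carbamate dataset.

```python
import math

# Hypothetical sketch of fitting a pharmacokinetic parameter to data:
# a first-order elimination rate k is fit to concentration-time points
# by minimizing the sum of squared errors over a grid of candidates.

def model(c0, k, t):
    """One-compartment first-order elimination: C(t) = C0 * exp(-k t)."""
    return c0 * math.exp(-k * t)

def sse(k, c0, data):
    """Sum of squared errors between model and observed concentrations."""
    return sum((c - model(c0, k, t)) ** 2 for t, c in data)

# Illustrative (time h, concentration mg/L) pairs, roughly k = 0.5/h:
data = [(0.0, 10.0), (1.0, 6.1), (2.0, 3.6), (4.0, 1.4)]
candidates = [i / 1000.0 for i in range(1, 2001)]  # 0.001 .. 2.0 per h
best_k = min(candidates, key=lambda k: sse(k, 10.0, data))
```

In practice a nonlinear least-squares routine would replace the grid search, and the fitted constants would feed the PBPK/PD compartment equations directly.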
The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments, and understand a chemical's progression from the environment to the target tissue within an organism and ultimately to the key steps that trigger an adverse health effect. In this paper, several of the major research activities being sponsored by the Environmental Protection Agency's National Center for Computational Toxicology are highlighted. Potential links between research in computational toxicology and human exposure science are identified. As with the traditional approaches for toxicity testing and hazard assessment, exposure science is required to inform the design and interpretation of high-throughput assays. In addition, common themes inherent throughout National Center for Computational Toxicology research activities are highlighted for emphasis as exposure science advances into the 21st century.