Accurate assessment of kidney function is an important component of determining appropriate drug dosing regimens. Nearly all manufacturer-recommended dosage adjustments are based on creatinine clearance ranges derived from clinical pharmacokinetic studies performed during the drug development process. The Cockcroft-Gault (CG) equation provides an estimate of creatinine clearance and is the equation most commonly used to determine drug dosages in patients with impaired kidney function. The Modification of Diet in Renal Disease (MDRD) study equation has also been proposed for this purpose. Published studies report that drug dosages determined by the two equations do not agree in 10-40% of cases. However, interpretation and comparison of these studies are complicated by the variable creatinine methods used for calculating CG and MDRD estimates, the patient populations studied, and a lack of outcomes data demonstrating the clinical significance of dosing discrepancies. Moreover, the impact of reporting standardized serum creatinine values on the accuracy of the CG equation and corresponding drug dosing regimens has been questioned. Currently, no prospective pharmacokinetic studies have been conducted with use of the MDRD equation to generate dosing recommendations, and limited data are available to support its use in patient populations representing demographic extremes. Collectively, these issues have resulted in considerable confusion among clinicians and have fueled a healthy debate on whether to use the MDRD equation to determine drug dosages. Each of these issues is reviewed, and a proposed algorithm for using creatinine-based kidney function assessments in drug dosing is provided. Knowledge of the advantages, limitations, and clinical role of each equation will facilitate their safe and effective use in drug dosing.
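The two estimates compared above reduce to simple arithmetic on patient variables. A minimal sketch in Python of the standard Cockcroft-Gault formula (creatinine clearance in mL/min, using actual body weight and serum creatinine in mg/dL) and the IDMS-traceable four-variable MDRD equation; the function names and the example patient are illustrative, not from the source:

```python
def cockcroft_gault(age_yr: float, weight_kg: float, scr_mg_dl: float, female: bool) -> float:
    """Estimated CrCl (mL/min) = (140 - age) * weight / (72 * SCr), * 0.85 if female."""
    crcl = (140 - age_yr) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_egfr(scr_mg_dl: float, age_yr: float, female: bool, black: bool) -> float:
    """IDMS-traceable 4-variable MDRD eGFR (mL/min/1.73 m^2)."""
    egfr = 175 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Illustrative patient: 40-year-old, 72 kg male, SCr 1.0 mg/dL
print(round(cockcroft_gault(40, 72, 1.0, False), 1))  # -> 100.0
print(round(mdrd_egfr(1.0, 40, False, False), 1))
```

Note the unit mismatch that underlies part of the dosing-discrepancy debate: CG returns an absolute clearance in mL/min, the unit used in pharmacokinetic studies and product labeling, whereas MDRD is normalized to 1.73 m² of body surface area, so the two values are not directly interchangeable for dosing at body-size extremes.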
Objective: To determine levels of environmental chemotherapy contamination in a new cancer hospital that has exclusively used a closed-system drug transfer device (PhaSeal) for preparing and administering all compatible antineoplastics. Methods: After 6 months of operation, surface samples were collected from pharmacy and nursing areas to determine levels of contamination with cyclophosphamide and ifosfamide. In addition, urine samples were collected from pharmacists, pharmacy technicians, and nurses to determine employee exposure to these agents. All samples were analyzed using liquid chromatography/tandem mass spectrometry. Results: Twenty-one percent (7/34) of surface samples collected tested positive for cyclophosphamide contamination. Twelve percent (4/34) of surface samples tested positive for ifosfamide. To place this into perspective, historical data collected at our outpatient oncology infusion clinic 6 months after converting to PhaSeal from conventional methods of antineoplastic preparation showed 33% (7/21) and 71% (15/21) of samples testing positive for cyclophosphamide and ifosfamide, respectively. The level of ifosfamide contamination found in positive samples at our new hospital also appeared to be lower than in positive samples at the outpatient infusion clinic. In the current study, the urine of one participant (1/11), a pharmacy technician, tested positive for low levels of cyclophosphamide and ifosfamide. By comparison, 71% and 0% of participants tested at the outpatient infusion clinic had positive urine samples prior to and 6 months after implementation of PhaSeal, respectively. Conclusions: Compared with historical levels of contamination in our outpatient oncology infusion clinic, levels of chemotherapy contamination in the new cancer hospital appeared lower. However, some contamination was still present in our new cancer hospital where PhaSeal had been used exclusively.
The Cockcroft-Gault, MDRD, and CKD-EPI equations provide reasonable estimates of kidney function; however, clinicians must understand the limitations when using these estimates for drug regimen design.
Objective. To model the relationship of common pharmacy education assessment data, including student demographics, pre-pharmacy performance, core didactic performance, and external testing measures, to identify predictors of student readiness for advanced pharmacy practice experiences (APPEs). Methods. The associations between 23 predictive covariates from 226 graduating students from 2015-2018 (5786 observations) and APPE readiness as measured by midpoint core APPE scores were modeled. Multiple linear and Poisson regression models with backward selection were used. A selection criterion of p ≥ .10 was used for covariate elimination from the model. Three models were evaluated: average of all midpoint core APPE rotation scores; average of midpoint acute care pharmacy practice and ambulatory care APPE rotation scores; and number of midpoint core clerkship failing scores. Results. The average age of the population at admission was 25.4 ± 4.5 years, 47% were female, and 75.2% had prior degrees. Across the three prediction models, knowledge-retention covariates were the strongest predictors. Total score on the Pharmacy Curriculum Outcomes Assessment was a modest yet consistent predictor across the models. All other significant predictors were unique to the various models. Conclusion. This four-year, population-based modeling study of the relationship of common pharmacy education assessment data to APPE midpoint scores shows a modest correlation with knowledge-based measures. There is a need for greater innovation in this area of research.
Diabetic ketoacidosis (DKA) is a common condition, with wide variation in admission location and clinical practice. We aimed to decrease intensive care unit (ICU) admission for DKA by implementing a standardized, electronic health record-driven clinical care pathway that used subcutaneous insulin, rather than a continuous insulin infusion, for patients with nonsevere DKA. This is a retrospective, observational preintervention to postintervention study of 214 hospital admissions for DKA that evaluated the effect of our intervention on clinical, safety, and cost outcomes. The primary outcome was ICU admission, which decreased from 67.0% to 41.7% (p < .001). Diabetes nurse educator consultation increased from 45.3% to 63.9% (p = .006), and 30-day Emergency Department (ED) return visit decreased from 12.3% to 2.8% (p = .008). Time to initiation of basal insulin increased from 18.19 ± 1.25 hours to 22.47 ± 1.76 hours (p = .05) and reopening of the anion gap increased from 4.7% to 13.9% (p = .02). No changes in ED length of stay (LOS), hospital LOS, hypoglycemia, treatment-induced hypokalemia, 30-day hospital readmission, or inpatient mortality were observed. The implementation of a standardized DKA care pathway using subcutaneous insulin for nonsevere DKA resulted in decreased ICU use and increased diabetes education, without affecting patient safety.
When using a polyarylethersulfone, polyvinylpyrrolidone, and polyamide high-flux HD membrane with a 24R Polyflux dialyzer, vancomycin can be administered during the last hour of dialysis if the dose that is prescribed for intra-dialysis dosing is empirically increased to account for intra-dialytic drug removal.
Introduction: Tenofovir disoproxil fumarate (TDF) has been associated with a greater incidence of bone complications, which might be modified by some concomitantly administered antiretrovirals, possibly through their effect on tenofovir concentrations. We compared bone adverse outcomes among treatment-naïve HIV-infected US veterans initiating efavirenz (EFV)-containing TDF/emtricitabine (FTC) regimens versus those initiating non-EFV-containing TDF/FTC regimens. Methods: Using national Veterans Health Administration (VHA) clinical and administrative data sets, we identified a cohort of treatment-naïve HIV-infected veterans without bone disease who initiated therapy with TDF/FTC plus EFV, rilpivirine, elvitegravir/cobicistat, or ritonavir-boosted protease inhibitors in 2003–2015. The primary composite adverse bone outcome was the unadjusted incidence rate (IR) of osteoporosis, osteopenia, or fragility fracture (any hip, wrist, or spine fracture). To account for selection bias and confounding, we used inverse probability of treatment-weighted Cox proportional hazards regression models to calculate adjusted hazard ratios (HRs) for each outcome associated with EFV + TDF/FTC versus each non-EFV-containing TDF/FTC regimen. Results: Of 33,048 HIV-positive veterans, 7161 initiated a TDF/FTC-containing regimen (mean age, 50 years; baseline CD4 < 200 cells/mm3, 33.3%; HIV-1 RNA > 100,000 copies/ml, 22.3%; mean follow-up, 13.0 months). Of these, 4137 initiated EFV-containing and 3024 non-EFV-containing regimens. Veterans initiating EFV- versus non-EFV-containing TDF/FTC regimens had a lower IR of the composite bone outcome (29.3 vs. 41.4 per 1000 patient-years), with significant risk reductions for this outcome [HR, 0.69; 95% confidence interval (CI), 0.58–0.83] and fragility fracture (HR, 0.59; 95% CI, 0.44–0.78). Conclusion: EFV + TDF/FTC is associated with a lower risk of adverse bone outcomes compared with other TDF-containing regimens in the VHA. Funding: Bristol-Myers Squibb. Electronic supplementary material: The online version of this article (10.1007/s40121-018-0194-1) contains supplementary material, which is available to authorized users.