African Americans bear a disproportionate burden of aggressive, young-onset breast cancer. Genomic testing for inherited predisposition to breast cancer is increasingly common in clinical practice, but comprehensive mutation profiles remain unknown for most minority populations. Using a validated targeted-capture and multiplex sequencing approach, we evaluated 289 patients who self-identified as African American, had primary invasive breast cancer, and had personal or family cancer history or tumor characteristics associated with high genetic risk, screening for all classes of germline mutations in known breast cancer susceptibility genes. Sixty-eight damaging germline mutations were identified in 65 (22%; 95% confidence interval [CI] 18–28%) of the 289 subjects. Proportions of patients with unequivocally damaging mutations in a breast cancer gene were 26% (47/180; 95% CI 20–33%) of those diagnosed with breast cancer before age 45; 25% (26/103; 95% CI 17–35%) of those with triple-negative breast cancer (TNBC); 29% (45/156; 95% CI 22–37%) of those with a first- or second-degree relative with breast cancer before age 60 or with ovarian cancer; and 57% (4/7; 95% CI 18–90%) of those with both breast and ovarian cancer. Of patients with mutations, 80% (52/65) carried mutations in BRCA1 or BRCA2 and 20% (13/65) carried mutations in PALB2, CHEK2, BARD1, ATM, PTEN, or TP53. The mutational allelic spectrum was highly heterogeneous, with 57 different mutations among the 65 patients. Of patients meeting selection criteria other than family history (i.e., young age at diagnosis or TNBC), 48% (64/133) had very limited information about the history of cancer in previous generations of their families. Mutations in BRCA1, BRCA2, or another breast cancer gene occur in one in four African American breast cancer patients with early-onset disease, family history of breast or ovarian cancer, or TNBC.
Each of these criteria defines patients who would benefit from genomic testing and novel therapies targeting DNA repair pathways.
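As a worked check on the reported proportions, the headline 95% CI for 65/289 can be reproduced with a Wilson score interval. The abstract does not state which interval method the authors used, so the choice of the Wilson interval here is an assumption; it happens to round to the reported 18–28%.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n (z=1.96 for 95%)."""
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - margin, center + margin

# 65 of 289 subjects carried a damaging germline mutation.
lo, hi = wilson_ci(65, 289)  # rounds to the reported 18%–28%
```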
Background Risk factors for therapy-related leukemia (TRL), an often lethal late complication of cytotoxic therapy, remain poorly understood and may differ for survivors of different malignancies. Breast cancer (BC) survivors now account for the majority of TRL cases, making study of TRL risk factors in this population a priority. Methods Patients with TRL following cytotoxic therapy for a primary BC were identified from The University of Chicago TRL registry. Those with an available germline DNA sample were screened with a comprehensive gene panel covering known inherited BC susceptibility genes. Clinical and TRL characteristics of all subjects and of those with identified germline mutations are described. Results Nineteen (22%) of 88 BC survivors with TRL had an additional primary cancer, and 40 (57%) of the 70 with available family history had a close relative with breast, ovarian, or pancreatic cancer. Of the 47 subjects with available DNA, 10 (21%) carried a deleterious inherited mutation in BRCA1 (n=3, 6%), BRCA2 (n=2, 4%), TP53 (n=3, 6%), CHEK2 (n=1, 2%), or PALB2 (n=1, 2%). Conclusions BC survivors with TRL have personal and family histories suggestive of inherited cancer susceptibility and frequently carry germline mutations in BC susceptibility genes. These data support a role for these genes in TRL risk and suggest that two lines of work are warranted: long-term follow-up studies of women with germline mutations treated for BC, and functional studies of the effects of heterozygous mutations in these genes on bone marrow function following cytotoxic exposures.
Background In‐hospital cardiac arrest (IHCA) is a major public health problem with significant mortality. A better understanding of where IHCA occurs in hospitals (intensive care unit [ICU] versus monitored [telemetry] ward versus unmonitored ward) could inform strategies for reducing preventable deaths. Methods and Results This is a retrospective study of adult IHCA events in the Get with the Guidelines—Resuscitation database from January 2003 to September 2010. Unadjusted analyses were used to characterize patient, arrest, and hospital‐level characteristics by hospital location of arrest (ICU versus inpatient ward). IHCA event rates and outcomes were plotted over time by arrest location. Among 85 201 IHCA events at 445 hospitals, 59% (50 514) occurred in the ICU compared with 41% (34 687) on the inpatient wards. Compared with ward patients, ICU patients were younger (64±16 years versus 69±14; P<0.001) and more likely to have a presenting rhythm of ventricular tachycardia/ventricular fibrillation (21% versus 17%; P<0.001). In the ICU, the mean event rate per 1000 bed‐days was 0.337 (±0.215), compared with 0.109 (±0.079) for telemetry wards and 0.134 (±0.098) for unmonitored wards. Adjusted mean survival to discharge was 0.140 (0.037) for ICU arrests, compared with 0.106 (0.037) on unmonitored wards and 0.193 (0.074) on telemetry wards. More IHCA events occurred in the ICU than on the inpatient wards, and there was a slight increase in events per 1000 patient bed‐days over time in both locations. Conclusions Survival rates vary by location of IHCA. Optimizing patient assignment to unmonitored versus telemetry wards may contribute to improved survival after IHCA.
Rationale: Two distinct phenotypes of acute respiratory distress syndrome (ARDS) with differential clinical outcomes and responses to randomly assigned treatment have consistently been identified in randomized controlled trial cohorts using latent class analysis. Plasma biomarkers, key components in phenotype identification, currently lack point-of-care assays and represent a barrier to the clinical implementation of phenotypes. Objectives: The objective of this study was to develop models to classify ARDS phenotypes using readily available clinical data only. Methods: Three randomized controlled trial cohorts served as the training data set (ARMA [High vs. Low VT], ALVEOLI [Assessment of Low VT and Elevated End-Expiratory Pressure to Obviate Lung Injury], and FACTT [Fluids and Catheter Treatment Trial]; n = 2,022), and a fourth served as the validation data set (SAILS [Statins for Acutely Injured Lungs from Sepsis]; n = 745). A gradient-boosted machine algorithm was used to develop classifier models using 24 variables (demographics, vital signs, laboratory, and respiratory variables) at enrollment. In two secondary analyses, the ALVEOLI and FACTT cohorts each, individually, served as the validation data set, and the remaining combined cohorts formed the training data set for each analysis. Model performance was evaluated against the latent class analysis-derived phenotype. Measurements and Main Results: For the primary analysis, the model accurately classified the phenotypes in the validation cohort (area under the receiver operating characteristic curve [AUC], 0.95; 95% confidence interval [CI], 0.94–0.96). Using a probability cutoff of 0.5 to assign class, inflammatory biomarkers (IL-6, IL-8, and sTNFR-1; P < 0.0001) and 90-day mortality (38% vs. 24%; P = 0.0002) were significantly higher in the hyperinflammatory phenotype as classified by the model.
Model accuracy was similar when ALVEOLI (AUC, 0.94; 95% CI, 0.92–0.96) and FACTT (AUC, 0.94; 95% CI, 0.92–0.95) were used as the validation cohorts. Significant treatment interactions were observed with the clinical classifier model-assigned phenotypes in both the ALVEOLI (P = 0.0113) and FACTT (P = 0.0072) cohorts. Conclusions: ARDS phenotypes can be accurately identified using machine learning models based on readily available clinical data, which may enable rapid phenotype identification at the bedside.
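The workflow above (train a gradient-boosted classifier on enrollment variables, score a held-out cohort, evaluate by AUC, and assign class at a 0.5 probability cutoff) can be sketched as follows. The trial data are not available here, so the example uses synthetic stand-in features; the cohort sizes and the count of 24 variables come from the abstract, while the feature values, labels, and model hyperparameters are illustrative assumptions, not the authors' actual model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_train, n_val, n_features = 2022, 745, 24  # cohort sizes and variable count from the abstract

# Synthetic stand-ins for the 24 enrollment variables; only the first
# feature carries signal, purely for illustration.
X_train = rng.normal(size=(n_train, n_features))
y_train = (X_train[:, 0] + rng.normal(size=n_train) > 0).astype(int)
X_val = rng.normal(size=(n_val, n_features))
y_val = (X_val[:, 0] + rng.normal(size=n_val) > 0).astype(int)

# Gradient-boosted classifier trained on the combined "training cohorts".
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Score the "validation cohort" and evaluate discrimination by AUC.
prob = model.predict_proba(X_val)[:, 1]
auc = roc_auc_score(y_val, prob)

# A 0.5 probability cutoff assigns the class label, as in the study.
assigned = (prob >= 0.5).astype(int)
```

In the study itself the reference labels were the latent-class-analysis-derived phenotypes rather than synthetic labels, and performance was reported against those.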
In chronic hypersensitivity pneumonitis (CHP), lack of improvement or declining lung function may prompt use of immunosuppressive therapy. We hypothesised that use of azathioprine or mycophenolate mofetil with prednisone reduces adverse events and lung function decline, and improves transplant-free survival. Patients with CHP were identified, and demographic features, pulmonary function tests, incidence of treatment-emergent adverse events (TEAEs), and transplant-free survival were characterised, compared, and analysed between patients stratified by immunosuppressive therapy. A multicentre comparison was performed across four independent tertiary medical centres. Among 131 CHP patients at the University of Chicago medical centre (Chicago, IL, USA), 93 (71%) received immunosuppressive therapy; these patients had worse baseline forced vital capacity (FVC) and diffusing capacity, and increased mortality, compared with those who did not. Compared with patients treated with prednisone alone, TEAEs were 54% less frequent with azathioprine therapy (p=0.04) and 66% less frequent with mycophenolate mofetil (p=0.002). FVC decline and survival were similar between treatment groups. Analyses of datasets from four external tertiary medical centres confirmed these findings. CHP patients who did not receive immunosuppressive therapy had better survival than those who did. Use of mycophenolate mofetil or azathioprine was associated with a decreased incidence of TEAEs and no difference in lung function decline or survival compared with prednisone alone. Early transition to mycophenolate mofetil or azathioprine may be an appropriate therapeutic approach in CHP, but more studies are needed.
Introduction Vaccination programs aim to control the COVID-19 pandemic. However, the relative impacts of vaccine coverage, effectiveness, and capacity on the spread of SARS-CoV-2, in the context of nonpharmaceutical interventions such as mask use and physical distancing, are unclear. Our objective was to examine the impact of vaccination on the control of SARS-CoV-2 using our previously developed agent-based simulation model. Methods We applied our agent-based model to replicate COVID-19-related events in 1) Dane County, Wisconsin; 2) the Milwaukee metropolitan area, Wisconsin; and 3) New York City (NYC). We evaluated the impact of vaccination considering the proportion of the population vaccinated, the probability that a vaccinated individual gains immunity, vaccination capacity, and adherence to nonpharmaceutical interventions. We estimated the timing of pandemic control, defined as the date after which only a small number of new cases occur. Results The timing of pandemic control depends strongly on vaccination coverage, effectiveness, and adherence to nonpharmaceutical interventions. In Dane County and Milwaukee, if 50% of the population is vaccinated with a daily vaccination capacity of 0.25% of the population, vaccine effectiveness of 90%, and adherence to nonpharmaceutical interventions of 60%, controlled spread could be achieved by June 2021, compared with October 2021 in Dane County and November 2021 in Milwaukee without a vaccine. Discussion In controlling the spread of SARS-CoV-2, the impact of vaccination varies widely depending not only on effectiveness and coverage, but also on concurrent adherence to nonpharmaceutical interventions.
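The qualitative interplay the abstract describes (coverage and effectiveness determining how many agents are immune, NPI adherence scaling transmission) can be illustrated with a toy SIR-style agent-based sketch. Everything here (population size, contact structure, transmission and recovery rates) is an illustrative assumption, not the authors' calibrated model of Dane County, Milwaukee, or NYC.

```python
import random

def simulate(n=2000, coverage=0.5, effectiveness=0.9, npi_adherence=0.6,
             beta=0.3, contacts=8, recovery=0.1, days=200, seed=1):
    """Toy agent-based S/I/R epidemic; returns total agents ever infected."""
    rng = random.Random(seed)
    # An agent is immune at the start with probability coverage * effectiveness
    # (vaccinated AND the vaccine conferred immunity).
    state = ['R' if rng.random() < coverage * effectiveness else 'S'
             for _ in range(n)]
    for i in range(10):
        state[i] = 'I'  # seed infections
    total_infected = 10
    for _ in range(days):
        infected = [i for i, s in enumerate(state) if s == 'I']
        for i in infected:
            for _ in range(contacts):
                j = rng.randrange(n)
                # NPI adherence scales down the per-contact transmission risk.
                if state[j] == 'S' and rng.random() < beta * (1 - npi_adherence):
                    state[j] = 'I'
                    total_infected += 1
            if rng.random() < recovery:
                state[i] = 'R'
    return total_infected

with_vax = simulate(coverage=0.5)
no_vax = simulate(coverage=0.0)  # same seed, no vaccination
```

Even in this crude sketch, vaccinating half the population substantially shrinks the final epidemic size, mirroring the direction (though not the magnitudes) of the study's scenario comparisons.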