Mitochondrial DNA (mtDNA) encompasses two classes of functionally important sequence variants: recent pathogenic mutations and ancient adaptive polymorphisms. To evaluate both classes of single nucleotide variants (SNVs) rapidly and cheaply, we have developed an integrated system in which mtDNA SNVs are analyzed by multiplex primer extension using the SNaPshot system. A multiplex PCR amplification strategy amplifies the entire mtDNA, a computer program identifies optimal extension primers, and a complete global haplotyping scheme is also proposed. The system genotypes SNVs on multiplexed mtDNA PCR products or directly from enriched mtDNA samples and can quantify heteroplasmic variants down to 0.8% using a standard curve. With this system, we have developed assays for the common pathogenic mutations in four multiplex panels: two genotype the 13 most common pathogenic mtDNA mutations, and two genotype the 10 most common Leber Hereditary Optic Neuropathy mutations along with haplogroups J and T. We use a hierarchical system of 140 SNVs to delineate the major global mtDNA haplogroups based on a global phylogenetic tree of coding-region polymorphisms. This system should permit rapid and inexpensive genotyping of pathogenic and lineage-specific mtDNA SNVs by clinical and research laboratories.
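Quantifying heteroplasmy against a standard curve amounts to fitting a line through signals measured on standards of known heteroplasmy fraction and then inverting that line for unknown samples. A minimal pure-Python sketch of the idea, assuming a linear relationship between heteroplasmy fraction and a measured peak-height ratio; the standards and signal values below are hypothetical, not the paper's calibration data:

```python
def fit_standard_curve(known_fracs, signals):
    """Ordinary least-squares fit of signal = a * frac + b over the
    dilution standards (known_fracs in %, signals as peak-height ratios)."""
    n = len(known_fracs)
    mx = sum(known_fracs) / n
    my = sum(signals) / n
    sxx = sum((x - mx) ** 2 for x in known_fracs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(known_fracs, signals))
    a = sxy / sxx          # slope: signal change per % heteroplasmy
    b = my - a * mx        # intercept: background signal at 0%
    return a, b

def estimate_heteroplasmy(signal, a, b):
    """Invert the fitted line to read an unknown sample off the curve."""
    return (signal - b) / a

# Hypothetical dilution standards (0-50% heteroplasmy) and their signals.
standards = [0.0, 5.0, 10.0, 25.0, 50.0]
measured = [0.010, 0.110, 0.210, 0.510, 1.010]
a, b = fit_standard_curve(standards, measured)
unknown_pct = estimate_heteroplasmy(0.026, a, b)  # near the 0.8% floor
```

In practice the detection floor (0.8% here) is set by where the sample signal becomes indistinguishable from the 0% standard's background, so replicate standards near zero matter most.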
ClinicalTrials.gov, #NCT00223171), the aim of this analysis was to determine whether the testosterone level (TL) measured at the end of androgen deprivation therapy (ADT) could predict outcomes in intermediate-risk (IR) and high-risk (HR) prostate cancer patients treated with ADT and radiotherapy (RT). Materials/Methods: Between October 2000 and September 2010, 1030 patients with IR (400) and HR (630) prostate cancer were treated with a combination of RT and ADT. IR patients received short-term ADT with bicalutamide and goserelin for 6 months; HR patients received either 36 or 18 months of ADT (bicalutamide during the first month plus goserelin for 36 vs. 18 months). TL was available in 796 patients (77%) at the end of ADT. Castration was defined as a testosterone level below 1.7 nmol/L (<50 ng/dL). Patients were distributed into two castrate strata (lower stratum: testosterone <0.7 nmol/L; upper stratum: testosterone between 0.7 and 1.7 nmol/L) or a non-castrate stratum (testosterone >1.7 nmol/L), and outcomes were compared among the 3 groups. Patients' characteristics were compared with ANOVA and Fisher's exact test. Rates of biochemical failure, prostate cancer progression, resistance to castration, and death were estimated with the Kaplan-Meier method. Multivariate Cox regressions were used to model the effect of testosterone, with risk level (IR vs. HR) and unbalanced patient characteristics included as covariables. Results: Patients' characteristics (age, Zubrod score, PSA, stage, ADT duration) were well balanced among the 3 groups. At the end of ADT, 50.8% (404/796) presented testosterone levels <0.7 nmol/L and 40.3% (321/796) levels between 0.7 and 1.7 nmol/L, for a total of 91.1% (725/796) reaching castrate levels <1.7 nmol/L. In 8.9% of patients (71/796), a castrate testosterone level was not achieved. Duration of ADT and disease risk stratification had no bearing on testosterone castrate level at the end of ADT.
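The outcome rates above were estimated with the Kaplan-Meier method. For readers unfamiliar with it, a minimal pure-Python sketch of the underlying product-limit estimator follows; the follow-up times and event indicators in the test are illustrative toy data, not values from this trial:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. biochemical failure) was observed,
             0 if the patient was censored at that time
    Returns a list of (event_time, survival_probability) steps.
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = removed = 0
        # Group all patients leaving the risk set at time t.
        while i < len(pairs) and pairs[i][0] == t:
            d += pairs[i][1]
            removed += 1
            i += 1
        if d:  # survival only drops at observed event times
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= removed
    return curve
```

Censored patients reduce the risk set without dropping the curve, which is why Kaplan-Meier is preferred over naive event fractions when follow-up is incomplete.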
With a median follow-up of 9.15 years (range: 7.15-11.15), outcomes are shown below. Conclusion: At the end of ADT, the majority of patients (91.1%) achieved a castrate testosterone level <1.7 nmol/L regardless of ADT duration. There was no significant difference in outcomes between patients achieving a TL <0.7 nmol/L and those with a TL between 0.7 and 1.7 nmol/L. Although not statistically significant, likely because of the small number of patients, a non-castrate testosterone level was associated with worse outcomes. Our results do not support the idea that better outcomes are associated with testosterone below 0.7 nmol/L.
(DT), number of trees (RF), and learning rate/tree level/splits (BT). We then developed a treatment support system to predict toxicity risk for SBRT versus CF-RT. Results: Of the 93 SBRT patients, 16 (17.2%) had Grade 2 late (LT) GU and GI toxicity, but only 3 (3%) had acute Grade 2 toxicity, so we focused on LT toxicity. Using DT, we were able to predict LT GI and GU toxicity with a sensitivity of >96%, specificity >88%, AUC >93%, accuracy >95%, PPV >99%, NPV >78%, and an F1 score of >97%. There was little overlap between the biomarkers predicting LT GI versus GU toxicity for SBRT, and little overlap between the biomarkers predicting LT GI or GU toxicity for SBRT versus CF-RT. Based on a patient's biomarker profile, we performed toxicity prediction for each regimen (SBRT versus CF-RT) in parallel and assessed the likelihood of toxicity associated with each option. We could stratify patients into 3 subgroups: a subcohort whose toxicity outcome (either positive or negative) is indifferent to the treatment regimen (~85%), a cohort where SBRT is safer than CF-RT (~5%), and a cohort where CF-RT is safer than SBRT (~10%). Conclusion: We have identified a microRNA-based germline biomarker signature that can predict LT GI and GU toxicity to SBRT, which differs from the signature predicting LT toxicity to CF-RT, and developed a classifier that can stratify patients based on toxicity risk, possibly guiding treatment selection. Ongoing work is being done to further validate these findings prospectively.
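The reported sensitivity, specificity, PPV, NPV, accuracy, and F1 score are all derived from the classifier's confusion matrix. A brief sketch of how each is computed; the counts used in the test are illustrative, not the study's data:

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts:
    tp/fp/fn/tn = true positives, false positives, false negatives, true
    negatives (here, 'positive' would mean Grade 2 late toxicity)."""
    sensitivity = tp / (tp + fn)          # recall: toxic patients caught
    specificity = tn / (tn + fp)          # non-toxic patients cleared
    ppv = tp / (tp + fp)                  # precision of a positive call
    npv = tn / (tn + fn)                  # reliability of a negative call
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f1 = 2 * ppv * sensitivity / (ppv + sensitivity)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "ppv": ppv,
        "npv": npv,
        "accuracy": accuracy,
        "f1": f1,
    }
```

Note that AUC is the one figure above that cannot be read off a single confusion matrix; it requires sweeping the classifier's decision threshold and integrating the resulting ROC curve.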
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.