Verification of candidate biomarkers relies on specific, quantitative assays optimized for selective detection of target proteins, and is increasingly viewed as a critical step in the discovery pipeline that bridges unbiased biomarker discovery to preclinical validation. Although individual laboratories have demonstrated that multiple reaction monitoring (MRM) coupled with isotope dilution mass spectrometry can quantify candidate protein biomarkers in plasma, the reproducibility and transferability of these assays between laboratories have not been demonstrated. We describe a multilaboratory study, conducted by NCI-CPTAC, to assess the reproducibility, recovery, linear dynamic range, and limits of detection and quantification of multiplexed, MRM-based assays. Using common materials and standardized protocols, we demonstrate that these assays can be highly reproducible within and across laboratories and instrument platforms, and are sensitive to low µg/ml protein concentrations in unfractionated plasma. We provide data and benchmarks against which individual laboratories can compare their performance and evaluate new technologies for biomarker verification in plasma.

Proteomic technologies based on mass spectrometry (MS) have emerged as preferred components of a strategy for the discovery of diagnostic, prognostic, and therapeutic protein biomarkers. Because of the stochastic sampling of proteomes in unbiased analyses and the associated high false-discovery rate, tens to hundreds of potential biomarkers are often reported in discovery studies. The few that will ultimately show sufficient sensitivity and specificity for a given medical condition must therefore be culled from lengthy lists of candidates, a particularly challenging aspect of the biomarker-development pipeline and currently its main limiting step. In this context, it is highly desirable to verify, by more targeted quantitative methods, the levels of candidate biomarkers in body fluids, cells, tissues, or organs from healthy individuals and affected patients, in sample numbers large enough to confirm statistically relevant differences (1, 2). Verification of novel biomarkers has relied primarily on sensitive, specific, high-throughput immunoassays, whose development depends critically on the availability of suitable, well-characterized antibodies. However, antibody reagents of sufficient specificity and sensitivity to assay novel protein biomarkers in plasma are generally not available. The high cost and long development time required to generate high-quality immunoassay reagents, as well as technical limitations in multiplexing immunoassays for panels of biomarkers, are strong motivation to develop more straightforward quantitative approaches that exploit the sensitivity and molecular specificity of mass spectrometry. Recently, MRM coupled with stable isotope dilution mass spectrometry (SID-MS) has been shown to have considerable promise for direct quantification of proteins in cell lysates as well as human plasma and serum (3-…).

RESULTS
Study de…
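The quantification this abstract describes rests on the standard isotope-dilution calculation: the endogenous (light) peptide concentration is inferred from the measured light/heavy peak-area ratio and the known amount of spiked heavy-labeled internal standard. A minimal sketch of that arithmetic, with illustrative values only:

```python
def peptide_concentration(light_area: float,
                          heavy_area: float,
                          heavy_spike_fmol_per_ml: float) -> float:
    """Estimate endogenous peptide concentration by stable isotope dilution.

    The heavy-labeled internal standard co-elutes with the endogenous
    (light) peptide, so the light/heavy peak-area ratio multiplied by
    the known spike concentration estimates the endogenous level.
    """
    if heavy_area <= 0:
        raise ValueError("heavy standard peak area must be positive")
    return (light_area / heavy_area) * heavy_spike_fmol_per_ml

# Illustrative numbers only: a 50 fmol/ml heavy spike and a measured
# light/heavy ratio of 0.42 imply ~21 fmol/ml endogenous peptide.
print(peptide_concentration(light_area=4200.0,
                            heavy_area=10000.0,
                            heavy_spike_fmol_per_ml=50.0))
```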
ABSTRACT

Purpose: Early detection of ovarian cancer has great promise to improve clinical outcome.

Patients and Methods: Ninety-six serum biomarkers were analyzed in sera from healthy women and from patients with ovarian cancer, benign pelvic tumors, and breast, colorectal, and lung cancers, using multiplex xMAP bead-based immunoassays. A Metropolis algorithm with Monte Carlo simulation (MMC) was used for analysis of the data.

Results: A training set, including sera from 139 patients with early-stage ovarian cancer, 149 patients with late-stage ovarian cancer, and 1,102 healthy women, was analyzed with the MMC algorithm and cross-validation to identify an optimal biomarker panel discriminating early-stage cancer from healthy controls. The four-biomarker panel providing the highest diagnostic power, 86% sensitivity (SN) for early-stage and 93% SN for late-stage ovarian cancer at 98% specificity (SP), comprised CA-125, HE4, CEA, and VCAM-1. This model was applied to an independent blinded validation set consisting of sera from 44 patients with early-stage ovarian cancer, 124 patients with late-stage ovarian cancer, and 929 healthy women, providing unbiased estimates of 86% SN for stage I and II and 95% SN for stage III and IV disease at 98% SP. The panel was selective for ovarian cancer, showing SN of 33% for benign pelvic disease, 6% for breast cancer, 0% for colorectal cancer, and 36% for lung cancer.

Conclusion: A panel of CA-125, HE4, CEA, and VCAM-1, after additional validation, could serve as an initial stage in a screening strategy for epithelial ovarian cancer.
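The MMC panel search can be pictured as a generic Metropolis-style subset search. The sketch below assumes a caller-supplied score_fn (e.g., cross-validated sensitivity at 98% specificity); the swap proposal, temperature, and step count are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def metropolis_panel_search(markers, score_fn, panel_size=4,
                            n_steps=5000, temperature=0.02, seed=0):
    """Metropolis-style stochastic search for a biomarker panel.

    score_fn(panel) returns the figure of merit to maximize, e.g.
    cross-validated sensitivity at fixed 98% specificity.  Each step
    proposes swapping one marker in the panel for one outside it; the
    swap is always accepted if it improves the score, and accepted
    with Boltzmann probability exp(delta / T) otherwise.
    """
    rng = random.Random(seed)
    panel = rng.sample(markers, panel_size)
    current_score = score_fn(panel)
    best, best_score = list(panel), current_score
    for _ in range(n_steps):
        proposal = list(panel)
        out = rng.randrange(panel_size)
        candidates = [m for m in markers if m not in panel]
        proposal[out] = rng.choice(candidates)
        s = score_fn(proposal)
        if s >= current_score or rng.random() < math.exp((s - current_score) / temperature):
            panel, current_score = proposal, s
            if s > best_score:
                best, best_score = list(proposal), s
    return best, best_score
```

With 96 markers there are roughly 3.2 million four-marker panels, so a stochastic search of this kind trades exhaustive enumeration for a tractable number of scored proposals.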
There is evidence that warming leads to greater evapotranspiration and surface drying, thus contributing to increasing intensity and duration of drought and implying that mitigation would reduce water stresses. However, understanding the overall impact of climate change mitigation on water resources requires accounting for the second part of the equation: the impact of mitigation-induced changes in water demands from human activities. By using integrated, high-resolution models of human and natural system processes to understand potential synergies and/or constraints within the climate-energy-water nexus, we show that in the United States, over the course of the 21st century and under one set of consistent socioeconomics, the reductions in water stress from slower rates of climate change resulting from emissions mitigation are overwhelmed by the increased water stress from the emissions mitigation itself. The finding that the human dimension outpaces the benefits of mitigating climate change contradicts the general perception that climate change mitigation improves water conditions. This research shows the potential for unintended and negative consequences of climate change mitigation.

Keywords: climate change | mitigation | water deficit | Earth system model | integrated assessment

Earlier work addressing the impact of emissions mitigation on water supply and demand has produced conflicting results (1-5). The reasons are complex. Earth system models (ESMs) and climate models are generally in agreement that a lack of climate change mitigation would lead to greater warming and intensification of the global water cycle (6), increasing precipitation intensity (7), changes in runoff that amplify existing wet/dry patterns (8), and increasing flood risk (9) as well as aridity (10). However, changes in seasonal patterns and the increasing probability of extreme events may complicate the general patterns of wet/dry trends (11). Additionally, changes in water demands caused by socioeconomic drivers alone may surpass the effects of climate change on water availability (12). Several studies (1-5) have assessed the consequences of mitigation on some measure of water deficit. Each study used its own integrated assessment and global hydrologic models, generally with varying underlying socioeconomic and technological assumptions, climate inputs, measures of water deficit, and a wide range of spatial and temporal resolutions. A key distinction of the study presented here is its coupling of regional ESMs and human systems models at finer spatial and/or temporal resolutions than previous efforts. Extending the work of Hejazi et al. (4) and Voisin et al. (13), integrated regional models of human and natural systems with enhanced capabilities are used at high temporal and spatial resolution while maintaining consistency with regional and global climate and economic modeling. In this modeling framework, a regional integrated assessment model (IAM) simulates water demand for both irrigation and nonirrigation sectors (a resu…
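The comparison hinges on some quantitative measure of water stress. One common choice, not necessarily the deficit measure used in this study, is the demand-to-renewable-supply ratio; a minimal sketch with purely illustrative numbers:

```python
def water_stress_index(demand_m3: float, supply_m3: float) -> float:
    """Demand-to-renewable-supply ratio for a basin and period.

    Values above roughly 0.4 are conventionally read as high water
    stress.  This simple index is one common metric, assumed here for
    illustration rather than taken from the study.
    """
    if supply_m3 <= 0:
        return float("inf")
    return demand_m3 / supply_m3

# Mitigation can ease supply-side stress (less warming) while raising
# demand (e.g., water for bioenergy crops); the net sign of the change
# is what the study evaluates.  Numbers below are illustrative.
baseline = water_stress_index(demand_m3=3.2e9, supply_m3=8.0e9)
mitigated = water_stress_index(demand_m3=4.1e9, supply_m3=8.4e9)
print(baseline, mitigated)  # demand growth can outweigh supply gains
```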
ABSTRACT

This paper presents the results of numerous commercial and residential building simulations, with the purpose of examining the impact of climate change on peak and annual building energy consumption over the portion of the EIC (Eastern Interconnection) located in the United States. The climate change scenario considered includes changes in mean climate characteristics as well as changes in the frequency and duration of intense weather events. Simulations were performed using the BEND (Building ENergy Demand) model, a detailed building analysis platform utilizing EnergyPlus™ as the simulation engine. Over 26,000 building configurations of different types, sizes, vintages, and characteristics, representing the population of buildings within the EIC, are modeled across the three EIC time zones using the future climate from 100 target-region locations, resulting in nearly 180,000 spatially relevant simulated demand profiles for three years selected to be representative of the general climate trend over the century. This approach provides a heretofore unprecedented level of specificity across multiple dimensions, including spatial, temporal, and building characteristics, and enables detailed hourly impact studies of building adaptation and mitigation strategies on energy use and electricity peak demand within the context of the entire grid and economy.
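One way to picture the aggregation step is weighting each simulated configuration's hourly load by the number of real buildings it represents in the stock. The sketch below is an assumption about the shape of that calculation, not BEND's actual aggregation logic:

```python
import numpy as np

def regional_demand_profile(building_loads: np.ndarray,
                            stock_weights: np.ndarray) -> np.ndarray:
    """Aggregate per-building hourly loads (kW) into a regional profile.

    building_loads: shape (n_configs, 8760), one simulated year of
    hourly load per building configuration.  stock_weights: shape
    (n_configs,), how many real buildings each configuration stands
    in for.  Returns the weighted hourly total, shape (8760,).
    """
    return stock_weights @ building_loads

# Illustrative: 3 configurations standing in for a small stock.
loads = np.random.default_rng(0).uniform(5, 50, size=(3, 8760))
weights = np.array([1200.0, 800.0, 450.0])
profile = regional_demand_profile(loads, weights)
print(profile.max())  # regional peak demand (kW) for impact studies
```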
This paper describes a computerized clavicle identification system primarily designed to resolve the identities of unaccounted-for U.S. soldiers who fought in the Korean War. Elliptical Fourier analysis is used to quantify clavicle outline shape from skeletons and postero-anterior antemortem chest radiographs, and to rank individuals in terms of metric distance. As in leading fingerprint identification systems, shortlists of the top-matching candidates are extracted for subsequent human visual assessment. Two independent tests of the computerized system, using 17 field-recovered skeletons and 409 chest radiographs, demonstrate that true-positive matches are captured within the top 5% of the sample 75% of the time. These results are outstanding given the eroded state of some field-recovered skeletons and the faintness of the 1950s photofluorographs. These methods enhance the capability to resolve several hundred cold cases for which little circumstantial information exists and current DNA and dental-record technologies cannot be applied.
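The shape-matching step can be sketched with a simplified Fourier outline descriptor. The code below uses the complex-contour Fourier transform as a stand-in for full elliptical Fourier analysis; the normalization choices and harmonic count are assumptions for illustration:

```python
import numpy as np

def fourier_shape_descriptor(outline_xy: np.ndarray,
                             n_harmonics: int = 20) -> np.ndarray:
    """Translation-, scale-, and rotation-tolerant outline descriptor.

    outline_xy: (n_points, 2) closed outline resampled to equal
    arc-length spacing.  Uses the complex-contour FFT as a simplified
    stand-in for elliptical Fourier analysis.
    """
    z = outline_xy[:, 0] + 1j * outline_xy[:, 1]
    coeffs = np.fft.fft(z - z.mean())            # subtract centroid: translation
    mags = np.abs(coeffs[1:n_harmonics + 1])     # drop phase: rotation/start point
    return mags / mags[0]                        # normalize: scale

def rank_candidates(query_outline, candidate_outlines, n_harmonics=20):
    """Rank antemortem radiograph outlines by distance to a skeletal outline."""
    q = fourier_shape_descriptor(query_outline, n_harmonics)
    dists = [np.linalg.norm(q - fourier_shape_descriptor(c, n_harmonics))
             for c in candidate_outlines]
    return np.argsort(dists)  # shortlist: smallest metric distance first
```

As in the described system, only the top-ranked fraction of candidates would then go to a human examiner for visual confirmation.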
This paper investigates an ensemble-based technique called Bayesian Model Averaging (BMA) to improve the performance of protein amino acid pKa predictions. Structure-based pKa calculations play an important role in the mechanistic interpretation of protein structure and are also used to determine a wide range of protein properties. A diverse set of methods currently exists for pKa prediction, ranging from empirical statistical models to ab initio quantum mechanical approaches. However, each of these methods is based on a set of conceptual assumptions that can affect the model's accuracy and generalizability for pKa prediction in complicated biomolecular systems. We use BMA to combine eleven diverse prediction methods that each estimate pKa values of amino acids in staphylococcal nuclease. These methods are based on work conducted for the pKa Cooperative, and the pKa measurements are based on experimental work conducted by the García-Moreno lab. Our cross-validation study demonstrates that the aggregated estimate obtained from BMA outperforms all individual prediction methods, with improvements ranging from 45% to 73% over the other method classes. The study also compares BMA's predictive performance to that of other ensemble-based techniques and demonstrates that BMA can outperform these approaches, with improvements ranging from 27% to 60%. This work illustrates a new possible mechanism for improving the accuracy of pKa prediction and lays the foundation for future work on aggregate models that balance computational cost with prediction accuracy.
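A minimal sketch of BMA over point predictors: each method is weighted by an approximate posterior probability, here estimated from its training-set residuals under a Gaussian error model with a uniform prior over models. The error scale sigma and the weighting details are assumptions, not the paper's exact formulation:

```python
import numpy as np

def bma_weights(train_preds: np.ndarray, train_truth: np.ndarray,
                sigma: float = 0.5) -> np.ndarray:
    """Posterior model weights from training-set fit.

    train_preds: (n_models, n_train) pKa predictions on training
    residues; train_truth: (n_train,) measured pKa values.  Weights
    are proportional to each model's Gaussian likelihood on its
    residuals, assuming a uniform prior over the models.
    """
    resid = train_preds - train_truth              # broadcast over models
    log_lik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
    log_lik -= log_lik.max()                       # numerical stability
    w = np.exp(log_lik)
    return w / w.sum()

def bma_predict(weights: np.ndarray, test_preds: np.ndarray) -> np.ndarray:
    """Posterior-mean pKa estimate: weighted average across methods."""
    return weights @ test_preds                    # (n_test,)
```

Under this scheme a method that fits the training residues poorly is down-weighted exponentially, so the aggregate leans on whichever conceptual assumptions happen to hold for the system at hand.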
Numerical ensemble forecasting is a powerful tool that drives many risk analysis efforts and decision-making tasks. These ensembles are composed of individual simulations that each uniquely model a possible outcome for a common event of interest: e.g., the direction and force of a hurricane, or the path of travel and mortality rate of a pandemic. This paper presents a new visual strategy to help quantify and characterize a numerical ensemble's predictive uncertainty: i.e., the ability of ensemble constituents to accurately and consistently predict an event of interest based on ground-truth observations. Our strategy employs a Bayesian framework to first construct a statistical aggregate from the ensemble. We extend the information obtained from the aggregate with a visualization strategy that characterizes predictive uncertainty at two levels: a global level, which assesses the ensemble as a whole, and a local level, which examines each of the ensemble's constituents. Through this approach, modelers can better assess the predictive strengths and weaknesses of the ensemble as a whole, as well as of individual models. We apply our method to two datasets to demonstrate its broad applicability.
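The two-level summary can be sketched as follows, taking a simple ensemble mean and spread as the statistical aggregate (the paper's Bayesian aggregate is richer); the array shapes and RMSE scoring are assumptions for illustration:

```python
import numpy as np

def ensemble_summary(member_runs: np.ndarray, observations: np.ndarray) -> dict:
    """Two-level predictive-uncertainty summary of a numerical ensemble.

    member_runs: (n_members, n_times) simulated values for one event;
    observations: (n_times,) ground truth.  Returns a global score for
    the aggregate plus a local error for each constituent model.
    """
    mean = member_runs.mean(axis=0)                 # aggregate forecast
    spread = member_runs.std(axis=0)                # aggregate uncertainty
    global_rmse = np.sqrt(np.mean((mean - observations) ** 2))
    local_rmse = np.sqrt(np.mean((member_runs - observations) ** 2, axis=1))
    return {"global_rmse": global_rmse,             # ensemble as a whole
            "local_rmse": local_rmse,               # each constituent
            "mean": mean, "spread": spread}

# Illustrative use: 10 members, 100 time steps.
rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 6, 100))
runs = truth + rng.normal(0, 0.2, size=(10, 100))
summary = ensemble_summary(runs, truth)
print(summary["global_rmse"], summary["local_rmse"].argmax())
```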