Blood calcification propensity was independently associated with the primary composite end point, all-cause mortality, MI, and PVE in the EVOLVE study and improved risk prediction. Prospective trials should clarify whether T50-guided therapies improve outcomes.
Calciprotein particle maturation time (T50) in serum is a novel measure of individual blood calcification propensity. To determine the clinical relevance of T50 in renal transplantation, baseline serum T50 was measured in a longitudinal cohort of 699 stable renal transplant recipients, and the associations of T50 with mortality and graft failure were analyzed over a median follow-up of 3.1 years. The predictive value of T50 for patient survival was assessed with reference to traditional (Framingham) risk factors and the calcium-phosphate product. Serum magnesium, bicarbonate, albumin, and phosphate levels were the main determinants of T50, which was independent of renal function and of dialysis vintage before transplant. During follow-up, 81 (12%) patients died, of whom 38 (47%) died from cardiovascular causes. Furthermore, 45 (6%) patients developed graft failure. In fully adjusted models, lower T50 values were independently associated with increased all-cause mortality (hazard ratio, 1.43; 95% confidence interval, 1.11 to 1.85; P=0.006 per SD decrease) and increased cardiovascular mortality (hazard ratio, 1.55; 95% confidence interval, 1.04 to 2.29; P=0.03 per SD decrease). In addition to age, sex, and eGFR, T50 improved prognostication for all-cause mortality, whereas traditional risk factors and the calcium-phosphate product did not. Lower T50 was also associated with increased graft failure risk. The associations of T50 with mortality and graft failure were confirmed in an independent replication cohort. In conclusion, reduced serum T50 was associated with increased risk of all-cause mortality, cardiovascular mortality, and graft failure and, of all tested parameters, displayed the strongest association with all-cause mortality in these transplant recipients.
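The per-SD hazard ratios and P values reported above are internally consistent, which can be checked with a standard back-calculation from a published hazard ratio and its 95% confidence interval (symmetric on the log scale, as a Cox model produces). This is a sketch of that check, not part of the original analysis; the helper function name is our own:

```python
import math

def hr_ci_to_se(hr_lo, hr_hi, z=1.96):
    """Recover the standard error of the log-hazard coefficient from the
    bounds of a reported 95% confidence interval for a hazard ratio."""
    return (math.log(hr_hi) - math.log(hr_lo)) / (2 * z)

# Reported association of T50 with all-cause mortality (per SD decrease):
beta = math.log(1.43)            # log-hazard per SD decrease in T50
se = hr_ci_to_se(1.11, 1.85)     # ~0.130
z_score = beta / se              # Wald z statistic, ~2.74
print(round(se, 3), round(z_score, 2))
```

A Wald z of about 2.74 corresponds to a two-sided P of roughly 0.006, consistent with the P value reported in the abstract.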
11β-Hydroxysteroid dehydrogenase type 1 (11β-HSD1), which catalyzes the intracellular activation of cortisone to cortisol, is currently considered a promising target for treating patients with metabolic syndrome; hence, there is considerable interest in the development of selective inhibitors. For preclinical tests of such inhibitors, the characteristics of 11β-HSD1 from the commonly used species must be known. Therefore, we determined differences in substrate affinity and inhibitor effects for 11β-HSD1 from six species. The differences in catalytic activities with cortisone and 11-dehydrocorticosterone were rather modest. Human, hamster, and guinea-pig 11β-HSD1 displayed the highest catalytic efficiency in the oxoreduction of cortisone, while mouse and rat showed intermediate and dog the lowest activity. Murine 11β-HSD1 most efficiently reduced 11-dehydrocorticosterone, while the enzyme from dog showed lower activity than those from the other species. 7-Ketocholesterol (7KC) was stereospecifically converted to 7β-hydroxycholesterol by recombinant 11β-HSD1 from all species analyzed except hamster, which showed a slight preference for the formation of 7α-hydroxycholesterol. Importantly, guinea-pig and canine 11β-HSD1 displayed very low 7-oxoreductase activities. Furthermore, we demonstrate significant species-specific variability in the potency of various 11β-HSD1 inhibitors, including endogenous compounds, natural chemicals, and pharmaceutical compounds. The results suggest significant differences in the three-dimensional organization of the hydrophobic substrate-binding pocket of 11β-HSD1, and they emphasize that species-specific variability must be considered in the interpretation of results obtained from different animal experiments. The assessment of such differences in cell-based test systems may help to choose the appropriate animal for safety and efficacy studies of novel potential drug candidates.
Conservation scientists, national governments, and international conservation groups seek to devise and implement governance strategies that mitigate human impact on the environment. However, few studies to date have systematically investigated the performance of different systems of governance in achieving successful conservation outcomes. Here, we use a newly developed analytic framework to conduct analyses of a suite of case studies, linking different governance strategies to standardized scores for delivering ecosystem services, achieving sustainable use of natural resources, and conserving biodiversity, at both local and international levels. Our results (i) confirm the benefits of adaptive management and (ii) reveal strong associations for the role of leadership. Our work provides a critical step toward implementing empirically justified governance strategies that are capable of improving the management of human-altered environments, with benefits for both biodiversity and people.
Background: Diuretics are among the most commonly prescribed medications and, owing to their mechanisms of action, electrolyte disorders are common side effects of their use. In the present work, we investigated the associations between diuretic use and the prevalence of electrolyte disorders on admission, as well as the impact of electrolyte disorders on patient outcome. Methods: In this cross-sectional analysis, all patients presenting between 1 January 2010 and 31 December 2011 to the emergency room (ER) of the Inselspital, University Hospital Bern, Switzerland, were included. Data on diuretic medication, baseline characteristics, and laboratory values, including electrolytes and renal function parameters, were obtained for all patients. A multivariable logistic regression model was used to assess the impact of these factors on electrolyte disorders and patient outcome. Results: A total of 8.5% of patients presenting to the ER used one diuretic, 2.5% two, and 0.4% three or four. In all, 4% had hyponatremia on admission and 12% hypernatremia. Hypokalemia was present in 11% and hyperkalemia in 4%. All forms of dysnatremia and dyskalemia were more common in patients taking diuretics. Loop diuretics were an independent risk factor for hypernatremia and hypokalemia, while thiazide diuretics were associated with the presence of hyponatremia and hypokalemia. In the Cox regression model, all forms of dysnatremia and dyskalemia were independent risk factors for in-hospital mortality. Conclusions: Existing diuretic treatment on admission to the ER was associated with an increased prevalence of electrolyte disorders. Diuretic therapy itself and disorders of serum sodium and potassium were risk factors for an adverse outcome.