Traumatic brain injury (TBI)-related hypopituitarism has been recognized as a clinical entity for more than a century, with the first case reported in 1918. During the 20th century, however, hypopituitarism was considered only a rare sequela of TBI. Since 2000, several studies have strongly suggested that TBI-mediated pituitary hormone deficiency may be more frequent than previously thought. Growth hormone deficiency (GHD) is the most common abnormality, followed by hypogonadism, hypothyroidism, hypocortisolism, and diabetes insipidus. The pathophysiological mechanisms underlying pituitary damage in TBI patients include a primary injury, which may cause direct trauma to the hypothalamus or pituitary gland, and secondary injuries, which mainly result from a complex and ongoing cascade of specific molecular and biochemical events. The available data underscore the importance of GHD after TBI and its role in promoting neurocognitive and behavioral deficits. The poor outcomes seen with long-standing GHD in post-TBI patients could be improved by GH treatment, but literature data on the possible beneficial effects of GH replacement therapy in post-TBI GHD patients remain scarce and fragmented. More studies are needed to further characterize this clinical syndrome with the purpose of establishing appropriate standards of care. The purpose of this review is to summarize the current state of knowledge about post-traumatic GH deficiency.
Introduction: The diagnosis of growth hormone deficiency (GHD) in adults is based on a reduced GH response to provocative tests, such as the insulin tolerance test (ITT) and the GH-releasing hormone (GHRH) + arginine (ARG) test. However, the cut-off limits of peak GH response established in lean subjects are not reliable in obese patients; this is noteworthy since adult GHD is often associated with obesity. To date, there are no ITT cut-offs related to body mass index (BMI). Objective: We aimed to evaluate the diagnostic cut-offs of GH response to the ITT as a function of BMI. Methods: The GH response to the ITT was studied in 106 patients with a history of hypothalamic-pituitary disease (mean age 48.2 ± 12.4 years; mean BMI 26.8 ± 6.1 kg/m2). Patients were divided into lean, overweight, and obese groups according to their BMI. The lack of GH response to the GHRH + ARG test was considered the gold standard for the diagnosis of GHD. The best GH cut-off in the ITT, defined as the one with the best sensitivity (SE) and specificity (SP), was identified using receiver operating characteristic (ROC) curve analysis. Results: The best GH cut-off in the ITT was 3.5 μg/L in lean subjects (SE 82.1%; SP 85.7%), 1.3 μg/L in overweight subjects (SE 74.1%; SP 85.7%), and 2.2 μg/L in obese subjects (SE 90.0%; SP 50.0%). The diagnostic accuracy was 97.2%, 76.5%, and 76.7%, respectively. Conclusions: Our data show that the ITT represents a reliable diagnostic tool for the diagnosis of adult GHD in lean subjects if an appropriate cut-off limit is assumed. Overweight and obesity strongly reduce the GH response to the ITT, calling for BMI-related GH cut-off limits, and lower the diagnostic reliability of the test.
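The ROC-based selection of the "best" cut-off described above (the value that jointly maximizes sensitivity and specificity) can be illustrated with a minimal sketch. The data and function below are invented for demonstration only; they are not the study's dataset or code, and Youden's index J = SE + SP − 1 is assumed here as the optimality criterion, a common choice in ROC cut-off analysis.

```python
# Illustrative sketch with synthetic data: pick the peak-GH cut-off that
# maximizes Youden's index J = sensitivity + specificity - 1.
# A peak GH at or below the cut-off flags the subject as GH-deficient.

def best_cutoff(values, is_deficient):
    """Scan candidate cut-offs and return (J, cutoff, sensitivity, specificity)."""
    best = None
    for c in sorted(set(values)):
        tp = sum(1 for v, d in zip(values, is_deficient) if d and v <= c)
        fn = sum(1 for v, d in zip(values, is_deficient) if d and v > c)
        tn = sum(1 for v, d in zip(values, is_deficient) if not d and v > c)
        fp = sum(1 for v, d in zip(values, is_deficient) if not d and v <= c)
        se = tp / (tp + fn)          # sensitivity
        sp = tn / (tn + fp)          # specificity
        j = se + sp - 1              # Youden's index
        if best is None or j > best[0]:
            best = (j, c, se, sp)
    return best

# Hypothetical peak GH responses (ug/L) and true GHD status (True = deficient)
gh_peaks = [0.8, 1.5, 2.0, 3.0, 4.5, 6.0, 7.2, 9.1]
ghd = [True, True, True, True, False, False, False, False]

j, cutoff, se, sp = best_cutoff(gh_peaks, ghd)
```

In practice, libraries such as scikit-learn compute the full ROC curve directly; the brute-force scan above simply makes the SE/SP trade-off at each candidate cut-off explicit.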
Introduction: Impulse control disorders (ICDs) have been described as a side effect of dopamine agonists (DAs) in neurological as well as endocrine conditions. Few studies have evaluated the neuropsychological effects of DAs in hyperprolactinemic patients, and these have reported a relationship between DAs and ICDs. Our objective was to screen for ICD symptoms in individuals with DA-treated endocrine conditions. Materials and methods: A cross-sectional analysis was conducted on 132 patients with pituitary disorders treated with DAs (DA exposed) and 58 patients with pituitary disorders and no history of DA exposure (non-DA exposed). Participants completed the full version of the Questionnaire for Impulsive-Compulsive Disorders in Parkinson's disease (QUIP). Results: Compared with the non-DA-exposed group, a higher proportion of DA-exposed patients tested positive on the QUIP for symptoms of any ICD or related behavior (52% vs. 31%, p < 0.01), any ICD (46% vs. 24%, p < 0.01), any related behavior (31% vs. 17%, p < 0.05), compulsive sexual behavior (27% vs. 14%, p < 0.04), and punding (20% vs. 7%, p < 0.02). On univariate analysis, DA treatment was associated with a two- to threefold increased risk of any ICD or related behavior [odds ratio (OR) 2.43] and any ICD (OR 2.70). In multivariate analysis, independent risk factors for any ICD or related behavior were DA use (adjusted OR 2.22) and age (adjusted OR 6.76). Male gender was predictive of the risk of hypersexuality (adjusted OR 3.82). Discussion: Despite the limitations of the QUIP, a clear signal of increased ICD risk emerges in individuals with DA-treated pituitary disorders. Our data add to the growing evidence of DA-induced ICDs in endocrine conditions.
Introduction: According to guidelines, a morning serum cortisol level <83 nmol/L is diagnostic for central adrenal insufficiency (CAI), a value >414 nmol/L excludes CAI, and values between 83 and 414 nmol/L require stimulation tests. However, there are currently no reliable data on morning serum cortisol for predicting the cortisol response to the insulin tolerance test (ITT). Objective: Using receiver operating characteristic (ROC) curve analysis, the purpose of this study was to detect the morning serum cortisol cut-offs with a specificity (SP) or a sensitivity (SE) above 95% that identify those patients who do not need to be tested with the ITT. Methods: We included 141 adult patients (83 males) aged 42.7 ± 12.3 years (mean ± SD). Based on the serum cortisol response to the ITT, patients were divided into two groups: subjects with CAI (peak serum cortisol <500 nmol/L; 65 patients) and subjects with preserved adrenocortical function (peak cortisol >500 nmol/L; 76 patients). Results: The best morning cortisol cut-off, in terms of SE (87.7%) and SP (46.1%), was ≤323.3 nmol/L. The morning serum cortisol cut-off that best predicted a deficient response to the ITT was ≤126.4 nmol/L (SE 13.8%, SP 98.7%). The morning serum cortisol cut-off that best predicted a normal response to the ITT was >444.7 nmol/L (SE 96.9%, SP 14.5%). Conclusions: This is the first study to identify morning serum cortisol cut-offs that best predict the response to the ITT, with the aim of simplifying the diagnostic process in patients with suspected CAI. A new diagnostic flow chart for CAI is proposed.
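The two-threshold flow chart implied by these results can be sketched as a simple triage rule built from the cut-offs reported above (≤126.4 and >444.7 nmol/L). This is an illustrative sketch of the decision logic only, not clinical guidance; the function name is an assumption, not part of the study.

```python
# Sketch of the proposed triage logic using the cut-offs reported in the
# abstract: <=126.4 nmol/L predicts a deficient ITT response (SP 98.7%),
# >444.7 nmol/L predicts a normal one (SE 96.9%), and intermediate values
# still require the stimulation test.

def triage_morning_cortisol(cortisol_nmol_l):
    """Return a triage label for a morning serum cortisol value (nmol/L)."""
    if cortisol_nmol_l <= 126.4:
        return "CAI likely: deficient ITT response predicted"
    if cortisol_nmol_l > 444.7:
        return "CAI unlikely: normal ITT response predicted"
    return "indeterminate: proceed to ITT"

print(triage_morning_cortisol(100))   # CAI likely: deficient ITT response predicted
print(triage_morning_cortisol(300))   # indeterminate: proceed to ITT
print(triage_morning_cortisol(500))   # CAI unlikely: normal ITT response predicted
```

The point of such a rule is that only patients in the intermediate band undergo the ITT, which is burdensome and contraindicated in some groups.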