The concept of ‘successful aging’ has long intrigued the scientific community. Despite this long-standing interest, reaching a consensus definition has proven difficult, owing to the inherent challenge of defining such a complex, multidimensional phenomenon. The lack of a clear set of defining characteristics for the construct of successful aging has made comparison of findings across studies difficult and has limited advances in aging research. Consensus on markers of successful aging is furthest developed in the domain of physical functioning. For example, walking speed appears to be an excellent surrogate marker of overall health and predicts the maintenance of physical independence, a cornerstone of successful aging. The purpose of the present article is to provide an overview and discussion of specific health conditions, behavioral factors, and biological mechanisms that mark declining mobility and physical function, along with promising interventions to counter these effects. With life expectancy continuing to increase in the United States and other developed countries, there is an increasing public health focus on maintaining physical independence among older adults.
Objectives Delirium’s adverse effect on long-term mortality in older hospitalized patients is well documented, but its effect in older emergency department (ED) patients remains unclear. Similarly, the consequences of delirium for nursing home patients seen in the ED are unknown. We therefore sought to determine whether delirium in the ED was independently associated with 6-month mortality in older patients and whether this relationship was modified by nursing home status. Methods This prospective cohort study was conducted at a tertiary care, academic ED using convenience sampling and included English-speaking patients aged 65 years or older who had been in the ED for less than 12 hours at the time of enrollment. Patients were excluded if they refused consent, were previously enrolled, were unable to follow simple commands at baseline, were comatose, or had incomplete data. The Confusion Assessment Method for the Intensive Care Unit (CAM-ICU), administered by trained research assistants, was used to determine delirium status. Cox proportional hazards regression was performed to determine whether delirium in the ED was independently associated with 6-month mortality after adjusting for age, comorbidity burden, severity of illness, dementia, functional dependence, and nursing home residence. To test whether the effect of delirium in the ED on 6-month mortality was modified by nursing home residence, an interaction term (delirium × nursing home) was incorporated into the multivariable model. Hazard ratios (HRs) with 95% confidence intervals (CIs) are reported. Results Of the 628 patients enrolled, 108 (17.2%) were delirious in the ED and 58 (9.2%) were from a nursing home. For the entire cohort, the 6-month mortality rate was higher in the delirious group than in the non-delirious group (37.0% versus 14.3%).
Delirium was an independent predictor of increased 6-month mortality (HR = 1.72, 95% CI: 1.04–2.86) after adjusting for age, comorbidity burden, severity of illness, dementia, functional dependence, and nursing home residence. The delirium × nursing home interaction was non-significant (p = 0.86), indicating that place of residence did not modify the relationship between delirium in the ED and 6-month mortality. Conclusion Delirium in older ED patients is an independent predictor of increased 6-month mortality, and this relationship appears to hold regardless of nursing home status.
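As an illustration of how an interaction term enters such a model, the sketch below computes stratum-specific hazard ratios from the linear predictor of a Cox model. The delirium coefficient is set to ln(1.72) ≈ 0.54 to match the reported HR; the nursing home and interaction coefficients are purely hypothetical.

```python
import math

# Coefficients on the log-hazard scale. Only the delirium value is tied to
# the reported HR of 1.72; the other two are hypothetical for illustration.
beta = {"delirium": 0.54, "nursing_home": 0.30, "delirium_x_nursing_home": 0.02}

def linear_predictor(delirium, nursing_home):
    """Log-hazard contribution of the delirium and nursing-home terms."""
    return (beta["delirium"] * delirium
            + beta["nursing_home"] * nursing_home
            + beta["delirium_x_nursing_home"] * delirium * nursing_home)

# Hazard ratio for delirium within each residence stratum
hr_community = math.exp(linear_predictor(1, 0) - linear_predictor(0, 0))
hr_nursing = math.exp(linear_predictor(1, 1) - linear_predictor(0, 1))
```

Because the interaction coefficient is near zero, the two stratum-specific hazard ratios are nearly identical, which is exactly what a non-significant interaction (p = 0.86) implies.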
Background Elderly patients admitted to intensive care units (ICUs) are at risk of receiving potentially inappropriate medications (PIMs) and actually inappropriate medications (AIMs). Objectives To determine the types of PIMs and AIMs, which PIMs are most likely to be considered AIMs, and risk factors for PIMs and AIMs at hospital discharge in elderly ICU survivors. Design Prospective cohort study. Setting Tertiary care, academic medical center. Participants 120 patients ≥ 60 years old who survived an ICU hospitalization. Measurements PIMs were defined according to published criteria; AIMs were adjudicated by a multidisciplinary panel. Medication lists were abstracted at pre-admission, ward admission, ICU admission, ICU discharge, and hospital discharge. Poisson regression was used to examine independent risk factors for PIMs and AIMs at hospital discharge. Results Of 250 PIMs prescribed at discharge, the most common were opioids (28%), anticholinergics (24%), antidepressants (12%), and drugs causing orthostasis (8%). The three most common AIMs were anticholinergics (37%), non-benzodiazepine hypnotics (14%), and opioids (12%). Overall, 36% of discharge PIMs were classified as AIMs, but the percentage varied by drug type. Whereas only 16% of opioids, 23% of antidepressants, and 10% of drugs causing orthostasis were classified as AIMs, 55% of anticholinergics, 71% of atypical antipsychotics, 67% of non-benzodiazepine hypnotics and benzodiazepines, and 100% of muscle relaxants were deemed AIMs. The majority of PIMs and AIMs were first prescribed in the ICU. Pre-admission PIMs, discharge to somewhere other than home, and discharge from a surgical service predicted the number of discharge PIMs, but none of these factors predicted AIMs at discharge. Conclusions Certain types of PIMs, which are commonly initiated in the ICU, are more frequently considered inappropriate upon clinical review.
Efforts to reduce AIMs in elderly ICU survivors should target these specific classes of medications.
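A minimal sketch of how a Poisson model such as the one above relates predictors to an expected count of discharge PIMs. All coefficient values here are hypothetical, chosen only to illustrate the log-link interpretation: each coefficient exponentiates to a multiplicative rate ratio on the expected count.

```python
import math

# Hypothetical coefficients for a Poisson model of discharge PIM counts
# (illustrative only; not estimates from the study)
coef = {"intercept": -0.20, "preadmission_pims": 0.15,
        "nonhome_discharge": 0.30, "surgical_service": 0.25}

def expected_pim_count(preadmission_pims, nonhome_discharge, surgical_service):
    """Expected number of discharge PIMs under the log link."""
    log_rate = (coef["intercept"]
                + coef["preadmission_pims"] * preadmission_pims
                + coef["nonhome_discharge"] * nonhome_discharge
                + coef["surgical_service"] * surgical_service)
    return math.exp(log_rate)

# Rate ratio for one additional pre-admission PIM: exp(coefficient)
rate_ratio = expected_pim_count(1, 0, 0) / expected_pim_count(0, 0, 0)
```

The rate ratio for a predictor is simply exp of its coefficient, which is why Poisson results are typically reported as multiplicative effects on the expected count.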
IMPORTANCE Calcium channel blockers, specifically dihydropyridine calcium channel blockers (DH CCBs, eg, amlodipine), may cause lower-extremity edema. Anecdotal reports suggest this may result in a prescribing cascade, in which DH CCB-induced edema is treated with loop diuretics. OBJECTIVE To assess the magnitude and characteristics of the DH CCB prescribing cascade. DESIGN, SETTING, AND PARTICIPANTS This cohort study used prescription sequence symmetry analysis to assess loop diuretic initiation before and after initiation of DH CCBs among patients aged 20 years or older without heart failure. Data from a private insurance claims database from 2005 to 2017 were analyzed. Loop diuretic use associated with initiation of angiotensin-converting enzyme (ACE) inhibitors, angiotensin receptor blockers (ARBs), and other commonly used medications served as negative controls. Data were analyzed from March 2019 through October 2019. EXPOSURES Initiation of DH CCB or negative control medications. MAIN OUTCOMES AND MEASURES The temporality of loop diuretic initiation relative to DH CCB or negative control initiation. Secular trend-adjusted sequence ratios (aSRs) with 95% CIs were calculated using data from 360 days before and after initiation of DH CCBs. RESULTS Among 1 206 093 DH CCB initiators, 55 818 patients (4.6%) (33 100 [59.3%] aged <65 years; 32 916 [59.0%] women) had a new loop diuretic prescription 360 days before or after DH CCB initiation, resulting in an aSR of 1.87 (95% CI, 1.84-1.90). An estimated 1.44% of DH CCB initiators experienced the prescribing cascade. The aSR was disproportionately higher among DH CCB initiators who were prescribed high doses (aSR, 2.20; 95% CI, 2.13-2.27), initiated amlodipine (aSR, 1.89; 95% CI, 1.86-1.93), were men (aSR, 1.96; 95% CI, 1.91-2.01), and used fewer antihypertensive classes (aSR, 2.55; 95% CI, 2.47-2.64).
The evaluation of ACE inhibitors or ARBs as negative controls suggested that hypertension progression may have tempered the apparent incidence of the prescribing cascade (aSR for ACE inhibitors and ARBs, 1.27; 95% CI, 1.24-1.29). CONCLUSIONS AND RELEVANCE This study found excess use of loop diuretics following initiation of DH CCBs that cannot be fully explained by secular trends or hypertension progression. The prescribing cascade was more pronounced among patients initially prescribed a high dose of DH CCBs.
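The core computation of a prescription sequence symmetry analysis is simple: compare how many patients start the marker drug (here, a loop diuretic) after versus before the index drug (the DH CCB), then divide by the ratio expected under no association from secular prescribing trends. A toy sketch with entirely hypothetical counts:

```python
# Hypothetical counts of patients starting a loop diuretic within
# 360 days after vs. 360 days before DH CCB initiation
after, before = 650, 350

# Crude sequence ratio: >1 means diuretics more often follow the CCB
crude_sr = after / before

# Null-effect sequence ratio expected from secular prescribing trends
# alone (hypothetical value); dividing it out yields the adjusted SR
null_sr = 1.05
adjusted_sr = crude_sr / null_sr
```

An adjusted SR well above 1, as with the reported 1.87, indicates that loop diuretic initiation follows DH CCB initiation more often than background prescribing trends alone would predict.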
Sarcopenia is a debilitating condition involving loss of muscle mass and function that affects virtually everyone with age and can lead to frailty and ultimately disability. In growing recognition of the importance of both muscle strength and muscle mass relative to body size in contributing to functional decline, recent definitions have incorporated grip strength and a correction for body mass as key criteria defining sarcopenia. Under this new definition, a much larger population of older adults is now at risk of sarcopenia. In the present article, we reviewed the literature for studies that tested the effects of diet or exercise interventions on changes in lean mass and/or functional outcomes in individuals with sarcopenia and/or frailty, and identified 19 clinical trials. There were a few key findings. First, dietary interventions involving protein supplementation improved functional and/or strength outcomes in a few trials; however, other dietary approaches were less effective. Exercise interventions and combined diet and exercise interventions produced consistent improvements in lower-body muscle strength but had less consistent effects on walking speed and grip strength. Lifestyle interventions not involving calorie restriction generally did not induce significant changes in body composition. Few trials specifically targeted participants with sarcopenia, so more research is needed to determine the appropriate intervention approaches for this high-risk population of sarcopenic older adults.
Background: Inpatient falls, many resulting in injury or death, are a serious problem in hospital settings. Existing fall risk assessment tools, such as the Morse Fall Scale, produce a risk score based on a set of factors but do not necessarily indicate which factors are most important for predicting falls. Artificial intelligence (AI) methods provide an opportunity to improve predictive performance while also identifying the most important risk factors associated with hospital-acquired falls. Insight into these risk factors can be gained by applying classification tree, bagging, random forest, and adaptive boosting methods to electronic health record (EHR) data. Objective: The purpose of this study was to use tree-based machine learning methods to determine the most important predictors of inpatient falls, while also validating each method via cross-validation. Materials and methods: A case-control study was designed using EHR and electronic administrative data collected between January 1, 2013, and October 31, 2013, in 14 medical-surgical units. The data contained 38 predictor variables comprising patient characteristics, admission information, assessment information, clinical data, and organizational characteristics. Classification tree, bagging, random forest, and adaptive boosting methods were used to identify the most important factors for inpatient fall risk through variable importance measures. Sensitivity, specificity, and area under the ROC curve (AUROC) were computed via ten-fold cross-validation and compared via pairwise t-tests. These methods were also compared to a univariate logistic regression of the Morse Fall Scale total score. Results: In terms of AUROC, bagging (0.89), random forest (0.90), and boosting (0.89) all outperformed the Morse Fall Scale (0.86) and the classification tree (0.85), but no significant differences were found among bagging, random forest, and adaptive boosting at the 0.05 significance level.
History of falls, age, Morse Fall Scale total score, quality of gait, unit type, mental status, and number of high-risk fall risk-increasing drugs (FRIDs) were considered the most important features for predicting inpatient fall risk. Conclusions: Machine learning methods have the potential to identify the most relevant and novel factors for detecting hospitalized patients at risk of falling, improving the quality of patient care and more fully supporting healthcare provider and organizational leadership decision-making. Nurses would be able to apply enhanced judgment when caring for patients at risk of falls. Our study may also serve as a reference for developing AI-based prediction models of other iatrogenic conditions. To our knowledge, this is the first study to report the importance of patient, clinical, and organizational features based on the use of AI approaches.
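The variable importance measures reported by tree-based ensembles are built from per-split impurity decreases: each split's reduction in Gini impurity is credited to the feature used, then summed per tree and averaged across the ensemble. A minimal pure-Python sketch of that per-split quantity for a binary outcome, using toy data (not the study's):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 outcome labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def split_importance(feature, labels, threshold):
    """Weighted impurity decrease from splitting on feature <= threshold --
    the per-split quantity tree ensembles accumulate per feature to rank
    predictors such as age or quality of gait."""
    left = [y for x, y in zip(feature, labels) if x <= threshold]
    right = [y for x, y in zip(feature, labels) if x > threshold]
    n = len(labels)
    return (gini(labels)
            - len(left) / n * gini(left)
            - len(right) / n * gini(right))

# Toy example: in this tiny sample, age perfectly separates fallers
ages = [70, 85, 62, 90, 55, 88]
fell = [0, 1, 0, 1, 0, 1]
```

Splitting at age 75 separates the toy sample perfectly, so the impurity decrease equals the full parent impurity (0.5), while a threshold that sends everyone to one side contributes nothing.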
Delirium is a common, often underdiagnosed, geriatric syndrome characterized by an acute change in attention and consciousness. As a neuropsychiatric disorder with an underlying organic cause, delirium has traditionally been considered a diagnosis reserved for the hospital setting. However, delirium is known to occur as both an acute and subacute condition that carries significant morbidity and mortality. Combined with its association with dementia and aging, this makes delirium an important topic for primary care providers to become familiar with as they care for an aging population.