In natural environments that contain multiple sound sources, acoustic energy arising from the different sources sums to produce a single complex waveform at each of the listener's ears. The auditory system must segregate this waveform into distinct streams to permit identification of the objects from which the signals emanate [1]. Although the processes involved in stream segregation are now reasonably well understood [1, 2 and 3], little is known about the nature of our perception of complex auditory scenes. Here, we examined complex scene perception by having listeners detect a discrete change to an auditory scene comprising multiple concurrent naturalistic sounds. We found that listeners were remarkably poor at detecting the disappearance of an individual auditory object when listening to scenes containing more than four objects, but they performed near perfectly when their attention was directed to the identity of a potential change. In the absence of directed attention, this "change deafness" [4] was greater for objects arising from a common location in space than for objects separated in azimuth. Change deafness was also observed for changes in object location, suggesting that it may reflect a general effect of the dependence of human auditory perception on attention.
Background: The translation of evidence on dementia risk factors into clinical advice requires careful evaluation of the methodology and scope of data from which risk estimates are obtained. Objective: To evaluate the quantity, quality, and representativeness of evidence, we conducted a review of reviews of risk factors for Alzheimer's disease (AD), vascular dementia (VaD), and Any Dementia. Methods: PubMed, the Cochrane Library, and the Global Index Medicus were searched to identify meta-analyses of observational studies of risk factors for AD, VaD, and Any Dementia. PROSPERO CRD42017053920. Results: Meta-analysis data were available for 34 risk factors for AD, 26 risk factors for Any Dementia, and eight for VaD. Quality of evidence varied greatly in terms of the number of contributing studies, whether data on midlife exposure were available, and consistency of measures. The most evidence was available for cardiovascular risk factors. The most geographically representative evidence (five of six global regions) was available for alcohol, physical activity, diabetes, high midlife BMI, antihypertensives, and motor function. Evidence from Australia/Oceania or Africa was limited. With the exception of diabetes, meta-analysis data were unavailable from Latin America/Caribbean. Midlife-specific data were only available for cholesterol and arthritis. Conclusion: There is a lack of midlife-specific data, limited data on VaD, and a lack of geographical representation for many risk factors for dementia. The quality, quantity, and representativeness of evidence need to be considered before recommendations are made about the relevance of risk factors in mid- or late life or for dementia subtypes.
Introduction: Associations between the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diet and incidence of cognitive impairment have not been evaluated outside the United States. Methods: We investigated MIND and Mediterranean diet relations with 12-year incidence of Alzheimer's disease/vascular dementia (National Institute of Neurological Disorders criteria) and mild cognitive impairment (Winblad criteria) in the Personality and Total Health (PATH) Through Life cohort (n = 1220) set in Canberra, Australia: wave-1 2001-2002; wave-2 2005-2006; wave-3 2009-2010; and wave-4 2013-2014. The MIND diet and two alternate Mediterranean diet scores were calculated from baseline food frequency questionnaire responses. Higher dietary scores signified greater adherence. Results: In adjusted logistic regression models, the MIND diet (OR = 0.47, 95% CI 0.24, 0.91), but not the Mediterranean diet, was associated with reduced odds of 12-year cognitive impairment. Discussion: Preliminary evidence suggests that protective effects of the MIND diet are geographically generalizable. Additional prospective studies are needed in diverse samples to determine the relative effects of the MIND and Mediterranean diets against cognitive decline.
Background: A physically active lifestyle has the potential to prevent cognitive decline and dementia, yet the optimal type of physical activity/exercise remains unclear. Dance is of special interest as it is a complex sensorimotor rhythmic activity with additional cognitive, social, and affective dimensions. Objectives: To determine whether dance benefits executive function more than walking, an activity that is simple and functional. Methods: Two-arm randomized controlled trial among community-dwelling older adults. The intervention group received 1 h of ballroom dancing twice weekly over 8 months (~69 sessions) in local community dance studios. The control group received a combination of a home walking program with a pedometer and optional biweekly group-based walking in a local community park to facilitate socialization. Main outcomes: Executive function tests: processing speed and task shifting by the Trail Making Tests, response inhibition by the Stroop Color-Word Test, working memory by the Digit Span Backwards test, immediate and delayed verbal recall by the Rey Auditory Verbal Learning Test, and visuospatial recall by the Brief Visuospatial Memory Test (BVST). Results: One hundred and fifteen adults (mean 69.5 years, SD 6.4) completed baseline and delayed baseline (3 weeks apart) before being randomized to either dance (n = 60) or walking (n = 55). Of those randomized, 79 (68%) completed the follow-up measurements (32 weeks from baseline). In the dance group only, "non-completers" had significantly lower baseline scores on all executive function tests than those who completed the full program. Intention-to-treat analyses showed no group effect.
In a random effects model including participants who completed all measurements, adjusted for baseline score and covariates (age, education, estimated verbal intelligence, and community), a between-group effect in favor of dance was noted only for BVST total learning (Cohen's d = 0.29, p = 0.07) and delayed recall (Cohen's d = 0.34, p = 0.06). Conclusion: The superior potential of dance over walking on executive functions of cognitively healthy and active older adults was not supported. Dance improved one of the cognitive domains (spatial memory) important for learning dance. Controlled trials targeting inactive older adults and of a higher dose may produce stronger effects, particularly for novice dancers. Trial registration: Australian and New Zealand Clinical Trials Register (ACTRN12613000782730).
Background: With population aging, drivers with mild cognitive impairment (MCI) are increasing; however, there is little evidence available regarding their safety. Objective: We aimed to evaluate risk of unsafe on-road driving performance among older adults with MCI. Method: The study was a cross-sectional observational study, set in Canberra, Australia. Participants were non-demented, current drivers (n = 302) aged 65 to 96 years (M = 75.7, SD = 6.18, 40% female) recruited through the community and primary and tertiary care clinics. Measures included a standardized on-road driving test (ORT), a battery of screening measures designed to evaluate older driver safety (UFOV®, DriveSafe, Multi-D), a neurocognitive test battery, and questionnaires on driving history and behavior. Results: Using Winblad criteria, 57 participants were classified as having MCI and 245 as cognitively normal (CN). While the MCI group had a significantly lower overall safety rating on the ORT (5.61 versus 6.05, p = 0.03), there was a wide range of driving safety scores in the CN and MCI groups. The MCI group performed worse than the CN group on the off-road screening tests. The best fitting model of predictors of ORT performance across the combined sample included age, the Multi-D, and DriveSafe, classifying 90.4% of the sample correctly. Conclusion: Adults with MCI exhibit a similar range of driving ability to CN adults, although on average they scored lower on off-road and on-road assessments. Driving-specific tests were more strongly associated with safety ratings than traditional neuropsychological tests.
This study examined the prevalence of co-morbid age-related eye disease and symptoms of depression and anxiety in late life, and the relative roles of visual function and disease in explaining symptoms of depression and anxiety. A community-based sample of 662 individuals aged over 70 years was recruited through the electoral roll. Vision was measured using a battery of tests including high and low contrast visual acuity, contrast sensitivity, motion sensitivity, stereoacuity, Useful Field of View, and visual fields. Depression and anxiety symptoms were measured using the Goldberg scales. The prevalence of self-reported eye disease [cataract, glaucoma, or age-related macular degeneration (AMD)] in the sample was 43.4%, with 7.7% reporting more than one form of ocular pathology. Of those with no eye disease, 3.7% had clinically significant depressive symptoms. This rate was 6.7% among cataract patients, 4.3% among those with glaucoma, and 10.5% for AMD. Generalized linear models adjusting for demographics, general health, treatment, and disability examined self-reported eye disease and visual function as correlates of depression and anxiety. Depressive symptoms were associated with cataract only, AMD, comorbid eye diseases and reduced low contrast visual acuity. Anxiety was significantly associated with self-reported cataract, and reduced low contrast visual acuity, motion sensitivity and contrast sensitivity. We found no evidence for elevated rates of depressive or anxiety symptoms associated with self-reported glaucoma. The results support previous findings of high rates of depression and anxiety in cataract and AMD, and in addition show that mood and anxiety are associated with objective measures of visual function independently of self-reported eye disease. The findings have implications for the assessment and treatment of mental health in the context of late-life visual impairment.
Sex differences in late-life memory decline may be explained by sex differences in dementia risk factors. Episodic memory and dementia risk factors were assessed in young, middle-aged, and older adults over 12 years in a population-based sample (N = 7485). For men in midlife and old age, physical, cognitive, and social activities were associated with less memory decline, and financial hardship was associated with more. APOE e4 and vascular risk factors were associated with memory decline for women in midlife. Depression, cognitive activity, and physical activity were associated with memory change in older women. Incident midlife hypertension (β = −0.48, 95% CI −0.87, −0.09, p = 0.02) was associated with greater memory decline in women, and incident late-life stroke accounted for greater memory decline in men (β = −0.56, 95% CI −1.12, −0.01, p = 0.05). Women have fewer modifiable risk factors than men. Stroke and hypertension explained sex differences in memory decline for men and women, respectively.