Background SARS-CoV-2 IgG antibody measurements can be used to estimate the proportion of a population exposed or infected and may be informative about the risk of future infection. Previous estimates of the duration of antibody responses vary. Methods We present 6 months of data from a longitudinal seroprevalence study of 3276 UK healthcare workers (HCWs). Serial measurements of SARS-CoV-2 anti-nucleocapsid and anti-spike IgG were obtained. Interval-censored survival analysis was used to investigate the duration of detectable responses, and Bayesian mixed linear models were used to investigate anti-nucleocapsid waning. Results Anti-spike IgG remained detectable after a positive result, e.g. in 94% (95% credibility interval, CrI, 91-96%) of HCWs at 180 days. Anti-nucleocapsid IgG levels rose to a peak at 24 (95%CrI 19-31) days post first PCR-positive test, before beginning to fall. Considering 452 anti-nucleocapsid seropositive HCWs over a median of 121 days from their maximum positive IgG titre, the mean estimated antibody half-life was 85 (95%CrI 81-90) days. Higher maximum observed anti-nucleocapsid titres were associated with longer estimated antibody half-lives. Increasing age, Asian ethnicity and prior self-reported symptoms were independently associated with higher maximum anti-nucleocapsid levels; increasing age and a PCR test undertaken for symptoms were associated with longer anti-nucleocapsid half-lives. Conclusion SARS-CoV-2 anti-nucleocapsid antibodies wane within months, and faster in younger adults and those without symptoms, whereas anti-spike IgG remains stably detectable. Ongoing longitudinal studies are required to track the long-term duration of antibody levels and their association with immunity to SARS-CoV-2 reinfection.
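The half-life arithmetic behind estimates like these is straightforward: under exponential decay, log-transformed titres fall linearly with time, and the half-life is ln(2) divided by the decay rate. A minimal sketch, using entirely hypothetical titre values (not the study's data) generated with an 85-day half-life:

```python
import math

# Hypothetical anti-nucleocapsid IgG titres (arbitrary units) for one
# illustrative individual, sampled at days after the peak; the values are
# synthetic, following exact exponential decay with a half-life of 85 days.
days = [0, 30, 60, 90, 120, 150]
titres = [100 * 2 ** (-t / 85) for t in days]

# On a log scale, exponential decay is linear: log(titre) = log(A) - k*t.
# Estimate the slope -k by ordinary least squares, then convert to a
# half-life via ln(2)/k.
n = len(days)
mean_t = sum(days) / n
mean_y = sum(math.log(y) for y in titres) / n
slope = (sum((t - mean_t) * (math.log(y) - mean_y) for t, y in zip(days, titres))
         / sum((t - mean_t) ** 2 for t in days))
half_life = math.log(2) / -slope
print(round(half_life, 1))  # → 85.0
```

The study itself fitted Bayesian mixed linear models to allow individual-level variation in peak and decay rate; this sketch only shows the underlying log-linear relationship for a single series.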
Background COPD is a highly heterogeneous disease composed of different phenotypes with different aetiological and prognostic profiles, and current classification systems do not fully capture this heterogeneity. In this study we sought to discover, describe and validate COPD subtypes using cluster analysis on data derived from electronic health records. Methods We applied two unsupervised learning algorithms (k-means and hierarchical clustering) to 30,961 current and former smokers diagnosed with COPD, using linked national structured electronic health records in England available through the CALIBER resource. We used 15 clinical features, including risk factors and comorbidities, and performed dimensionality reduction using multiple correspondence analysis. We examined the association between cluster membership and COPD exacerbations and respiratory and cardiovascular death, with 10,736 deaths recorded over 146,466 person-years of follow-up. We also implemented and tested a process to assign unseen patients to clusters using a decision tree classifier. Results We identified and characterized five COPD patient clusters with distinct characteristics with respect to demographics, comorbidities, risk of death and exacerbations. Four of the clusters were associated with: 1) anxiety/depression; 2) severe airflow obstruction and frailty; 3) cardiovascular disease and diabetes; and 4) obesity/atopy. A fifth cluster was associated with a low prevalence of most comorbid conditions. Conclusions COPD patients can be sub-classified into groups with differing risk factors, comorbidities and prognosis, based on data included in their primary care records. The identified clusters confirm findings of previous clustering studies and draw attention to anxiety and depression as important drivers of the disease in young, female patients.
Electronic supplementary material The online version of this article (10.1186/s12911-019-0805-0) contains supplementary material, which is available to authorized users.
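The cluster-then-assign workflow described in the COPD study above can be sketched in miniature: cluster patients on binary comorbidity indicators, then place an unseen patient into the nearest cluster. The features and patient rows below are hypothetical toy data (the study used 15 clinical features after multiple correspondence analysis, and a decision tree rather than nearest-centroid assignment):

```python
import math

# Toy stand-in for an EHR feature matrix: each row is a patient encoded as
# binary comorbidity indicators (hypothetical features, not the study's 15).
patients = [
    [1, 1, 0, 0], [1, 1, 0, 0], [1, 0, 0, 0],  # anxiety/depression-like
    [0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1],  # cardiovascular/diabetes-like
]

def kmeans(data, k, iters=20):
    """Plain k-means with a deterministic seed: first k rows as centroids."""
    centroids = [list(row) for row in data[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for row in data:
            nearest = min(range(k), key=lambda i: math.dist(row, centroids[i]))
            groups[nearest].append(row)
        for i, g in enumerate(groups):
            if g:  # recompute each centroid as the mean of its members
                centroids[i] = [sum(col) / len(g) for col in zip(*g)]
    return centroids

centroids = kmeans(patients, k=2)

def assign(patient):
    """Assign an unseen patient to the nearest cluster centroid."""
    return min(range(len(centroids)), key=lambda i: math.dist(patient, centroids[i]))

print(assign([1, 1, 0, 0]))  # lands in the anxiety/depression-like cluster
```

In the study, assignment of unseen patients was done with a trained decision tree classifier, which additionally yields human-readable splitting rules; nearest-centroid assignment is used here only to keep the sketch self-contained.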
Background Natural and vaccine-induced immunity will play a key role in controlling the SARS-CoV-2 pandemic. SARS-CoV-2 variants have the potential to evade natural and vaccine-induced immunity. Methods In a longitudinal cohort study of healthcare workers (HCWs) in Oxfordshire, UK, we investigated the protection from symptomatic and asymptomatic PCR-confirmed SARS-CoV-2 infection conferred by vaccination (Pfizer-BioNTech BNT162b2, Oxford-AstraZeneca ChAdOx1 nCOV-19) and prior infection (determined using anti-spike antibody status), using Poisson regression adjusted for age, sex, temporal changes in incidence and role. We estimated protection conferred after one versus two vaccinations and from infections with the B.1.1.7 variant identified using whole genome sequencing. Results 13,109 HCWs participated; 8285 received the Pfizer-BioNTech vaccine (1407 two doses) and 2738 the Oxford-AstraZeneca vaccine (49 two doses). Compared to unvaccinated seronegative HCWs, natural immunity and two vaccination doses provided similar protection against symptomatic infection: no HCW vaccinated twice had symptomatic infection, and incidence was 98% lower in seropositive HCWs (adjusted incidence rate ratio 0.02 [95%CI <0.01-0.18]). Two vaccine doses or seropositivity reduced the incidence of any PCR-positive result with or without symptoms by 90% (0.10 [0.02-0.38]) and 85% (0.15 [0.08-0.26]) respectively. Single-dose vaccination reduced the incidence of symptomatic infection by 67% (0.33 [0.21-0.52]) and any PCR-positive result by 64% (0.36 [0.26-0.50]). There was no evidence of differences in immunity induced by natural infection and vaccination for infections with S-gene target failure and B.1.1.7. Conclusion Natural infection resulting in detectable anti-spike antibodies and two vaccine doses both provide robust protection against SARS-CoV-2 infection, including against the B.1.1.7 variant.
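The headline protection figures above are incidence rate ratios (IRRs). An unadjusted IRR is just the ratio of two incidence rates, with an approximate confidence interval on the log scale. A sketch with hypothetical counts (not the study's data, which were additionally adjusted for age, sex, calendar time and role via Poisson regression):

```python
import math

# Hypothetical counts: symptomatic PCR-positive infections and person-days
# at risk in unvaccinated-seronegative vs previously infected (seropositive)
# healthcare workers. Numbers are illustrative only.
cases_ref, persondays_ref = 200, 1_000_000  # unvaccinated, seronegative
cases_pos, persondays_pos = 4, 1_000_000    # seropositive

rate_ref = cases_ref / persondays_ref
rate_pos = cases_pos / persondays_pos
irr = rate_pos / rate_ref  # unadjusted incidence rate ratio

# Approximate 95% CI on the log scale: se(log IRR) ≈ sqrt(1/a + 1/b).
se = math.sqrt(1 / cases_pos + 1 / cases_ref)
lo, hi = (math.exp(math.log(irr) + z * se) for z in (-1.96, 1.96))
print(round(irr, 2))  # → 0.02, i.e. 98% lower incidence
```

Adjusted IRRs, as reported in the abstract, come from a fitted Poisson model rather than this direct ratio, but the interpretation (an IRR of 0.02 means 98% lower incidence) is the same.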
Aim To assess the impact of deprivation on diabetic retinopathy presentation and related treatment interventions, as observed within the UK hospital eye service. Methods This is a multicentre, national diabetic retinopathy database study with anonymised data extraction across 22 centres from an electronic medical record system. The inclusion criteria were: all patients with diabetes and a recorded, structured diabetic retinopathy grade. The minimum data set included, at baseline, age and Index of Multiple Deprivation based on residential postcode; and, at all time points, visual acuity, ETDRS grading of retinopathy and maculopathy, and interventions (laser, intravitreal therapies and surgery). The main outcome measures were (1) visual acuity and binocular visual state, and (2) presence of sight-threatening complications and need for early treatment. Results 79 775 patients met the inclusion criteria. Deprivation was associated with later presentation in patients with diabetic eye disease: the OR of being sight-impaired at entry into the hospital eye service (defined as 6/18 to better than 3/60 in the better seeing eye) was 1.29 (95% CI 1.20 to 1.39) for the most deprived decile vs 0.77 (95% CI 0.70 to 0.86) for the least deprived decile; the OR for being severely sight-impaired (3/60 or worse in the better seeing eye) was 1.17 (95% CI 0.90 to 1.55) for the most deprived decile vs 0.88 (95% CI 0.61 to 1.27) for the least deprived decile (reference=fifth decile in all cases).
There was also variation in sight-threatening complications at presentation and in treatment undertaken: the least deprived deciles had a lower chance of having a tractional retinal detachment (OR=0.48 and 0.58 for deciles 9 and 10, 95% CI 0.24 to 0.90 and 0.29 to 1.09, respectively); in terms of accessing treatment, the rate of vitrectomy was lowest in the most deprived cohort (OR=0.34, 95% CI 0.19 to 0.58). Conclusions This large real-world study suggests that first presentation at a hospital eye clinic with visual loss or sight-threatening diabetic eye disease is associated with deprivation. These initial hospital visits represent the first opportunities to receive treatment and to formally engage with support services. Such patients are more likely to be sight-impaired or severely sight-impaired at presentation and may need additional resources to engage with the hospital eye services over complex treatment schedules.
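The odds ratios reported throughout this abstract follow from a 2x2 table, with a Woolf confidence interval computed on the log-odds scale. A self-contained sketch with hypothetical counts (the study's ORs came from models over the full cohort, not a raw table):

```python
import math

# Hypothetical 2x2 table: sight impairment at first presentation for the
# most deprived decile vs the reference (fifth) decile. Counts are invented
# for illustration and do not reproduce the study's data.
#                 impaired  not_impaired
most_deprived = (129, 871)
reference = (100, 900)

a, b = most_deprived
c, d = reference
odds_ratio = (a * d) / (b * c)

# Woolf 95% CI: se(log OR) = sqrt(1/a + 1/b + 1/c + 1/d).
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2))  # → 1.33
```

An OR of 1.29 (95% CI 1.20 to 1.39), as in the abstract, is read the same way: the most deprived decile has roughly 29% higher odds of sight impairment at presentation than the reference decile, with the CI excluding 1.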
Background/Aims To assess the effectiveness, burden and safety of two categories of treatment for central retinal vein occlusion (CRVO): intravitreal injections of anti-vascular endothelial growth factor (anti-VEGF) agents and of dexamethasone (Ozurdex). Methods A retrospective analysis of Medisoft electronic medical record (EMR) data from 27 National Health Service sites in the UK identified 4626 treatment-naive patients with a single mode of treatment for macular oedema secondary to CRVO. Statistics describing the overall CRVO patient cohort and individual patient subpopulations stratified by treatment type were generated. Mean age at baseline, gender, ethnicity, social deprivation and visual acuity (VA) follow-up were reported. Absolute VA and change in VA (ETDRS letters) are used to describe treatment effectiveness, the number of injections and visits to describe treatment burden, and endophthalmitis rates as a marker of treatment safety. Results Mean baseline VA was 47.9 and 45.3 ETDRS letters in the anti-VEGF and Ozurdex groups, respectively. This changed to 57.9/53.7 at 12 months, 58.3/46.9 at 18 months and 59.4/51.0 at 36 months. Mean numbers of injections were 5.6/1.6 at 12 months, 6.0/1.7 at 18 months and 7.0/1.8 at 36 months. Endophthalmitis rates were 0.003% (n=4) in the anti-VEGF group and 0.09% (n=1) in the Ozurdex group. Conclusions VA improvements were greater and more sustained with anti-VEGF treatment. Lower starting acuity resulted in bigger gains in both groups, while higher starting acuity resulted in higher VA at 36 months. Although treatment burden was greater with anti-VEGF, Ozurdex was associated with a higher rate of endophthalmitis.
Background/Aims Clinical trials suggest anti-vascular endothelial growth factor (anti-VEGF) therapy is more effective than intravitreal dexamethasone as treatment for macular oedema secondary to branch retinal vein occlusion. This study asks whether ‘real world’ data from a larger and more diverse population, followed for a longer period, also support this conclusion. Methods Data collected to support routine care at 27 NHS (National Health Service) Trusts between February 2002 and September 2017 contained 5661 treatment-naive patients with a single mode of treatment for macular oedema secondary to branch retinal vein occlusion and no history of cataract surgery either during or recently preceding the treatment. Number of treatment visits and change in visual acuity from baseline were plotted for three treatment groups (anti-VEGF, intravitreal dexamethasone, macular laser) for up to 3 years. Results Mean baseline visual acuity was 57.1/53.1/62.3 letters in the anti-VEGF/dexamethasone/macular laser groups, respectively. This changed to 66.7 (+9.6)/57.6 (+4.5)/63.2 (+0.9) at 12 months. Adequate numbers allowed analysis at 18 months for all groups (66.6 (+9.5)/56.1 (+3.0)/60.8 (-1.5)) and for anti-VEGF at 36 months (68.0, +10.9). Mean numbers of treatments were 5.1/1.5/1.2 at 12 months and 5.9/1.7/1.2 at 18 months for the three groups, and 10.3 at 36 months for anti-VEGF. Conclusions Visual acuity improvements were greater and more sustained with anti-VEGF. A higher treatment burden occurred with anti-VEGF, but this reduced over 36 months. Patients with better vision at baseline than those in the clinical trials maintained high levels of vision with both anti-VEGF and dexamethasone.
Background The ability of external investigators to reproduce published scientific findings is critical for the evaluation and validation of biomedical research by the wider community. However, a substantial proportion of health research using electronic health records (EHR), data collected and generated during clinical care, is potentially not reproducible, mainly because the implementation details of most data preprocessing, cleaning, phenotyping and analysis approaches are not systematically made available or shared. With the complexity, volume and variety of electronic health record data sources made available for research steadily increasing, it is critical to ensure that scientific findings from EHR data are reproducible and replicable by researchers. Reporting guidelines, such as RECORD and STROBE, have set a solid foundation by recommending a series of items for researchers to include in their research outputs. Researchers, however, often lack the technical tools and methodological approaches to act on such recommendations in an efficient and sustainable manner. Results In this paper, we review and propose a series of methods and tools, drawn from adjacent scientific disciplines, that can be used to enhance the reproducibility of research using electronic health records and enable researchers to report analytical approaches in a transparent manner. Specifically, we discuss the adoption of scientific software engineering principles and best practices such as test-driven development, source code revision control systems, literate programming, and the standardization and re-use of common data management and analytical approaches. Conclusion The adoption of such approaches will enable scientists to systematically document and share EHR analytical workflows and increase the reproducibility of biomedical research using such complex data sources.
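Test-driven development, one of the practices this abstract recommends, translates naturally to EHR phenotyping: the case definition lives in one versioned function whose behaviour is pinned by tests that travel with the analysis code. A minimal sketch, with hypothetical diagnosis codes and criteria (not any published phenotype):

```python
# Illustrative COPD case definition for test-driven phenotyping. The codes
# and the smoking criterion are hypothetical, chosen only to show the shape
# of a versioned, testable phenotype function.
COPD_CODES = {"J44.0", "J44.1", "J44.9"}  # illustrative ICD-10 codes

def is_copd_case(diagnosis_codes, smoking_history):
    """Flag a patient as a COPD case: any COPD diagnosis code plus a
    recorded history of current or former smoking."""
    has_copd_code = bool(COPD_CODES & set(diagnosis_codes))
    return has_copd_code and smoking_history in {"current", "former"}

# Tests written alongside (or before) the definition document the intended
# behaviour and catch silent regressions when the phenotype is revised.
assert is_copd_case(["J44.9", "I10"], "former") is True
assert is_copd_case(["I10"], "current") is False   # no COPD diagnosis code
assert is_copd_case(["J44.0"], "never") is False   # smoking criterion fails
```

Keeping the definition and its tests in source control, as the abstract suggests, means any change to the phenotype is both reviewable and automatically checked.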