Background:
Right bundle branch block (RBBB) is among the most common electrocardiographic abnormalities.
Objectives:
To establish the prevalence and incidence of RBBB in the general population without cardiovascular events (CVE), and to determine whether RBBB increases cardiovascular morbidity and mortality compared with patients with a normal electrocardiogram (ECG).
Methods:
A historical study of two cohorts comprising 2981 patients without baseline CVE from 29 primary health centres. Cox regression (for CVE) and logistic regression (for cardiovascular risk factors) were used to assess associations with RBBB.
Results:
Of the patients (58% women; mean age 65.9 years), 92.2% had a normal ECG, 4.6% incomplete RBBB (iRBBB) and 3.2% complete RBBB (cRBBB). Mean follow-up was five years. Factors associated with the appearance of cRBBB were male sex (HR = 3.8; 95% CI: 2.4–6.1) and age (HR = 1.05 per year; 95% CI: 1.03–1.08). In univariate analysis, cRBBB was associated with increased all-cause mortality, but only bifascicular block (BFB) remained significant after adjustment for confounders. cRBBB tended to increase CVE, but the results were not statistically significant. iRBBB was not associated with adverse outcomes. Patients with iRBBB who progressed to cRBBB showed a higher incidence of heart failure and chronic kidney disease.
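Under the proportional-hazards model used here, a per-year hazard ratio compounds multiplicatively across years of age. A minimal illustrative sketch (the HR value is taken from the abstract; the time horizons are arbitrary examples, not study results):

```python
# Illustrative only: how a proportional-hazards HR of 1.05 per year of age
# compounds over a span of years (hazard roughly doubles after ~14 years).
hr_per_year = 1.05  # age HR reported in the abstract

hr_10_years = hr_per_year ** 10  # compounded HR across 10 years of age
print(f"HR over 10 years of age: {hr_10_years:.2f}")
```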
Conclusion:
In this general population cohort free of cardiovascular disease, 8% had RBBB, with a higher prevalence among men and elderly patients. Although all-cause mortality and CVE tended to increase in the presence of cRBBB, only BFB showed a statistically significant association with all-cause mortality after adjustment. Patients with iRBBB who progressed to cRBBB had a higher incidence of CVE. We detected no effect of iRBBB on morbidity and mortality.
Background
Primary care is the main point of access to most health systems in developed countries, and therefore a key setting for the detection of coronavirus disease 2019 (COVID-19) cases. The quality of its IT systems, together with access to the results of mass screening with polymerase chain reaction (PCR) tests, makes it possible to analyse the impact of various concurrent factors on the likelihood of contracting the disease.
Methods and findings
Using data mining techniques on the sociodemographic and clinical variables recorded in patients' medical histories, a decision tree-based logistic regression model was developed to analyse the significance of demographic and clinical variables for the probability of a positive PCR test in a sample of 7,314 individuals treated in the Primary Care service of the public health system of Catalonia. The decision tree model correctly classifies 66.2% of COVID-19 diagnoses, with a sensitivity of 64.3% and a specificity of 62.5%; prior contact with a positive case is the strongest predictor variable.
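The reported accuracy, sensitivity and specificity follow the standard confusion-matrix definitions. A minimal sketch in Python, using illustrative counts (not the study's data):

```python
# Confusion-matrix counts for a binary classifier (PCR-positive vs negative).
# These counts are illustrative only, not taken from the study.
tp, fn = 80, 20  # positives: correctly / incorrectly classified
tn, fp = 70, 30  # negatives: correctly / incorrectly classified

sensitivity = tp / (tp + fn)                # true-positive rate
specificity = tn / (tn + fp)                # true-negative rate
accuracy = (tp + tn) / (tp + fn + tn + fp)  # overall fraction correct

print(f"sensitivity={sensitivity:.3f}, "
      f"specificity={specificity:.3f}, accuracy={accuracy:.3f}")
```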
Conclusions
A classification tree model may be useful in screening for COVID-19 infection. Contact detection is the most reliable variable for detecting severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) cases. The model supports the view that, beyond symptom-based diagnosis, the best way to detect cases is contact tracing.