Preeclampsia is one of the leading causes of maternal and fetal morbidity and mortality. Because effective preventive measures are lacking, prediction is essential for prompt management. This study aimed to develop machine learning models to predict late-onset preeclampsia using hospital electronic medical record data, and to compare the performance of the machine learning-based models with that of models built using conventional statistical methods. A total of 11,006 pregnant women who received antenatal care at Yonsei University Hospital were included. Maternal data were retrieved from electronic medical records from the early second trimester to 34 weeks' gestation. The prediction outcome was the occurrence of late-onset preeclampsia after 34 weeks' gestation. Pattern recognition and cluster analysis were used to select the parameters included in the prediction models. Logistic regression, a decision tree model, naïve Bayes classification, a support vector machine, a random forest algorithm, and a stochastic gradient boosting method were used to construct the prediction models, and the C-statistic was used to assess the performance of each model. The overall preeclampsia development rate was 4.7% (474 patients). Systolic blood pressure, serum blood urea nitrogen and creatinine levels, platelet count, serum potassium level, white blood cell count, serum calcium level, and urinary protein were the most influential variables in the prediction models. The C-statistics for the decision tree model, naïve Bayes classification, support vector machine, random forest algorithm, stochastic gradient boosting method, and logistic regression were 0.857, 0.776, 0.573, 0.894, 0.924, and 0.806, respectively. The stochastic gradient boosting model had the best prediction performance, with an accuracy of 0.973 and a false positive rate of 0.009. The combined use of maternal factors and common antenatal laboratory data from the early second trimester through the early third trimester could effectively predict late-onset preeclampsia using machine learning algorithms. Future prospective studies are needed to verify the clinical applicability of these algorithms.
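As a rough illustration of the modeling step described above, the sketch below fits the six classifier types named in the abstract and ranks them by C-statistic (equivalent to ROC AUC) using scikit-learn. The file name, column names, train/test split, and hyperparameters are hypothetical placeholders, not the study's actual pipeline.

```python
# Sketch: fit six classifiers on antenatal features and compare C-statistics.
# "antenatal_emr.csv" and all column names are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

df = pd.read_csv("antenatal_emr.csv")  # hypothetical EMR extract
features = ["sbp", "bun", "creatinine", "platelets", "potassium",
            "wbc", "calcium", "urine_protein"]  # variables named in the abstract
X, y = df[features], df["late_onset_pe"]  # 1 = preeclampsia after 34 weeks

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "logistic": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "tree":     DecisionTreeClassifier(max_depth=5),
    "nb":       GaussianNB(),
    "svm":      make_pipeline(StandardScaler(), SVC(probability=True)),
    "rf":       RandomForestClassifier(n_estimators=500),
    "sgb":      GradientBoostingClassifier(subsample=0.8),  # subsample < 1 makes boosting stochastic
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: C-statistic = {auc:.3f}")
```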
Hyperphosphatemia is associated with mortality in patients with chronic kidney disease and is common in critically ill patients with acute kidney injury (AKI); however, its clinical implications in these patients are unknown. We conducted an observational study in 1,144 patients (mean age, 63.2 years; male, 705 [61.6%]) with AKI who received continuous renal replacement therapy (CRRT) between January 2009 and September 2016. Phosphate levels were measured before (0 h) and 24 h after CRRT initiation. Disease severity was assessed using various clinical parameters. Phosphate at 0 h correlated positively with the Acute Physiology and Chronic Health Evaluation II (APACHE II; P < 0.001) and Sequential Organ Failure Assessment (SOFA; P < 0.001) scores, and inversely with mean arterial pressure (MAP; P = 0.02) and urine output (P = 0.01). In a fully adjusted linear regression analysis accounting for age, sex, Charlson comorbidity index (CCI), MAP, and estimated glomerular filtration rate (eGFR), a higher 0 h phosphate level was significantly associated with high APACHE II (P < 0.001) and SOFA (P = 0.04) scores, suggesting that phosphate reflects disease severity. A multivariable Cox model also showed that hyperphosphatemia was significantly associated with increased 28-day (HR 1.05, 95% CI 1.02–1.08, P = 0.001) and 90-day (HR 1.05, 95% CI 1.02–1.08, P = 0.001) mortality. Furthermore, patients whose phosphate levels increased during the first 24 h were at higher risk of death than those with stable or decreased levels. Finally, the c-statistic significantly increased when phosphate was added to a model that included age, sex, CCI, body mass index, eGFR, MAP, hemoglobin, serum albumin, C-reactive protein, and APACHE II score. This study shows that phosphate is a potential biomarker that reflects disease severity and predicts mortality in critically ill patients receiving CRRT.
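A minimal sketch of the multivariable Cox analysis described above, using the lifelines library. The cohort file and column names are hypothetical stand-ins for the covariates listed in the abstract, not the study's actual dataset.

```python
# Sketch: multivariable Cox model for 28-day mortality with baseline phosphate.
# "crrt_cohort.csv" and all column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("crrt_cohort.csv")  # hypothetical cohort extract

covariates = ["phosphate_0h", "age", "male", "cci", "bmi", "egfr",
              "map", "hemoglobin", "albumin", "crp", "apache2"]

cph = CoxPHFitter()
cph.fit(df[covariates + ["time_to_death_28d", "died_28d"]],
        duration_col="time_to_death_28d", event_col="died_28d")
cph.print_summary()  # hazard ratios with 95% CIs, as reported in the abstract

# Model discrimination; the abstract compares this with and without phosphate
print(f"c-statistic: {cph.concordance_index_:.3f}")
```

Refitting the same model with "phosphate_0h" dropped from the covariate list would give the comparison c-statistic the abstract refers to.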
Our findings suggest that warfarin should be used with caution in hemodialysis patients, given the higher risk of hemorrhagic events and its limited ability to prevent thromboembolic complications.
The association between salt intake and renal outcomes in subjects with preserved kidney function remains unclear. Here, we evaluated the effect of sodium intake on the development of chronic kidney disease (CKD) in a prospective cohort of people with normal renal function. Data were obtained from the Korean Genome and Epidemiology Study, a prospective community-based cohort study, and sodium intake was estimated with a 24-hour dietary recall food frequency questionnaire. A total of 3,106 individuals with and 4,871 without hypertension were analyzed, with a primary end point of CKD development [a composite of an estimated glomerular filtration rate (eGFR) under 60 mL/min/1.73 m2 and/or development of proteinuria during follow-up]. The median ages were 55 and 47 years, the proportions of males were 50.9% and 46.3%, and the median eGFRs were 92 and 96 mL/min/1.73 m2 in individuals with and without hypertension, respectively. During a median follow-up of 123 months in individuals with hypertension and 140 months in those without, CKD developed in 27.8% and 16.5%, respectively. After adjusting for confounders, multiple Cox models indicated that the risk of CKD development was significantly higher in people with hypertension who consumed less than 2.08 g/day or more than 4.03 g/day of sodium than in those who consumed 2.93–4.03 g/day. However, there was no significant difference in incident CKD risk across sodium intake quartiles in people without hypertension. Thus, both high and low sodium intakes were associated with an increased risk of CKD, but this relationship was observed only in people with hypertension.
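The sketch below illustrates how the composite endpoint and sodium intake quartiles described above could be constructed with pandas. The file and column names are hypothetical, and the quartile cut points in practice come from the data rather than being hard-coded.

```python
# Sketch: composite CKD endpoint (incident eGFR < 60 mL/min/1.73 m2 and/or
# new proteinuria) and exposure quartiles of estimated sodium intake.
# "koges_cohort.csv" and all column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("koges_cohort.csv")  # hypothetical KoGES extract

# Composite endpoint during follow-up
df["ckd"] = ((df["egfr_followup"] < 60)
             | (df["proteinuria_followup"] == 1)).astype(int)

# Quartiles of estimated sodium intake (g/day); the abstract's observed
# cut points were roughly 2.08, 2.93, and 4.03 g/day
df["na_quartile"] = pd.qcut(df["sodium_g_day"], 4,
                            labels=["Q1", "Q2", "Q3", "Q4"])

# Crude incidence by quartile, stratified by hypertension status
print(df.groupby(["hypertension", "na_quartile"], observed=True)["ckd"].mean())
```

The adjusted analysis in the abstract would then enter the quartile indicators (with Q3 as reference) into a Cox model, as in the survival sketch shown earlier.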
Background
The effect of a high-protein diet on renal hyperfiltration (RHF) and the decline of kidney function has rarely been explored. We investigated the association between a high-protein diet, RHF, and declining kidney function.
Methods
A total of 9,226 subjects from the Korean Genome and Epidemiology Study, a community-based prospective study (2001–14), were enrolled and classified into quartiles according to daily protein intake based on food frequency questionnaires. RHF was defined as an eGFR residual above the 95th percentile after adjustment for age, sex, history of hypertension or diabetes, height, and weight. Rapid decline of renal function was defined as an eGFR decline rate of >3 mL/min/1.73 m2/year.
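A minimal sketch of the residual-based RHF definition given above, assuming a pandas DataFrame of baseline measurements; statsmodels is used for the adjustment regression, and the file and column names are hypothetical placeholders.

```python
# Sketch: flag renal hyperfiltration (RHF) as an eGFR regression residual
# above the 95th percentile, per the definition in the Methods.
# "koges_protein.csv" and all column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("koges_protein.csv")  # hypothetical KoGES extract

# Regress eGFR on the adjustment variables named in the Methods
X = sm.add_constant(df[["age", "male", "htn_history", "dm_history",
                        "height_cm", "weight_kg"]])
resid = sm.OLS(df["egfr"], X).fit().resid

# RHF: residual eGFR above the 95th percentile
df["rhf"] = (resid > resid.quantile(0.95)).astype(int)

# Rapid decline: losing more than 3 mL/min/1.73 m2 per year,
# i.e. an annual eGFR slope below -3
df["rapid_decline"] = (df["egfr_slope"] < -3).astype(int)
```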
Results
The relative risk of RHF was 3.48-fold higher in the highest protein intake quartile than in the lowest after adjustment for confounding factors [95% confidence interval (CI) 1.39–8.71]. The mean eGFR decline rate became faster with increasing quartiles of protein intake. Furthermore, the highest quartile was associated with a 1.32-fold increased risk of rapid eGFR decline (95% CI 1.02–1.73). When subjects were divided into groups with or without RHF, the highest quartile was associated with a rapid decline in renal function only in subjects with RHF (odds ratio 3.35; 95% CI 1.07–10.51). A sensitivity analysis using Korean National Health and Nutrition Examination Survey (2008–15) data on 40,113 subjects showed that higher protein intake quartiles were associated with an increased risk of RHF.
Conclusions
A high-protein diet increases the risk of RHF and of rapid renal function decline in the general population. These findings suggest that high protein intake has a deleterious effect on kidney function.