2018
DOI: 10.1177/1557988318814295

Impact of a Pharmacist-Led Intervention on 30-Day Readmission and Assessment of Factors Predictive of Readmission in African American Men With Heart Failure

Abstract: Heart failure (HF) is responsible for more 30-day readmissions than any other condition. Minorities, particularly African American males (AAM), are at much higher risk for readmission than the general population. In this study, demographic, social, and clinical data were collected from the electronic medical records of 132 AAM patients (control and intervention) admitted with a primary or secondary admission diagnosis of HF. Both groups received guideline-directed therapy for HF. Additionally, the intervention …

Cited by 17 publications (17 citation statements) | References 33 publications
“…The second most popular algorithm was NN (14, 33%): many studies used deep-learning techniques based on multiple hidden layers [60, 69–71, 77, 79, 80, 85–87] (e.g., recurrent NN, convolutional NN, deep NN, and ensembles of DL networks), while a few other studies either used one hidden layer [58, 60, 68] or did not specify the number of layers [49, 66]. Regularized logistic regression (12, 28%), including Least Absolute Shrinkage and Selection Operator (LASSO) regression [53, 64, 65, 67, 70, 71, 78–80] (L1 regularization), ridge regression [64, 70, 71, 80] (L2 regularization), and elastic-net [49, 72, 81], was the third most used ML algorithm, followed by Support Vector Machine (SVM) [54, 60, 63, 65, 66, 70, 71, 82–84] (10, 23%). The other less commonly used ML algorithms included naïve Bayes network [49, 54, 70, 84], K-Nearest Neighbors (KNN) [54, 65], ensembles of methods [50, 67, 84], and Bayesian Model Averaging [49].…”
Section: Results | Citation type: mentioning | Confidence: 99%
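To make the regularization variants named in this statement concrete, here is a minimal Python sketch, assuming scikit-learn and synthetic data; none of it comes from the cited readmission studies. It fits the three penalized logistic regression forms mentioned, LASSO (L1), ridge (L2), and elastic-net, and scores each by cross-validated AUC.

```python
# Illustrative sketch of the three regularized logistic regression variants
# named above (L1/LASSO, L2/ridge, elastic-net), fitted on synthetic data.
# Not the models or data from the cited studies.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a readmission dataset: 500 patients, 20 features,
# binary outcome (readmitted within 30 days or not).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "LASSO (L1)":  LogisticRegression(penalty="l1", solver="liblinear", C=1.0),
    "Ridge (L2)":  LogisticRegression(penalty="l2", solver="lbfgs", C=1.0),
    "Elastic-net": LogisticRegression(
        penalty="elasticnet", solver="saga", l1_ratio=0.5, C=1.0, max_iter=5000
    ),
}

for name, model in models.items():
    # Five-fold cross-validated AUC, a common metric in the readmission literature.
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:12s} mean AUC = {auc:.3f}")
```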
“…(N = 43 studies)
Type of ML Algorithm | Number of Studies f (Percent) | Featuring Studies
Tree-based methods | 23 (53%) |
 - Decision Tree | 9 | [46, 52, 54–60]
 - Random Forest | 16 | [48–50, 59–71]
 - Boosted tree methods (a) | 18 | [47, 49–51, 53, 54, 59, 64–67, 71–77]
Regularized Logistic Regression (penalized method) | 12 (28%) |
 - Lasso (L1 regularization) | 9 | [53, 64, 65, 67, 70, 71, 78–80]
 - Ridge Regression (L2 regularization) | 4 | [64, 70, 71, 80]
 - Elastic-Net | 3 | [49, 72, 81]
Support Vector Machine | 10 (23%) | [54, 60, …”
Section: Results | Citation type: mentioning | Confidence: 99%
“…Support vector machines were the second most utilized ML algorithm (n = 9), followed by neural networks (n = 7). The remaining studies presented combinations of these with other ML algorithms such as deep learning,16,17 Bayesian techniques (including naïve Bayes classifiers and Bayesian networks),18,19 and K-nearest neighbour.20 Several studies employed multiple ML algorithms and compared them with one or more CSMs.…”
Section: Results | Citation type: mentioning | Confidence: 99%
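The comparison workflow mentioned here, several ML algorithms benchmarked against a conventional statistical model (CSM) on the same data, might look roughly like the following sketch. It uses scikit-learn on synthetic data; the specific models, hyperparameters, and metric are illustrative assumptions rather than the setups from the cited reviews.

```python
# Hedged sketch of comparing several ML algorithms with a conventional
# statistical model (plain logistic regression) on the same data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

candidates = {
    "Logistic regression (CSM)": LogisticRegression(max_iter=1000),
    "Support vector machine": make_pipeline(StandardScaler(), SVC()),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Neural network (MLP)": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    ),
}

for name, model in candidates.items():
    # Each candidate is scored with the same cross-validated AUC protocol.
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:28s} mean AUC = {auc:.3f}")
```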
“…The remaining studies presented combinations of these with other ML algorithms such as deep learning,16,17 Bayesian techniques (including naïve Bayes classifiers and Bayesian networks),18,19 and K-nearest neighbour.20 Several studies employed multiple ML algorithms and compared them with one or more CSMs. Many studies took advantage of ensemble learning algorithms, which are ML techniques that aggregate the outcomes of multiple trained base models, producing a unified general result for each data sample (e.g.…”
Section: Machine Learning Methods | Citation type: mentioning | Confidence: 99%
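The ensemble idea described in this statement, training several base models and aggregating their outputs into one result per sample, can be illustrated with a short soft-voting sketch. The base learners, synthetic data, and the VotingClassifier wrapper are assumptions chosen for illustration, not the ensembles used in the cited studies.

```python
# Minimal sketch of ensemble aggregation: three base models are trained
# independently and their predicted probabilities are averaged (soft voting)
# into one unified probability per sample. Illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three heterogeneous base learners aggregated by averaging class probabilities.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)

probs = ensemble.predict_proba(X_test)[:, 1]  # one unified result per sample
print(f"Ensemble test AUC = {roc_auc_score(y_test, probs):.3f}")
```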
“…In total, 30% of studies mentioned Black patients. Between 0.95% and 100% of the individuals were Black, with one study enrolling only African American males with heart failure [20].…”
Section: Results | Citation type: mentioning | Confidence: 99%