2021
DOI: 10.1007/978-3-030-79276-3_14

Fraud Detection in Credit Card Transaction Using ANN and SVM

Cited by 7 publications (6 citation statements)
References 15 publications
“…In addition, the RF ensemble yields an F1-score of 99.19% after applying the chi-squared feature selection approach, while the other ensembles (i.e., Logistic Regression, KNN, Naïve Bayes, and Support Vector Machine) yielded accuracies of 98.05%, 92.10%, 91.25%, and 81.45%, respectively. It is clearly observed that adopting both the feature selection approach and the SMOTE data balancing approach improved accuracy compared with the results reported in studies [46], [49], [104], [105], as shown in Table 4, and in agreement with [106], [107].…”
Section: Ensemble Performance (supporting)
confidence: 77%
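The preprocessing described in the statement above (chi-squared feature selection plus SMOTE class balancing feeding an ensemble classifier) can be sketched as follows. This is an illustrative sketch only, not code from the cited study: the data loader, the number of selected features k=10, and all other parameter values are assumptions.

# Illustrative sketch (not from the cited study): chi-squared feature selection
# followed by SMOTE oversampling and a Random Forest, as in the statement above.
# The loader, k=10, and all hyperparameters below are assumed values.
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline  # imblearn Pipeline so SMOTE runs only on training folds
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler  # chi2 requires non-negative feature values

pipeline = Pipeline(steps=[
    ("scale", MinMaxScaler()),                       # rescale features to [0, 1] so chi2 applies
    ("select", SelectKBest(score_func=chi2, k=10)),  # keep the 10 highest-scoring features (assumed k)
    ("smote", SMOTE(random_state=42)),               # oversample the minority (fraud) class
    ("clf", RandomForestClassifier(n_estimators=100, random_state=42)),
])

# X, y = load_transactions()  # hypothetical loader returning features and fraud labels
# f1 = cross_val_score(pipeline, X, y, scoring="f1", cv=5).mean()
# print(f"mean F1-score: {f1:.4f}")

Using the imblearn Pipeline keeps the oversampling inside each cross-validation fold, so the reported score is not inflated by synthetic samples leaking into the test split.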
“…Study [26] conducted experiments on the Credit Card Fraud Detection dataset using different machine learning algorithms to evaluate their performance. They compared five algorithms, including SVM, Naive Bayes, Logistic Regression, KNN, and Random Forest.…”
Section: Related Work (mentioning)
confidence: 99%
“…
Authors                  Selected Algorithms                         Efficient Algorithm   Accuracy
Kibria and Sevkli [17]   LR, Deep Learning, and SVM                  Deep Learning         87.10%
Naveen and Diwan [30]    LR, QDA, and SVM                            LR                    99.38%
Shaji et al. [26]        ANN and SVM                                 Both                  88.00%
Sinayobye et al. [31]    KNN, DT, RF, SVM, LR                        KNN                   82.60%
Btoush et al. [32]       Deep Learning                               DL                    95.76%
Taha et al. [33]         Optimized Light Gradient Boosting Machine   OLGBM                 98.40%
…”
Section: Authors (mentioning)
confidence: 99%
“…The theory underlying SVM had already been developing since the 1960s, but it was only introduced by Vapnik, Boser, and Guyon in 1992, and SVM has developed rapidly since then. SVM is a relatively new technique compared with other techniques, yet it shows better performance in various application areas such as handwriting recognition, text classification, and so on [15]. SVM is used to find the best hyperplane by maximizing the margin between classes.…”
Section: Support Vector Machine (unclassified)
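As a brief illustration of the margin-maximization idea mentioned in the statement above, the sketch below fits a linear SVM on a small toy dataset and reports the width of the margin the optimizer maximizes. The data values and the large C setting are assumptions for illustration, not taken from [15] or the cited paper.

# Illustrative sketch: a linear SVM finds the hyperplane w.x + b = 0 that
# maximizes the margin 2/||w|| between the two classes. Toy data and the
# large C value are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],   # class 0 points
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])  # class 1 points
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6)   # very large C approximates a hard-margin SVM
clf.fit(X, y)

w = clf.coef_[0]                    # normal vector of the separating hyperplane
margin = 2.0 / np.linalg.norm(w)    # geometric margin width being maximized
print("hyperplane normal:", w, "margin width:", round(margin, 3))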