2022 Fifth International Conference on Computational Intelligence and Communication Technologies (CCICT)
DOI: 10.1109/ccict56684.2022.00062
Application of ensemble Machine Learning models for phishing detection on web networks

Cited by 14 publications (5 citation statements). References 19 publications.
“…Working Description KNC K-Neighbors Classifier (KNC) method [25], [26] operates by classifying data points based on the majority class among their k-nearest neighbors in a feature space. In the context of SQL injection attack detection, it analyzes the characteristics of queries and compares them to a labeled dataset.…”
Section: Methods
confidence: 99%
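The KNC behaviour described in the statement above (classifying a point by the majority class among its k nearest neighbours in feature space) can be sketched with scikit-learn's `KNeighborsClassifier`. The query feature vectors and labels below are invented toy values for illustration, not the cited paper's dataset:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Toy features per SQL query: [length, quote_count, keyword_count].
# Labels: 0 = benign, 1 = injection. All values are illustrative assumptions.
X = np.array([[20, 0, 0], [25, 1, 0], [80, 4, 3],
              [95, 6, 4], [22, 0, 1], [88, 5, 3]])
y = np.array([0, 0, 1, 1, 0, 1])

# Each new query is assigned the majority class of its 3 nearest neighbours.
knc = KNeighborsClassifier(n_neighbors=3)
knc.fit(X, y)

print(knc.predict([[90, 5, 4]]))  # near the injection-like points
print(knc.predict([[21, 0, 0]]))  # near the benign points
```

With k=3 and this separable toy data, the long, quote-heavy query lands among the injection examples and the short clean query among the benign ones.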
“…Creating efficient detection systems is aided by the ideas and models already in existence in machine learning and phishing detection [52]. However, a well-liked machine learning model for phishing detection uses decision trees [53].…”
Section: Theoretical Review
confidence: 99%
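The decision-tree model for phishing detection mentioned above can be illustrated with a minimal scikit-learn sketch. The URL features (length, whether the host is a raw IP, dot count) and the labels are assumptions chosen for illustration, not features from the cited work:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy URL features: [url_length, has_ip (0/1), num_dots].
# Labels: 0 = legitimate, 1 = phishing. Values are illustrative assumptions.
X = np.array([[30, 0, 1], [25, 0, 2], [120, 1, 5],
              [95, 1, 4], [28, 0, 1], [110, 1, 6]])
y = np.array([0, 0, 1, 1, 0, 1])

# A shallow tree learns simple threshold rules over the URL features.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, y)

print(tree.predict([[100, 1, 5]]))  # long IP-hosted URL
print(tree.predict([[27, 0, 2]]))   # short named-host URL
```

On this separable toy data a single split (e.g. on `has_ip`) is enough, which is why shallow trees are often a readable baseline for phishing detection.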
“…). This ensured a fair chance of accurately predicting each class by preventing the model from being biased towards the dominant class [52]. During this phase, the X_train and y_train datasets were subjected to oversampling, wherein the minority class's data points were randomly selected and duplicated until a balance was achieved, thereby fostering a more equitable learning environment during model training.…”
Section: Automatic Feature Selection
confidence: 99%
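The oversampling step described above (randomly duplicating minority-class training points until the classes balance) can be sketched with `sklearn.utils.resample`. The synthetic data and the 8:2 imbalance below are assumptions for illustration:

```python
import numpy as np
from collections import Counter
from sklearn.utils import resample

rng = np.random.RandomState(0)
X_train = rng.rand(10, 3)
y_train = np.array([0] * 8 + [1] * 2)   # imbalanced: 8 majority, 2 minority

# Randomly duplicate minority samples (with replacement) up to the majority count.
X_min, y_min = X_train[y_train == 1], y_train[y_train == 1]
X_up, y_up = resample(X_min, y_min, replace=True, n_samples=8, random_state=0)

X_bal = np.vstack([X_train[y_train == 0], X_up])
y_bal = np.concatenate([y_train[y_train == 0], y_up])
print(Counter(y_bal))  # both classes now have 8 samples
```

Only the training split is oversampled; the test split is left untouched so evaluation still reflects the true class distribution.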
“…The model's accuracy could be much higher due to the proposed method utilizing only 1%, 5%, and 10% of the dataset, which might cause a substantial loss of information. The voting classifier method was also implemented by Puri et al [28]. The research utilized the SMOTE algorithm to normalize the dataset, which resulted in more extensive data.…”
Section: Related Work
confidence: 99%
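A majority-vote ensemble of the kind attributed to Puri et al. above can be sketched with scikit-learn's `VotingClassifier`. The choice of base estimators and the synthetic dataset are assumptions, and this sketch uses plain generated data rather than the SMOTE-normalized dataset the statement describes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for a phishing feature matrix.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Hard voting: each base model casts one vote; the majority class wins.
vote = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("knn", KNeighborsClassifier())],
    voting="hard")
vote.fit(X, y)
print(vote.score(X, y))
```

Switching `voting="hard"` to `voting="soft"` averages predicted probabilities instead, which can help when the base models are well calibrated.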