2013 | DOI: 10.3745/jips.2013.9.3.395

Classifying Malicious Web Pages by Using an Adaptive Support Vector Machine

Abstract: In order to classify a web page as being benign or malicious, we designed 14 basic and 16 extended features. The basic features that we implemented were selected to represent the essential characteristics of a web page. The system heuristically combines two basic features into one extended feature in order to effectively distinguish benign and malicious pages. The support vector machine can be trained to successfully classify pages by using these features. Because more and more malicious web pages are appearing…
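This report does not reproduce the paper's feature definitions or the adaptive SVM's update rule, so the sketch below only illustrates the general workflow the abstract describes: compute basic per-page features, heuristically combine pairs of them into extended features, and train an SVM on the result. The feature columns, the pairwise-product combination rule, and the use of scikit-learn's SVC are assumptions made for illustration, not the authors' implementation.

# Minimal sketch, not the paper's implementation: train a standard SVM
# (scikit-learn SVC) on hypothetical page features. The three basic feature
# columns and the pairwise-product combination rule below are illustrative
# placeholders, not the paper's 14 basic and 16 extended features.
import numpy as np
from sklearn.svm import SVC

# Hypothetical basic features per page, e.g. counts of <script> tags,
# <iframe> tags, and eval() calls.
X_basic = np.array([
    [0, 0, 0],   # benign-looking pages
    [1, 0, 0],
    [5, 3, 4],   # pages with many scripts, iframes, and eval() calls
    [7, 2, 6],
])
y = np.array([0, 0, 1, 1])  # 0 = benign, 1 = malicious

# Heuristically combine pairs of basic features into extended features;
# here each extended feature is simply the product of two basic features
# (one possible combination rule; the paper's exact heuristic is not
# reproduced in this report).
pairs = [(0, 1), (0, 2), (1, 2)]
X_ext = np.column_stack([X_basic[:, i] * X_basic[:, j] for i, j in pairs])
X = np.hstack([X_basic, X_ext])

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)
print(clf.predict(X))  # sanity check on the toy training set

How the aSVM adapts as new malicious pages appear is not detailed in this report, so the sketch covers only a single training pass.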

Cited by 16 publications (11 citation statements) | References 7 publications
“…Finally, Hwang et al. [9] suggested a method for classifying malicious web pages with an adaptive support vector machine. To classify malicious web pages, they defined features that represent the essential characteristics of a web page and selected an adaptive support vector machine (aSVM) to learn from the training data.…”
Section: Related Work
confidence: 99%
“…Firstly, dynamic analysis by the features used draws on information such as the frequency or sequence of API calls [1], [3]-[5], compiled hexadecimal code [2], program execution paths [8], and others [5]-[7]. Secondly, analysis by the applied techniques uses sequence alignment [1], [2] and data mining or machine learning [2]-[5], [9] on the collected feature data.…”
Section: Related Work
confidence: 99%
“…These HTML tags include <link>, <object>, <form>, <script>, <embed>, <ilayer>, <layer>, <style>, <applet>, <meta>, <img>, <iframe>, and many more. For instance, the XSS worm "Samy" infected MySpace webpages by injecting a large XSS payload into the <div> tag of those pages [20]. On the other hand, JavaScript is used in a webpage for embedding tasks, but an attacker can misuse some of its methods in an embedded XSS payload, such as exec(), fromCharCode(), eval(), alert(), getElementsByTagName(), write(), unescape(), and escape() [21].…”
Section: URL Features
confidence: 99%
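As a rough illustration of the kind of static features the citation above describes, the following Python sketch counts occurrences of suspicious HTML tags and JavaScript method names in raw page markup. The tag and function lists are taken from the quoted text, and the counting rules are hypothetical; this is not the feature extractor of any of the cited papers.

# Illustrative sketch only (not the cited papers' feature extractor): count
# suspicious HTML tags and JavaScript method names in raw page markup.
# The tag and function lists come from the quoted citation above.
import re

SUSPICIOUS_TAGS = ["script", "iframe", "embed", "object", "applet", "meta"]
SUSPICIOUS_JS = ["eval", "fromCharCode", "unescape", "escape", "alert",
                 "getElementsByTagName", "document.write"]

def extract_counts(html: str) -> dict:
    """Return per-feature occurrence counts for a raw HTML string."""
    lowered = html.lower()
    counts = {}
    for tag in SUSPICIOUS_TAGS:
        # Count opening tags such as "<script ..." or "<script>".
        counts[f"tag_{tag}"] = len(re.findall(rf"<{tag}\b", lowered))
    for fn in SUSPICIOUS_JS:
        # Count calls such as "eval(" while avoiding substring hits
        # (e.g. "escape(" inside "unescape(").
        pattern = rf"\b{re.escape(fn.lower())}\s*\("
        counts[f"js_{fn}"] = len(re.findall(pattern, lowered))
    return counts

page = '<html><script>eval(unescape("%61"));</script><iframe src="x"></iframe></html>'
print(extract_counts(page))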
“…Xu et al. [15] presented a web page classification algorithm, Link Information Categorization (LIC), to address the shortcoming that traditional classification algorithms based on analysis of web content cannot classify pages effectively. In order to classify a web page as benign or malicious, Hwang et al. [16] designed a system of 14 basic and 16 extended features that heuristically combines two basic features into one extended feature in order to effectively distinguish benign from malicious pages.…”
Section: Related Work
confidence: 99%