2023
DOI: 10.1155/2023/9544481
Feed-Forward Deep Neural Network (FFDNN)-Based Deep Features for Static Malware Detection

Abstract: Portable executable header (PEH) information is commonly used as a feature source for malware detection systems that train and validate machine learning (ML) or deep learning (DL) classifiers. We propose extracting deep features from the PEH information through the hidden layers of a feed-forward deep neural network (FFDNN). Extracting deep features from the hidden layers yields a dataset representation that generalizes better for malware detection. While feeding the deep feature of one hidden layer to the succeeding …
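The deep-feature idea in the abstract can be sketched in plain NumPy: run a forward pass through a small feed-forward network and keep each hidden layer's activations as the "deep features" of the input. The layer sizes, initialization, and the toy PE-header-style input below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def init_ffdnn(layer_sizes, rng):
    """Random He-initialized weights for a feed-forward net, e.g. [10, 64, 32, 1]."""
    return [
        (rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])
    ]

def deep_features(x, params):
    """Forward pass that records every hidden layer's activation.

    Each returned array is a 'deep feature' representation of the input
    at a successively deeper layer; any of them can be fed to a
    downstream classifier in place of the raw features.
    """
    features = []
    h = x
    for w, b in params[:-1]:          # hidden layers only, skip output layer
        h = relu(h @ w + b)
        features.append(h)
    return features

# Toy stand-in for raw PEH features (e.g. section counts, sizes, entropy)
x = rng.standard_normal((4, 10))       # 4 samples, 10 raw features
params = init_ffdnn([10, 64, 32, 1], rng)
feats = deep_features(x, params)
print([f.shape for f in feats])        # one feature matrix per hidden layer
```

Each successive feature matrix is a lower-dimensional, nonlinear re-encoding of the PEH vector, which is what the abstract means by a representation with better generalization.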

Cited by 6 publications (1 citation statement)
References 58 publications (63 reference statements)
“…The choice of two dense layers with the Rectified Linear Unit (ReLU) activation function ( Agarap, 2018 ), a learning rate of 10⁻³ over 800 epochs, and Adam as the optimizer ( Kingma and Ba, 2015 ) was made because of the ability of DL to model complex information through multiple layers ( Bebis and Georgiopoulos, 1994 ). In addition to the FFNN, we utilized various machine learning classifiers for comparative analysis, following previous studies ( Singh et al, 2023 ; Saxe and Berlin, 2015 ). AdaBoost, an ensemble learning method, was chosen for its ability to combine weak classifiers, such as decision trees, to form a robust classifier ( Freund and Schapire, 1997 ).…”
Section: Methods
confidence: 99%
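The setup the citing paper describes (two dense ReLU layers, Adam at 10⁻³ over 800 epochs, compared against AdaBoost) maps naturally onto scikit-learn, though the cited work's actual framework and hidden-layer widths are not given here; the sizes and synthetic data below are assumptions for illustration only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic binary classification data standing in for a malware dataset
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ffnn = MLPClassifier(
    hidden_layer_sizes=(64, 32),   # two dense layers (widths are assumed)
    activation="relu",
    solver="adam",
    learning_rate_init=1e-3,
    max_iter=800,                  # matches the 800 epochs in the citation
    random_state=0,
)
ada = AdaBoostClassifier(random_state=0)  # ensemble of weak tree classifiers

for name, clf in [("FFNN", ffnn), ("AdaBoost", ada)]:
    clf.fit(X_tr, y_tr)
    print(name, "test accuracy:", round(clf.score(X_te, y_te), 3))
```

Running both models on the same split is the comparative-analysis pattern the citing paper describes: the FFNN learns a layered nonlinear representation, while AdaBoost aggregates many shallow trees into one strong classifier.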