2004
DOI: 10.1007/978-3-540-24663-3_2

Computer Virus Propagation Models

Abstract: The availability of reliable models of computer virus propagation would prove useful in a number of ways, in order both to predict future threats, and to develop new containment measures. In this paper, we review the most popular models of virus propagation, analyzing the underlying assumptions of each of them, their strengths and their weaknesses. We also introduce a new model, which extends the Random Constant Spread modeling technique, allowing us to draw some conclusions about the behavior of the Internet …
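The abstract builds on the Random Constant Spread (RCS) technique of Staniford, Paxson and Weaver. As background only (the paper's own extension is not reproduced here), the classic RCS model describes the compromised fraction of vulnerable hosts with a logistic differential equation; the sketch below restates it, with the symbol T introduced here purely to name the inflection time.

```latex
% Background sketch: the classic Random Constant Spread (RCS) model.
% a(t) -- fraction of vulnerable hosts compromised at time t
% K    -- initial compromise rate (scan rate times probability of hitting a vulnerable host)
\frac{da}{dt} = K\, a\, (1 - a)
% Its solution is the familiar sigmoid growth curve,
% with T fixing the time at which half the vulnerable hosts are infected:
a(t) = \frac{e^{K (t - T)}}{1 + e^{K (t - T)}}
```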

Cited by 78 publications (64 citation statements)
References 13 publications

Citation statements:
“…After cascades occur, network efficiency E(G) will finally stay stable at a lower level. It is in compliance with the current findings [28,40], which further verifies the feasibility of the proposed nonlinear C-L model (2).…”
Section: Results Analysis (supporting)
confidence: 90%
“…Early attempts capture the mechanism of random scan worms and use the simple epidemic model to study the initial part of worm spreading, where human countermeasures and congestions do not affect the propagation ([7], [3]). In recent years, a number of deterministic models were designed to consider the parameters that affect the worm propagation, for random scanning (e.g., [7], [3], [9], [16], [13]), local preference (e.g., [11], [8], [12]) or other advanced strategies ([5], [8], [4], [14], [6]). The two-factor model in [7] takes into account the congestion caused by the worm scan packets, as well as the reactive (human) countermeasures that turn infected or susceptible nodes into an immune state.…”
Section: Related Work (mentioning)
confidence: 99%
“…The two-factor model in [7] takes into account the congestion caused by the worm scan packets, as well as the reactive (human) countermeasures that turn infected or susceptible nodes into an immune state. Models that consider the preventive measures (e.g., antivirus and patch management [17]), link bandwidth between systems ([9], [16], [13]), network topology [18], the slow down caused by automatic treatment and containment measures ([19], [1], [11], [20]), infection delay and user vigilance [21], have also been proposed in the literature.…”
Section: Related Work (mentioning)
confidence: 99%
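The two snippets above treat the simple epidemic (SI) model as the baseline that the two-factor and later models refine. As a background sketch only (parameter names and values are illustrative assumptions, not taken from any of the cited papers), the following integrates that baseline with a plain Euler step:

```python
# Background sketch (not from the reviewed paper): the "simple epidemic" (SI)
# model referenced above, dI/dt = beta * I * (N - I) / N, integrated numerically.
# All parameter values below are illustrative assumptions.

def simulate_si(n_hosts=360_000, beta=0.8, i0=10, dt=0.01, t_max=40.0):
    """Return (times, infected) trajectories for the simple epidemic model."""
    times, infected = [0.0], [float(i0)]
    t, i = 0.0, float(i0)
    while t < t_max:
        di = beta * i * (n_hosts - i) / n_hosts  # new infections per unit time
        i = min(n_hosts, i + di * dt)            # Euler update, capped at the population
        t += dt
        times.append(t)
        infected.append(i)
    return times, infected

if __name__ == "__main__":
    ts, inf = simulate_si()
    # The curve follows the logistic shape: slow start, rapid growth, saturation.
    print(f"infected after {ts[-1]:.0f} time units: {inf[-1]:.0f} of 360000")
```

The two-factor model mentioned in the quotes refines this baseline by removing infected and susceptible hosts over time (human countermeasures) and by letting the effective infection rate decay as worm scan traffic causes congestion.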
“…In a traditional IT environment it is a common practice to take a host offline in case it is suspected to be under attack [156,130]. This is done to limit the impact of the attack, and prevent a possible spread.…”
Section: Blocking or Flagging (mentioning)
confidence: 99%