2011 IEEE International Symposium on Parallel and Distributed Processing Workshops and PhD Forum
DOI: 10.1109/ipdps.2011.244
Evaluating Load Generation in Virtualized Environments for Software Performance Testing

Abstract: Before placing a software system into production, it is necessary to guarantee that it provides users with a certain level of Quality-of-Service. Intensive performance testing is then necessary to achieve such a level, and the tests require an isolated computing environment. Virtualization can therefore play an important role in saving energy costs by reducing the number of servers required to run performance tests and in allowing performance isolation when executing multiple tests in the same computing i…

Cited by 10 publications (5 citation statements)
References 15 publications (19 reference statements)
“…Netto et al [155] and White et al [156] evaluate the stability of the generated load under virtualized environments (e.g., virtual machines). They find that the system throughput sometimes fails to produce a stable load on virtual machines.…”
Section: Configuring the Test Environment
confidence: 99%
“…To address the stated problem, we develop a research-environment infrastructure using virtualization technologies [20].…”
Section: Research environment infrastructure
confidence: unclassified
“…Costa et al [9] summarized some bad practices of writing microbenchmarks using the JMH framework to mitigate the variation and instability of cloud environments when conducting performance microbenchmarking. Arif et al [4] and Netto et al [43] compared performance metrics generated via performance tests between virtual and physical environments. Their findings highlight the inconsistency between performance testing results in virtual and physical environments.…”
Section: Performance Variability Of Virtual Machines
confidence: 99%
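
The load-stability concern raised by Netto et al. and White et al. in the statements above can be made concrete by measuring how much a load generator's achieved throughput fluctuates from second to second. The sketch below is a minimal illustration, not code from the cited papers: the `issue_request` placeholder, target rate, and test duration are assumptions, and the coefficient of variation is used as one simple stability indicator.

```python
# Minimal sketch (illustrative assumptions, not the cited papers' tooling):
# sample per-second achieved request counts from a paced load generator and
# report their coefficient of variation (CV) as a rough stability indicator.
import statistics
import time


def issue_request():
    # Placeholder for a call to the system under test; in a real performance
    # test this would be, e.g., an HTTP request.
    time.sleep(0.001)


def measure_throughput_stability(target_rps: int = 200, duration_s: int = 30):
    interval = 1.0 / target_rps          # pacing delay toward the target rate
    per_second_counts = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        window_end = time.monotonic() + 1.0
        count = 0
        while time.monotonic() < window_end:
            issue_request()
            count += 1
            # On a noisy virtual machine, the achieved count per one-second
            # window may still drift despite this fixed pacing.
            time.sleep(interval)
        per_second_counts.append(count)
    mean = statistics.mean(per_second_counts)
    cv = statistics.stdev(per_second_counts) / mean if mean else float("inf")
    return mean, cv


if __name__ == "__main__":
    mean_rps, cv = measure_throughput_stability()
    print(f"mean throughput: {mean_rps:.1f} req/s, coefficient of variation: {cv:.3f}")
```

Under this sketch, a higher coefficient of variation on a virtual machine than on an equivalent physical host would correspond to the instability reported in the citing studies.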