Proceedings 16th Annual International Conference on Automated Software Engineering (ASE 2001)
DOI: 10.1109/ase.2001.989805
Generation of distributed system test-beds from high-level software architecture descriptions


Cited by 24 publications (30 citation statements)
References 15 publications
“…Besides validating a particular methodology, the question of prediction precision is tackled from many other angles, including workload characterization [29,16], or generation of either the executable system [31,13,33,6] or the performance model [15,19]. What these approaches have in common is that they still require …”
Section: Related Work
confidence: 99%
“…The SoftArch/MTE [20] tool provides a framework for system architects to provide a higher-level abstraction of the system, specifying system characteristics such as middleware, database technology, and client requests. The tool then generates an implementation of the system along with the performance tests that measure system characteristics.…”
Section: Related Work
confidence: 99%
“…Some approaches use extracted component meta-data and wrappers to formulate automated component tests [26,21], but these techniques tend to only focus on functional component validation, not validating non-functional constraints. Middleware performance testing work typically focuses on only limited aspects of component functional and non-functional requirements [9,22,7]. In addition, many of these approaches also suffer from a high degree of manual effort on the part of developers to build test harnesses with which to evaluate their components and middleware [5,7,9,14].…”
Section: Motivation
confidence: 99%
“…Middleware performance testing work typically focuses on only limited aspects of component functional and non-functional requirements [9,22,7]. In addition, many of these approaches also suffer from a high degree of manual effort on the part of developers to build test harnesses with which to evaluate their components and middleware [5,7,9,14]. Most currently deployed component validation approaches also typically require extensive test bed prototyping to evaluate components [16,9].…”
Section: Motivation
confidence: 99%