2009
DOI: 10.1007/978-3-642-02818-2_18
An Extensible Monitoring Framework for Measuring and Evaluating Tool Performance in a Service-Oriented Architecture

Abstract: The lack of QoS attributes and their values is still one of the fundamental drawbacks of web service technology. Most approaches for modelling and monitoring QoS and web service performance focus either on client-side measurement and feedback of QoS attributes, or on ranking and discovery, developing extensions of the standard web service discovery models. However, in many cases, provider-side measurement can be of great additional value to aid the evaluation and selection of services and underlying …

Cited by 7 publications (9 citation statements)
References 25 publications
“…Both general Quality-of-Service (QoS) attributes such as performance, throughput, resource usage, etc., as well as specific QoS criteria such as accuracy, need to be continuously monitored during operation to ensure that the used component actually keeps fulfilling the requirements as expected [9]. Any deviation of QoS from the level measured during the experiments is an indication of either an incomplete evaluation procedure, or a change in the environment that needs to be addressed, such as a sudden increase in data volume.…”
Section: Integration and Monitoring
confidence: 99%
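The statement above describes comparing operational QoS against the levels measured during evaluation. A minimal sketch of that idea in Python, assuming a recorded baseline and a simple relative-deviation threshold (all names, attributes, and values here are illustrative, not taken from the cited framework):

```python
# Hypothetical baseline recorded during the evaluation experiments.
BASELINE = {"response_time_ms": 120.0, "throughput_rps": 50.0, "accuracy": 0.95}
TOLERANCE = 0.10  # flag deviations greater than 10% of the baseline value


def check_qos(measured: dict) -> list:
    """Return the QoS attributes whose measured value deviates from the
    baseline by more than TOLERANCE (relative), as (attr, expected, actual)."""
    deviations = []
    for attr, expected in BASELINE.items():
        actual = measured.get(attr)
        if actual is None:
            continue  # attribute not measured in this interval
        if abs(actual - expected) / expected > TOLERANCE:
            deviations.append((attr, expected, actual))
    return deviations


# A sudden increase in data volume might surface as a response-time spike:
alerts = check_qos({"response_time_ms": 310.0, "throughput_rps": 49.0, "accuracy": 0.95})
# alerts now lists only the response-time deviation.
```

In practice such a check would run periodically against a monitoring stream; a flagged deviation then prompts either re-evaluation of the component or investigation of the environment change, as the statement suggests.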
“…The candidate tools are applied to a previously defined set of sample objects, and the outcome is evaluated against the requirements, relying on automated measurements. That means that all considered components are executed in their respective deployment location inside a controlled environment that monitors their behaviour and collects data about their resource utilisation [9]. This meta-information on the service execution is delivered along with the component output and mapped semi-automatically to the specified requirements.…”
Section: Software Architecture
confidence: 99%
“…The QoS values for each service and the branching probabilities of gateways in the BPMN model are defined in separate (text) files. The tool is distributed as an extension of the BPStruct tool and is available at: http://sep.cs.ut.ee/Main/Bpstruct. Below we present an evaluation of the scalability of the QoS aggregation method using the implemented tool.…”
Section: Implementation and Evaluation
confidence: 99%
“…In particular, providers of composite services need to assess the expected quality of these services and to detect and act upon unexpected QoS variations [2,3,4].…”
Section: Introduction
confidence: 99%