Proceedings of the 2008 International Symposium on Software Testing and Analysis 2008
DOI: 10.1145/1390630.1390648
Comparing software metrics tools

Abstract: This paper shows that existing software metrics tools interpret and implement the definitions of object-oriented software metrics differently. As a consequence, metrics results are tool dependent, which even affects the results of analyses based on these metrics. In short, the metrics-based assessment of a software system, and the measures taken to improve its design, differ considerably from tool to tool. To support our case, we conducted an experiment with a number of commercial and free metrics tools. We c…
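The abstract's core claim, that tools implement the same metric definition differently, can be made concrete with a small illustrative sketch (not taken from the paper; the function names and conventions are my own). Two plausible readings of the classic "depth of inheritance tree" (DIT) metric, one counting the implicit universal root class and one excluding it, already disagree on the same class hierarchy:

```python
# Illustrative sketch: two plausible interpretations of the DIT metric.
class A: pass       # implicitly inherits from the universal root (object)
class B(A): pass
class C(B): pass

def dit_excluding_root(cls):
    """Count superclass links up to, but not including, the universal root."""
    depth = 0
    while cls.__bases__ and cls.__bases__[0] is not object:
        cls = cls.__bases__[0]
        depth += 1
    return depth

def dit_including_root(cls):
    """Count every superclass link, including the implicit root class."""
    return dit_excluding_root(cls) + 1

print(dit_excluding_root(C))  # 2
print(dit_including_root(C))  # 3
```

Both functions are defensible implementations of "depth of inheritance", yet they yield different values for every class in the system, which is exactly the kind of divergence the paper measures across real tools.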

Cited by 147 publications (88 citation statements).
References 9 publications (13 reference statements).
“…We consider distances between purely discrete (nonprobabilistic, untimed) systems, and our distances are directed rather than symmetric (based on simulation rather than bisimulation). Software metrics measure properties such as lines of code, depth of inheritance (in an object-oriented language), number of bugs in a module or the time it took to discover the bugs (see for example [12,16]). These functions measure syntactic properties of the source code, and are fundamentally different from our distance functions that capture the difference in the behavior (semantics) of programs.…”
Section: Introduction
confidence: 99%
“…In fact, we have shown earlier that different metrics tools lead to different class level quality values for the same system(s) and the same quality model and even to different conclusions regarding the quality ranking of the classes [27].…”
Section: Variable Selection
confidence: 98%
“…We use VizzAnalyzer for the metrics extraction, but other software metrics tools (cf. Lincke et al for an overview [27]) could have been used as well. The VizzAnalyzer was our choice since it supports automated processes: an interface (c) allows for batch processing of a list of projects and an export engine (d) to store the computed metrics in a database for later processing.…”
Section: Instrumentation
confidence: 99%
“…Static analysis tools that extract source measures can have different methods and approaches for quantifying certain properties (Lincke et al, 2008). As a result, they may produce inconsistent values for measures.…”
Section: Validity Threats
confidence: 99%
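The validity threat quoted above, that static analysis tools may produce inconsistent values for the same measure, is easy to demonstrate with a hypothetical sketch (the snippet and function names below are my own, not from any cited tool). Two common "lines of code" conventions applied to the same source fragment already give different counts:

```python
# Hypothetical sketch: two LOC conventions disagree on the same snippet.
SNIPPET = """\
// add two numbers
int add(int a, int b) {

    return a + b;  // sum
}
"""

def physical_loc(source):
    """Count every physical line, including blanks and comment-only lines."""
    return len(source.splitlines())

def logical_loc(source):
    """Count non-blank lines that are not pure comments."""
    stripped = (line.strip() for line in source.splitlines())
    return sum(1 for line in stripped if line and not line.startswith("//"))

print(physical_loc(SNIPPET))  # 5
print(logical_loc(SNIPPET))   # 3
```

A quality model fed by one tool's physical count and another tool's logical count would rank the same code differently, which is the threat to validity the citing authors flag.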