2004
DOI: 10.3127/ajis.v12i1.104

On Understanding Evaluation of Tool Support for IS Development

Abstract: Evaluation of IS tools has received considerable attention in the literature, yet no consensus is apparent on characterising the important dimensions of an evaluation process. From an analysis of the literature, we identify a weakness in extant methods to support evaluation activities, and posit a framework for IS tool evaluation. We conclude that there is a need for enriched method support to deal with the complex socio-technical issues involved in the summative assessment of IS tools.

Cited by 2 publications (3 citation statements); references 51 publications (53 reference statements).
“…Our specific assessment of the health for two Open Source communities may have broader implications for evaluation and assessment of software systems. Evaluation of software systems has many dimensions and it is widely acknowledged to be a complex activity (Lundell and Lings 2004). Our specific strategy used for assessing health of Open Source ecosystems has certain similarities with, and may contribute to, previously proposed approaches (e.g.…”
Section: Discussion
confidence: 89%
“…OpenBRR, QSOS, OMM) for evaluation and assessment of Open Source projects (Petrinja et al 2010). Common to all these are that they are based on an a-priori evaluation framework (Lundell and Lings 2004). For example, QSOS features an intrinsic durability category which includes metrics such as activity on releases and number of developers.…”
Section: Discussion
confidence: 99%
“…2. See Lundell and Lings (2004b) for an elaboration of the role of an evaluation framework in an evaluation activity. 3.…”
Section: Notes
confidence: 99%