2006
DOI: 10.1016/j.ejor.2005.07.006

An interpretive approach to evaluating information systems: A content, context, process framework

Cited by 119 publications (110 citation statements)
References 51 publications (96 reference statements)

“…4). The contributions reflect learning from successes and mistakes of developing, instantiating and evaluating the ensemble artifact of performance indicators as content, change context description, and the assessment process including application of indicators and the use of the resulting measures [18]. …”
Section: Table 2 Design Principles (mentioning)
confidence: 99%
“…processes, services, business units) can be described by performance indicators. Our empirical research is centered on an ensemble artifact of performance indicators (content), description of the eGovernment initiative (context), and their use in the assessment (process) of effects from eGovernment initiative [18]. …”
Section: A Theory-ingrained Artifact (mentioning)
confidence: 99%
“…and 'when'); and (3) Content ('what') (Stockdale & Standing, 2006). The effectiveness and performance evaluation of a design framework such as this should be carried out also using semi-structured interviews of designers, including their formal evaluation, adhering to appropriate comparison criteria (Song & Sakao, 2017).…”
Section: Evaluation Phase (mentioning)
confidence: 99%
“…Second, a framework to investigate equivocal situations in practice is formed by linking literature from IS/IT evaluation and IS/IT project continuation decisions. Third, the causes of equivocality when evaluating and deciding the continuation of on-going IS/IT projects are investigated using extensive interviews with experts based on the framework drawn from Stockdale and Standing (2006), an extension of the content, context, and process (CCP) framework and of Goldkuhl and Lagsten's (2012) conceptual practice model of evaluation (CPME). Insights on equivocal situations are gleaned from the perspectives of different decision-makers or stakeholders.…”
Section: Why Do Equivocal Situations Occur When Evaluating and Deciding… (mentioning)
confidence: 99%
“…These frameworks suggest evaluation is a practice and a process to acquire additional knowledge related to an object that consists of several interconnected elements, for instance the evaluation criteria and evaluation procedure, within the context, content and process (Goldkuhl and Lagsten 2012). The broad view of these frameworks provides useful insights into the constituents of evaluation and the guidelines to establish key issues during evaluation (Stockdale and Standing 2006). We draw on these two and derive a new framework that pulls together the characteristics of equivocal situations, evaluation frames and their corresponding causes of equivocality, as depicted in Figure 6.…”
(mentioning)
confidence: 99%