Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. To address this problem, Jedlitschka and Pfahl produced reporting guidelines for controlled experiments in software engineering, and pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted. The aim of this paper is to present the method we used to evaluate the guidelines and to report the results of our evaluation exercise. We suggest our evaluation process may be of more general use if reporting guidelines for other types of empirical study are developed. We used a reading method inspired by perspective-based and checklist-based reviews to perform a theoretical evaluation of the guidelines. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer, and Author. Apart from the Author perspective, the reviews were based on a set of questions derived by brainstorming. A separate review was performed for each perspective; the review using the Author perspective considered each section of the guidelines sequentially. The reviews detected 44 issues where the guidelines would benefit from amendment or clarification, and 8 defects. Reporting guidelines need to specify what information goes into what section and avoid excessive duplication. The current guidelines need to be revised and then subjected to further theoretical and empirical validation. Perspective-based checklists are a useful validation method, but the Practitioner/Consultant perspective presents difficulties.
Because requirements engineering is recognized as critical to successful software projects, we surveyed a number of software practitioners regarding their software development practices during recent software projects. Relationships between requirements practices and software project outcomes enable us to better understand requirements issues and their relationship with project success. We asked three sets of questions directly related to requirements issues: 1) requirements practices, 2) the sponsor and customers/users, and 3) project management. Our respondents were from business organizations in the U.S. and Australia, and were almost exclusively involved in in-house software development. The most significant factors from each question set were: 1) the requirements were good, 2) there was a high level of customer/user involvement, and 3) the requirements were managed effectively. Overall, the best predictor of project success was the combination of good requirements and effective requirements management (93% of projects were predicted correctly). Our survey shows that effective project management is fundamental to effective requirements engineering.
We present an integrated approach to requirements engineering for organizational IT to help ensure IT-business strategy alignment. A single, unified model to enable validation of system requirements against business strategy is proposed. We use VMOST analysis to deconstruct business strategy. We then model strategy using a goal-oriented requirements engineering notation; this is done within the framework for modeling an organization's business strategy proposed by the Business Rules Group. We use Jackson's problem frames to represent business model context. Our approach is illustrated via an e-business case study of Seven-Eleven Japan taken from the literature.