Abstract. The Agile Web Engineering (AWE) Process is an agile, lightweight process created to tackle the challenges identified in Web engineering: short development life-cycle times; multidisciplinary development teams; and delivery of bespoke solutions comprising software and data. AWE helps teams identify and manage the interactions between the business, domain, software, and creative design strands of Web engineering projects. This paper gives an overview of the wide diversity of stakeholder roles reflected within AWE and of how AWE seeks to ensure communication between multidisciplinary sub-teams on large Web engineering projects.
Most research into persistent programming has been directed towards the design and implementation of languages and object stores. There are few reports on the characteristics of systems that exploit such technology. This paper reports on a study of the source code of 20 applications comprising more than 108,000 lines of persistent language code, written by authors ranging from students to experienced programmers. The programs have been categorised and examined with respect to a persistent application model, and the extent of inconsistencies relative to this model is presented. The results confirm the need for, and give input to the design of, programming methodologies and tools for persistent software engineering. Measurements also cover the use of names, types, (polymorphic) procedures, and persistent bindings. It is hoped that analysis of these measurements will inform the next generation of languages and programming environments. As part of this new generation, a measurement system operating entirely within the persistent environment is outlined, thus simplifying access to and measurement of both static and dynamic information.
Objective: To improve the qualitative data obtained from software engineering experiments by gathering feedback during experiments. Rationale: Existing techniques for collecting quantitative and qualitative data from software engineering experiments do not provide sufficient information to validate or explain all our results. Therefore, we would like a cost-effective and unobtrusive method of collecting feedback from subjects during an experiment to augment other sources of data. Design of study: We formulated a set of qualitative questions that might be answered by collecting feedback during software engineering experiments. We then developed a tool to collect such feedback from experimental subjects. This feedback-collection tool was used in four different experiments, and we evaluated the usefulness of the feedback obtained in the context of each experiment. The feedback data were triangulated with other sources of quantitative and qualitative data collected for the experiments. Results: We have demonstrated that the collection of feedback during experiments provides useful additional data to: validate the data obtained from other sources about solution times and quality of solutions; check process conformance; understand problem-solving processes; identify problems with experiments; and understand subjects' perception of experiments. Conclusions: Feedback collection has proved useful in four experiments, and we intend to use the feedback-collection tool in a range of other experiments to further explore the cost-effectiveness and limitations of this technique. A systematic study is also necessary to more fully understand the impact of the feedback-collection tool on subjects' performance in experiments.