“…Traceability Assessment is a technique to ensure that the modeling process accomplishes all the requirements and that the requirements match the design. Currently, there is some effort to systematize and automate these methods, as in [63,64], but since this process is not fully automated and relies on subjective judgments, we keep this technique informal. The same applies to Interface Analysis, both for User Interface Analysis [65] and for Model Interface Analysis [66], where some subjective assessments must still be made.…”
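At its core, such an assessment checks that every requirement is traced to at least one design element. The following is a minimal sketch of that coverage check, not the approach of [63,64]; the requirement and design-element identifiers are invented for illustration.

```python
# Hypothetical sketch of a requirements-to-design traceability check.
# A trace link is a (requirement, design_element) pair.

def uncovered_requirements(trace_links, requirements):
    """Return the requirements with no trace link to any design element."""
    covered = {req for req, _design in trace_links}
    return sorted(set(requirements) - covered)

# Illustrative data (all names are invented):
links = [("REQ-1", "ModelCore"), ("REQ-2", "InputSchema")]
reqs = ["REQ-1", "REQ-2", "REQ-3"]
print(uncovered_requirements(links, reqs))  # ['REQ-3']
```

A non-empty result flags exactly the gap an analyst would then inspect by hand, which is where the subjective part of the technique remains.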
A simulation model, and more generically, a model, is founded on its assumptions. Assurance of the model’s correctness and correct use is needed to achieve accreditation. Often, work on a specific code loses sight of the overall process, concentrating resources on model coding and neglecting the resources needed to validate every step of the model’s definition and coding. The goal of this work is to present a methodology to help in the definition and use of the assumptions in the modeling process. To do so, we present a process to conduct a simulation project, an assumptions taxonomy, and a method that simplifies working with those assumptions. We propose to extend the traditional Validation, Verification, and Accreditation process to one composed of eight Validation, Verification, and Accreditation phases that cover the overall life cycle of a model. Although this paper focuses on a simulation model, the proposed method can be extended to a more general modeling approach.
“… Rempel & Mäder (2016) also focus on traceability difficulties, providing an assessment model and a comprehensive classification of possible traceability problems and assessment criteria for systematically detecting those problems.…”
Background
The benefits of requirements traceability, such as improvements in software product and process quality, early testing, and software maintenance, are widely described in the literature. Requirements traceability is a critical, widely accepted practice. However, it is often not applied for fear of the additional costs associated with manual effort or the use of additional tools.
Methods
This article presents a “low-cost” mechanism for automating requirements traceability based on the model-driven paradigm, formalized by a metamodel for the creation and monitoring of traces and an integration process for traceability management. This approach can also be useful for information fusion in industry insofar as it facilitates data traceability.
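The abstract does not reproduce the metamodel itself, but the idea of recording trace links as a by-product of model transformations can be pictured as follows. This is a hypothetical sketch, not the authors’ metamodel; the link attributes and identifiers are assumptions chosen for illustration.

```python
# Hypothetical sketch: trace links recorded automatically as a
# model-driven transformation runs, then queried for change impact.
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class TraceLink:
    source: str  # e.g. a requirement identifier
    target: str  # e.g. a generated model element or artifact
    rule: str    # the transformation rule that produced the link

@dataclass
class TraceModel:
    links: List[TraceLink] = field(default_factory=list)

    def record(self, source: str, target: str, rule: str) -> None:
        """Called by the transformation engine; no manual effort needed."""
        self.links.append(TraceLink(source, target, rule))

    def impacted_by(self, source: str) -> List[str]:
        """Elements to revisit when a requirement changes."""
        return [link.target for link in self.links if link.source == source]

tm = TraceModel()
tm.record("REQ-1", "LoginForm", "rule:ui-gen")
tm.record("REQ-1", "UserTable", "rule:db-gen")
tm.record("REQ-2", "LoginForm", "rule:ui-gen")
print(tm.impacted_by("REQ-1"))  # ['LoginForm', 'UserTable']
```

Because the links are emitted by the same machinery that generates the code, traceability comes “for free” relative to a manual trace matrix, which is the cost argument the article makes.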
Results
This article extends an existing model-driven development methodology to incorporate traceability as part of its development tool. The tool has been used successfully by several companies in real software development projects, helping developers to manage ongoing changes in functional requirements. One of those projects is cited as an example in the paper. The authors’ current work leads them to conclude that a model-driven engineering approach, traditionally used only for the automatic generation of code in a software development process, can also be used to successfully automate and integrate traceability management without additional costs. The systematic evaluation of traceability management in industrial projects constitutes a promising area for future work.
“…Several researchers have proposed techniques for continuously assessing and maintaining software traceability [4]. EBT (Event Based Traceability) uses a publish-subscribe model to notify developers when trace links need to be updated [30], while Rempel et al. proposed an automated traceability assessment approach for continuously assessing the compliance of traceability to regulations in certified products [31], [32]. These approaches are orthogonal to our work as they are process-unaware, and hence provide little guidance on the step in a process at which a trace link must be available.…”
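The publish-subscribe idea behind EBT can be sketched in a few lines: artifacts publish change events, and subscribers holding trace links to those artifacts are notified that their links may be stale. This is a minimal illustration in the spirit of EBT [30], not its actual implementation; the artifact names are invented.

```python
# Minimal publish-subscribe sketch in the spirit of Event Based
# Traceability: a change to an artifact notifies every subscriber
# whose trace links depend on it.

class EventBasedTraceability:
    def __init__(self):
        self._subs = {}  # artifact id -> list of callbacks

    def subscribe(self, artifact, callback):
        self._subs.setdefault(artifact, []).append(callback)

    def publish(self, artifact, event):
        for callback in list(self._subs.get(artifact, [])):
            callback(artifact, event)

bus = EventBasedTraceability()
stale = []
# A developer maintaining links from REQ-7 registers for its changes:
bus.subscribe("REQ-7", lambda art, ev: stale.append((art, ev)))
bus.publish("REQ-7", "requirement text changed")
print(stale)  # [('REQ-7', 'requirement text changed')]
```

The snippet also makes the quoted criticism concrete: nothing here knows *where in the process* the notified link matters, which is exactly the process-awareness the surrounding work adds.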
Regulations, standards, and guidelines for safety-critical systems stipulate stringent traceability but do not prescribe the corresponding, detailed software engineering process. Given the industrial practice of using only semi-formal notations to describe engineering processes, processes are rarely "executable", and developers have to spend significant manual effort in ensuring that they follow the steps mandated by quality assurance. The size and complexity of systems and regulations make manual, timely feedback from Quality Assurance (QA) engineers infeasible. In this paper we propose a novel framework for tracking processes in the background, automatically checking QA constraints depending on process progress, and informing the developer of unfulfilled QA constraints. We evaluate our approach by applying it to two different case studies: an open-source community system and a safety-critical system in the air-traffic control domain. Results from the analysis show that trace links are often corrected or completed after the fact, so timely and automated constraint checking has significant potential to reduce rework.
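The core mechanism, checking QA constraints as the process progresses and reporting the unfulfilled ones, can be sketched as follows. This is a hypothetical illustration of the general technique, not the framework described in the abstract; the step names, state keys, and constraints are all assumptions.

```python
# Hypothetical sketch: QA constraints bound to process steps. When a
# step completes, its constraints are checked against the project state
# and any unfulfilled ones are reported back to the developer.

constraints = {
    "design": [
        lambda s: ("every requirement traced",
                   set(s.get("requirements", [])) <= set(s.get("traced", []))),
    ],
    "review": [
        lambda s: ("review signed off", bool(s.get("signed_off"))),
    ],
}

def unfulfilled(step, state):
    """Return names of the QA constraints not met for a finished step."""
    return [name
            for check in constraints.get(step, [])
            for name, ok in [check(state)]
            if not ok]

# REQ-2 has no trace link yet, so finishing "design" raises a warning:
state = {"requirements": ["REQ-1", "REQ-2"], "traced": ["REQ-1"]}
print(unfulfilled("design", state))  # ['every requirement traced']
```

Running such checks in the background after each step is what turns the after-the-fact link corrections observed in the case studies into timely feedback.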