“…Although the development of certification test methods and criteria is limited solely to the applied product, it is never easy to complete the process (e.g., draft, discuss, revise, and finalize) in such a short time. Moreover, a verification process, which usually requires multiple-staged tests or experiments, must also be followed to confirm the suitability of the test methods and thresholds developed [21]. Hence, more efficient ways of carrying out the whole compliance certification process have been constantly explored.…”
Section: Verification of Test Methods and Certification Criteria
The recent industrial transformation driven by Industry 4.0 and related emerging technologies, such as big data and the Internet of Things (IoT), has had a significant impact on key aspects of new product development. For example, new smart products are being developed in various ways through technological convergence. Convergence-oriented new products, however, may be harder to test and certify because they are incompatible with existing certification standards. Additionally, it may take several years to develop and verify new standards, possibly delaying the product launch. To address this problem, this study proposes that living labs can be used effectively to verify the test methods and certification criteria for new technological convergence products (NCPs). Specifically, the study develops a framework for analyzing the real-use environment of NCPs and deriving the primary living lab components that should be implemented for verification purposes. The framework is then applied to real NCP cases to check its validity. To the best of our knowledge, this study is among the first to elaborate the idea of using living labs as a methodology for verifying the test methods and certification criteria for NCPs. The specific methods developed here can serve as step-by-step guidelines for practitioners. Theoretically, the study also has significant implications for the living lab literature and context-related research.
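The derivation step the abstract describes (analyze the real-use environment, then pick the living lab components that reproduce it) can be sketched as a simple coverage mapping. All factor names, component names, and the catalogue here are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: mapping an NCP's real-use environment factors to the
# living lab components needed to reproduce them. Names are invented.
from dataclasses import dataclass, field

@dataclass
class UseEnvironmentFactor:
    name: str          # e.g. "ambient humidity"
    category: str      # e.g. "physical", "user", "regulatory"

@dataclass
class LivingLabComponent:
    name: str
    covers: list = field(default_factory=list)  # factor names this component reproduces

def derive_components(factors, catalogue):
    """Select the catalogue components needed to cover every environment factor."""
    selected = []
    uncovered = {f.name for f in factors}
    for comp in catalogue:
        hit = uncovered & set(comp.covers)
        if hit:
            selected.append(comp.name)
            uncovered -= hit
    # Uncovered factors signal gaps in the living lab design.
    return selected, sorted(uncovered)

factors = [
    UseEnvironmentFactor("ambient humidity", "physical"),
    UseEnvironmentFactor("elderly users", "user"),
]
catalogue = [
    LivingLabComponent("climate chamber", covers=["ambient humidity"]),
    LivingLabComponent("recruited user panel", covers=["elderly users"]),
]
selected, gaps = derive_components(factors, catalogue)
print(selected, gaps)  # → ['climate chamber', 'recruited user panel'] []
```

The returned gap list makes explicit which real-use conditions the planned lab cannot yet reproduce, which is the kind of check a verification framework would need before testing begins.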
“…One of the initial efforts to provide a reference metamodel for reducing the ambiguity behind regulations, standards, and compliance-related activities was the CCL (Common Certification Language) [9]. It proved useful for modelling safety standards such as EN 50128 for the railway domain, DO-178 for avionics [10], and ISO 26262 for automotive [11]. CCL evolved into the CACM (Common Assurance and Certification Metamodel) [5] to be more expressive in its connection with the actual system architecture [12].…”
Section: Modelling Reference Framework and Equivalence Maps
To assure certain critical quality properties (e.g., safety, security, or privacy), supervisory authorities and industrial associations provide reference frameworks such as standards or guidelines, which in some cases are enforced (e.g., regulations). Given the pace at which both technical advances and risks appear, the number of reference frameworks keeps growing. As several frameworks may apply to the same system, overlaps appear (e.g., regulations from the different countries where the system will operate, or generic standards used in conjunction with more concrete standards for a given industrial sector or system type). We propose the use of modelling to alleviate the complexity of these reference framework ecosystems, and we provide a tool-supported method to create the models for the benefit of different stakeholders. Our case study is based on privacy data protection, and more concretely on privacy impact assessment processes. The European GDPR regulates the movement and processing of personal data, and, in contrast to available software engineering privacy guidelines, articles in legal texts are usually difficult to translate into the underlying processes, artefacts, and roles they refer to. To facilitate mutual comprehension between legal experts and engineers, we investigate how mappings can be created between these two domains of expertise, relying on modelling as the central point. We modelled the legal requirements of the GDPR on data protection impact assessments; we then selected ISO/IEC 29134, a mainstream engineering guideline for privacy impact assessment, and, taking a concrete sector as an example, the EU Smart Grid Data Protection Impact Assessment template. The OpenCert tool provided technical support for both the modelling and the systematic creation of the mapping models.
We provide a qualitative evaluation from legal experts and privacy engineering practitioners to report on the benefits and limitations of this approach.
“…They estimate the cost savings of using a new technical reference architecture to reduce safety certification cost, but without a substantial change in the underlying life-cycle activities. An example where new life-cycle activities are introduced can be found in [15], where cost is estimated in several use cases in which advanced safety assurance practices are adopted. Their estimations are based on values obtained directly from expert judgement (i.e., interviews).…”
Safety and security concerns are usually interlinked when building critical software-intensive systems of systems. Several efforts try to bring both domains of expertise closer together, to increase the overall reliability of the systems and to reduce costs through earlier detection of issues and trade-offs. Despite the growing number of co-engineering practices at different life-cycle stages, there is a lack of business justification for their adoption, such as an analysis of the economic costs. We report on using a cost model to evaluate the convenience (or not) of adopting co-engineering practices in two industrial case studies (space and medical devices). Simulation results with the collected data suggest an improvement in quality if any of the selected co-engineering practices are integrated, while cost increases in one case and decreases in the other. We discuss the results, but as they cannot be generalized, the main contribution is the proposed cost model for answering the title's question.
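The kind of simulation the abstract describes can be sketched as a Monte Carlo run over expert-judgement cost estimates. The cited works gather estimates via interviews; a common way to encode such judgements is a (min, most likely, max) triple sampled from a triangular distribution. The activity breakdown and all numbers below are invented for illustration and are not the study's actual cost model or data.

```python
# Hedged sketch: comparing total process cost with and without a
# co-engineering practice, using triangular distributions over
# expert-judgement estimates. All figures are illustrative assumptions.
import random

def simulate_total_cost(activities, runs=10_000, seed=42):
    """Sample each activity's cost from a triangular distribution and average the totals."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        # random.triangular takes (low, high, mode).
        totals.append(sum(rng.triangular(lo, hi, mode) for lo, mode, hi in activities))
    return sum(totals) / runs

# Expert-judgement estimates in person-days: (min, most likely, max).
baseline = [(10, 14, 20), (30, 40, 60), (15, 20, 30)]      # separate safety and security reviews
co_engineered = [(12, 16, 24), (25, 32, 45), (8, 12, 18)]  # joint reviews, earlier issue detection

delta = simulate_total_cost(co_engineered) - simulate_total_cost(baseline)
print(f"expected cost delta: {delta:+.1f} person-days")
```

With these made-up numbers the co-engineered process comes out cheaper; flipping a few estimates makes it more expensive, mirroring the study's finding that cost can go either way depending on the case.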