Component-based development is gaining popularity in the software engineering community. The reliability of a system depends on the reliability of its components. Different models and theories have been developed to estimate system reliability from information about the system architecture and the quality of the components. However, these models almost always overlook a key attribute of component-based systems: the propagation of errors between components. We extend our previous work on Bayesian reliability prediction of component-based systems by introducing the error propagation probability into the model. We demonstrate the impact of error propagation in a case study of an automated Personnel Access Control System. We conclude that error propagation may have a significant impact on system reliability prediction and that future architecture-based models should therefore not ignore it.
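As a minimal numeric illustration of why error propagation matters (a sketch, not the Bayesian model described in the abstract), consider a two-component pipeline in which an error produced by the first component reaches the system output only with some propagation probability. The names `f1`, `f2`, and `e12` are illustrative assumptions:

```python
def series_failure_prob(f1, f2, e12):
    """Probability that a two-component pipeline produces a wrong output.

    Component 1 fails with probability f1, but its error reaches the
    output only if it propagates through component 2 (probability e12).
    Component 2 fails independently with probability f2.
    """
    return f2 + (1 - f2) * f1 * e12


# Setting e12 = 1 recovers the classic series model that ignores error
# masking; a lower e12 yields a noticeably different prediction.
pessimistic = series_failure_prob(0.1, 0.05, 1.0)   # propagation ignored
realistic = series_failure_prob(0.1, 0.05, 0.2)     # 20% propagation
```

With these illustrative numbers, ignoring error masking roughly doubles the predicted failure probability, which mirrors the abstract's claim that propagation can significantly change a reliability prediction.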
We contrast the performance of three algorithms for deciding whether a Partially Clairvoyant real-time system with relative timing constraints, as specified in the E-T-C scheduling framework, has a feasible schedule. In the E-T-C scheduling model, real-time scheduling problems are specified through a specialized class of constraint logic programs (CLPs) called Quantified Linear Programs (QLPs); algorithms for determining the schedulability of instances are therefore procedures for deciding the satisfiability of CLPs. Two of these algorithms, viz., the primal algorithm and the dual algorithm, have already been discussed in the literature, while a third, the randomized dual algorithm, has recently been proposed. Our experiments demonstrate that the dual-based algorithms (i.e., the dual and the randomized dual) are more effective from an implementational perspective; this is surprising, since all three algorithms have the same worst-case asymptotic complexity.
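To make the QLP setting concrete, a small sketch of universal quantifier elimination over linear constraints is shown below. It assumes (as an illustration, not as the algorithms from the paper) that each constraint has the form sum(coef * var) <= rhs and that one universally quantified variable ranges over a known interval; since the constraint must hold for every value in the interval, the variable is replaced by its worst-case endpoint:

```python
def eliminate_universal(constraints, var, lo, hi):
    """Eliminate a universally quantified variable var in [lo, hi].

    Each constraint is a dict {'coeffs': {name: coef}, 'rhs': c},
    meaning sum(coef * value) <= c. For the constraint to hold for
    all var in [lo, hi], substitute hi when var's coefficient is
    positive and lo when it is negative (the worst case).
    """
    out = []
    for con in constraints:
        coeffs = dict(con['coeffs'])
        b = coeffs.pop(var, 0)
        rhs = con['rhs'] - b * (hi if b > 0 else lo)
        out.append({'coeffs': coeffs, 'rhs': rhs})
    return out


# Toy instance: forall e in [1, 2]: s + e <= 10 and -s <= 0.
# Eliminating e leaves s <= 8 and -s <= 0, which is feasible.
cons = [{'coeffs': {'s': 1, 'e': 1}, 'rhs': 10},
        {'coeffs': {'s': -1}, 'rhs': 0}]
reduced = eliminate_universal(cons, 'e', 1, 2)
```

Repeating such eliminations from the innermost quantifier outward is one generic way a quantified linear system can be reduced to an ordinary feasibility check; the actual primal, dual, and randomized dual procedures compared in the paper differ in how they organize this work.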
Formal methods for the verification of software systems often face the problems of state explosion and complexity. We present a divide-and-conquer methodology that leads to component-based verification and analysis of formal requirements specifications expressed as Software Cost Reduction (SCR) models. The proposed methodology has the following steps: model partitioning, partition verification (either by model checking or testing), and composition of verification results. We define a novel decomposition methodology for SCR specifications based on min-cut graph algorithms. The methodology identifies components in given SCR specifications and automates the related abstraction methods. It also provides guidance on how to perform verification and validation of the formal system models. Further, we present a strategy for the verification of modular and decomposable software models. Efficient verification of SCR models is achieved through the use of invariants and proof composition. SCR specifications can be executed by the simulator and tested, either automatically (e.g., by random testing) or manually, guided by a domain expert using a visual interface mimicking the actual system. Some of the identified specification components may be simple enough to allow thorough testing, despite having large state spaces that cause problems for model-checking approaches. We define model test coverage measures and develop tools to track the achieved coverage during manual and random testing. Testing and coverage measures provide a degree of assurance in component correctness. Our experimental results also provide insight into the efficacy of random testing approaches for the verification of software models. Experimental validation of our methodology brought to light several concepts that have long been advocated in the software development community: modularity, encapsulation, information hiding, and the avoidance of global variables.
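The min-cut-based decomposition step can be illustrated generically with an s-t minimum cut on a small dependency graph. The sketch below uses the standard Edmonds-Karp max-flow algorithm and is an assumption-laden illustration, not the dissertation's tooling; nodes stand in for specification variables and edge capacities for coupling strength:

```python
from collections import deque


def min_cut_partition(n, edges, s, t):
    """Split nodes 0..n-1 into two parts by an s-t minimum cut.

    edges: list of directed (u, v, capacity) triples.
    Returns (cut_value, nodes_on_s_side), computed via Edmonds-Karp
    max-flow followed by a residual-graph reachability sweep.
    """
    cap = [[0] * n for _ in range(n)]
    for u, v, c in edges:
        cap[u][v] += c

    flow_val = 0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break  # no augmenting path: flow is maximal
        # Find the bottleneck capacity along the path.
        v, bottleneck = t, float('inf')
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v])
            v = parent[v]
        # Push the bottleneck flow along the path.
        v = t
        while v != s:
            cap[parent[v]][v] -= bottleneck
            cap[v][parent[v]] += bottleneck
            v = parent[v]
        flow_val += bottleneck

    # Nodes still reachable from s in the residual graph form one side
    # of the minimum cut; the rest form the other component.
    side = {s}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in side and cap[u][v] > 0:
                side.add(v)
                q.append(v)
    return flow_val, side


# Toy dependency graph: the cheapest separation of node 0 from node 3
# cuts the edges (0,2) and (1,3), giving components {0, 1} and {2, 3}.
value, component = min_cut_partition(
    4, [(0, 1, 3), (0, 2, 2), (1, 3, 2), (2, 3, 3)], 0, 3)
```

Partitioning along a minimum cut keeps the interface between the resulting components small, which is the property the decomposition methodology exploits when each partition is verified separately and the results are composed.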
The advantages of the compositional verification strategy are demonstrated in a case study that analyzes the Personnel Access Control System. Our approach offers significant savings in the time and memory required to perform formal system verification. Many people deserve credit for making this work possible. First of all, I would like to thank my advisor, Dr. Bojan Cukic, for his more than generous support during the years I worked on this dissertation. His unlimited wisdom, cheerful comments, and academic excellence inspired me during the difficult days and showed me the right way to pursue my interests. The guidance and education he gave me are priceless. I would like to thank Constance Heitmeyer for the wonderful internships I spent at the Naval Research Laboratory. The time and discussions with her on SCR, requirements, software engineering, and every other topic were invaluable. At NRL I had the chance to meet and enjoy the company of Ramesh Bharadwaj, Myla Archer,