Concurrent applications are widely developed, yet there is no systematic approach for testing them from requirements descriptions. Methods designed for sequential applications are inadequate for validating the reliability of concurrent applications, and manual testing is expensive and time consuming. It is therefore desirable to generate test cases automatically from requirements descriptions. This paper proposes an automated approach for generating test cases for concurrent applications from requirements descriptions written in the Scenario language. A scenario describes a specific situation of the application through a sequence of episodes; episodes execute tasks, some of which can run concurrently, and these descriptions reference relevant words or phrases (shared resources) from the application's lexicon. In this process, a directed graph is derived for each scenario and represented as a UML activity diagram. Because of the many possible interactions among concurrent tasks, test scenario explosion becomes a major problem; this explosion is controlled by adopting the interaction-sequences and exclusive-paths strategies. The feasibility of the proposed approach is demonstrated through two case studies.