2008
DOI: 10.1109/jsyst.2008.2009190
A Visual Tradeoff Space for Formal Verification and Validation Techniques

Abstract: Numerous techniques exist for conducting computer-assisted formal verification and validation. The cost associated with these techniques varies, depending on factors such as ease of use, the effort required to construct correct requirement specifications for complex real-life properties, and the effort associated with instrumentation of the software under test. Likewise, existing techniques differ in their ability to effectively cover the system under test and its associated requirements. To aid softw…

Cited by 21 publications (19 citation statements). References 38 publications.
“…They differ in their abilities to describe complex system behavior, their ease of use to produce correct specifications, and their effectiveness in verifying the target code. (See [5] for a discussion of the various tradeoffs between different formal verification and validation (FV&V) techniques.) In [6], Heimdahl and Leveson discussed the needs for a specification language to create requirements models that minimize the semantic distance (i.e., the amount of effort to translate from one model to another) between the system's requirements specification and the mental model of the system in the stakeholder's mind as well as the semantic distance between the system's requirements specification and its implementation.…”
Section: Figure 1 Ambiguous Requirement Interpretations (mentioning)
confidence: 99%
“…They also differ in descriptive power: PLTL is strictly sub-regular and, therefore, weaker than any finite state machine formalism; hence enters Regular LTL [14], which combines PLTL with regular expressions. Nevertheless, as described by the V&V tradeoff cuboid of [10], some formal verification techniques use weaker FS languages to achieve greater verification coverage. Section 2 overviews this tradeoff cuboid in greater detail.…”
Section: Introduction (mentioning)
confidence: 99%
“…[1,2] describe an approach, Adaptive Runtime Verification (ARV), where overhead control, runtime verification with state estimation, and predictive analysis are all combined. This technique uses HMMs in the loop, as we do, but differs from our approach in that: (i) it is tailored for state estimation, while our approach performs RM on general log-files, and (ii) this technique has no accompanying rule library, online service, or visual audit. In fact, their technique is similar to an earlier technique published by the author [8], where formal specifications and HMMs in the loop are used to monitor a Kalman Filter.…”
[Figure: The coverage tradeoff cuboid [10]]
Section: Introduction (mentioning)
confidence: 99%
“…In other words, it is positioned as a hybrid between formal specification and run-time verification techniques (e.g., [D1,D2,DMS]) and statistical pattern detection. Section 8 addresses the possibility of extending our approach to utilize some of the above mentioned statistical techniques such as long-term memory, fat tailed distributions, and various other fractal properties.…”
Section: Introduction (mentioning)
confidence: 99%