2002
DOI: 10.1016/s0951-8320(01)00092-8

Using model checking to help discover mode confusions and other automation surprises

Cited by 157 publications (95 citation statements). References 7 publications.
“…Automation surprises, difficult decisional compromises between alternatives, or barrier removals are other examples of inconsistency. Automation surprise is a conflict of intention between an automated system and its user (Rushby 2002; Inagaki 2008), which can occur as a result of a number of factors, one of which is the lack of transparency. Relaxing safety constraints can lead to the discovery of new alternative action plans (Ben Yahia et al 2015), or to the discovery of the best compromise between performance criteria (Chen et al 2014).…”
Section: Taxonomy of Dissonances
confidence: 99%
“…Another approach is to encode assumptions about the user directly into the model (cf. [14]). In this case the separation between the device model and the user assumptions is not clear, and can bias the user assumptions towards those that are needed to make the system work.…”
Section: Discussion
confidence: 99%
“…Trace_resourced = Action_resourced*  (13)
Action_resourced = (Action_user × PResource) + Action_system  (14)
Hence, for each user action we can include a set of needed resources. This information is not available in the original CTT model, and must be provided in order for the analysis to take place.…”
Section: Tasks and Resources
confidence: 99%
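The two equations above say that a resourced trace is a finite sequence of resourced actions, and that a resourced action is either a user action paired with a set of resources (the `PResource` component) or a plain system action. A minimal sketch of that sum-and-sequence structure, with all type and field names invented here for illustration (they are not from the cited paper):

```python
# Illustrative model of equations (13)-(14); names are assumptions, not the paper's.
from dataclasses import dataclass

@dataclass(frozen=True)
class UserAction:
    name: str
    resources: frozenset = frozenset()  # PResource: resources this action needs

@dataclass(frozen=True)
class SystemAction:
    name: str

# Action_resourced = (Action_user × PResource) + Action_system  -- a sum type
ResourcedAction = (UserAction, SystemAction)

# Trace_resourced = Action_resourced*  -- a finite sequence of resourced actions
def make_trace(*actions):
    assert all(isinstance(a, ResourcedAction) for a in actions)
    return tuple(actions)

trace = make_trace(
    UserAction("press_button", frozenset({"display", "hand"})),
    SystemAction("update_mode"),
)
```

The sum type is modelled as a tuple of classes so `isinstance` can check membership; system actions deliberately carry no resource set, matching the `+ Action_system` summand.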
“…John Rushby and his colleagues (Crow, Javaux & Rushby, 2000; Rushby, 2001, 2002) analyzed human-automation interaction, demonstrating the use of theorem-provers and model-checkers to explain deviations of pilots' mental models from correct models of autopilot behavior. They showed how formal methods could be used in a cycle of analysis, re-design, and re-analysis to improve a human-machine system.…”
Section: Related Work
confidence: 99%