2009
DOI: 10.1007/s10111-009-0134-7

Principles of adjustable autonomy: a framework for resilient human–machine cooperation


Cited by 55 publications (25 citation statements)
References 11 publications
“…(p. 88) Furniss et al as well as Benn, Healey and Hollnagel locate risk in the system environment, such as poor design, systems and processes [48], or as a "range of risks to safety posed by close proximity to potentially hazardous processes, medicines and equipment" [49] in a patient's journey through the healthcare system. Zieba et al [50], with their heritage in joint cognitive systems theory, locate risk in the complex interactions between human and machine agents. Bruyelle et al [51] also introduce resilience as a concept able to mitigate the risks of antagonistic threats.…”
Section: Risk
confidence: 99%
“…On the other hand, if this treatment produces a series of other dissonances and may fail, then it contributes to the vulnerability of the controlled system. The frequency of perturbations such as dissonances may have an impact on system resilience or vulnerability (Westrum 2006; Zieba et al 2010). The management of a regular dissonance increases knowledge about it and may converge towards a high, stable knowledge level, whereas a new dissonance can provoke instability that requires knowledge to be modified, refined or created.…”
Section: Dissonance Control and Knowledge Reinforcement
confidence: 99%
“…Then, dispositional dissonance relates to opposite knowledge about the same facts, epistemic dissonance concerns different beliefs about the sources of knowledge, and ontological dissonance covers different or opposite meanings of the same knowledge (Hunter and Summerton 2006). The last example of dissonance concerns affordances, which are based on relations between objects and the possible new actions those objects allow (Gibson 1986; Zieba et al 2010). Therefore, the dissonance discovery process consists in creating new relationships between objects and actions, and this process can concern several groups of users.…”
Section: Taxonomy of Dissonances
confidence: 99%
“…More recently, resilience- or vulnerability-based methods analyse, respectively, the success or the failure of recovery control of system stability (Hollnagel et al 2006; Zieba et al 2010; Ouedraogo et al 2013). These approaches aim at identifying the technical, human and organisational factors that make a system resilient or vulnerable in the face of particular situations, such as unpredictable or unprecedented events.…”
Section: What Are the Future Challenges for Risk Analysis?
confidence: 99%