2020
DOI: 10.26686/wgtn.13490913.v1
Preprint

Towards a framework for certification of reliable autonomous systems

Abstract: A computational system is called autonomous if it is able to make its own decisions, or take its own actions, without human supervision or control. The capability and spread of such systems have reached the point where they are beginning to touch much of everyday life. However, regulators grapple with how to deal with autonomous systems, for example how could we certify an Unmanned Aerial System for autonomous use in civilian airspace? We here analyse what is needed in order to provide verified reliable behaviour…

Cited by 5 publications (4 citation statements)
References 105 publications (5 reference statements)
“…The above-mentioned standards do not specifically address autonomy, such as adaptations to regulations for certification of aerospace systems incorporating different levels of autonomy [6]. However, in the context of autonomy, artificial intelligence, and machine learning in particular, EASA has recently published new guidance documents [7][8][9][10].…”
Section: Related Work: Aerospace Regulations (mentioning, confidence: 99%)
“…From a certification perspective, the aircraft is not safe. However, the operation may be safe if flights only occur in unpopulated areas, so the operation is therefore at an even higher level of abstraction than the aircraft. The above-mentioned standards do not specifically address autonomy, such as adaptations to regulations for certification of aerospace systems incorporating different levels of autonomy [6]. However, in the context of autonomy, artificial intelligence, and machine learning in particular, EASA has recently published new guidance documents [7][8][9][10].…”
mentioning, confidence: 99%
“…It is worth mentioning that CPS are often associated with the concepts of distributed embedded systems and smart systems, with which they share aspects of complexity, autonomy and criticality [17]. The interest in resilient CPS has grown in the last years as witnessed by the numerous projects and publications on related topics [7].…”
Section: From Cyber-Physical System Resilience to Trustworthy Autonomy (mentioning, confidence: 99%)
“…While predictability was the key for the assessment of legacy systems such as trains or airplanes, nowadays there is a paradigm shift toward smarter systems based on artificial intelligence (AI), which have the advantage of learning and adapting to new situations; however, those characteristics also introduce a high level of uncertainty that complicates their analysis and certification [7]. Therefore, we are witnessing an apparent paradox where systems have the potential of becoming more dependable due to their higher intelligence, but that can also reduce the level of trust we hold in those systems.…”
Section: Introduction (mentioning, confidence: 99%)