2017
DOI: 10.1016/j.cam.2016.06.015

Approximation of reachable sets using optimal control and support vector machines

Abstract: We propose and discuss a new computational method for the numerical approximation of reachable sets for nonlinear control systems. It is based on the support vector machine algorithm and represents the set approximation as a sublevel set of a function chosen in a reproducing kernel Hilbert space. In some sense, the method can be considered as an extension to the optimal control algorithm approach recently developed by Baier, Gerdts and Xausa. The convergence of the method is illustrated numerically for selecte…
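The abstract describes representing a set approximation as a sublevel set of a function from a reproducing kernel Hilbert space, learned with the support vector machine algorithm. A minimal illustrative sketch of this idea (not the authors' algorithm; the sampling, labels, and kernel parameters here are assumptions, and scikit-learn is assumed to be available): label sample points as inside/outside a known set, fit a kernel SVM, and treat the zero-superlevel set of its decision function as the set approximation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(400, 2))                    # sample points in the plane
y = np.where(np.linalg.norm(X, axis=1) <= 1.0, 1, -1)    # +1 inside the unit disk

# RBF-kernel SVM: its decision function f lives in an RKHS
clf = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X, y)

# The set approximation is the 0-superlevel set {x : f(x) >= 0},
# which should roughly recover the unit disk.
inside = clf.decision_function(np.array([[0.2, 0.1]]))[0]
outside = clf.decision_function(np.array([[1.8, 1.5]]))[0]
print(inside > 0, outside < 0)
```

The same construction works for any set from which labelled samples can be drawn, which is what makes it usable for reachable sets computed pointwise by optimal control.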

Cited by 16 publications (12 citation statements). References 27 publications.
“…One simple form this method can take is to store all experienced items in a representational space with a similarity gradient around each for use in future prediction or classification. This is exemplified in recent kernel methods, which use a variety of similarity metrics and often remove redundant stored items to provide a more efficient representation [1]; for example, support vector machines use kernel functions to draw decision boundaries between categories [2], often resulting in highly accurate classification performance [3][4][5].…”
Section: Spatial Methods
confidence: 99%
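The citation statement above describes kernel methods that store experienced items in a representational space with a similarity gradient around each. A small sketch of that idea (the data and the RBF bandwidth are assumptions, not from the cited work): classify a query by a kernel-weighted vote over all stored exemplars.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Similarity gradient: stored items closer to b contribute more."""
    return np.exp(-gamma * np.sum((a - b) ** 2, axis=-1))

# Stored exemplars with class labels (+1 / -1)
X_stored = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [2.8, 3.1]])
y_stored = np.array([1, 1, -1, -1])

def classify(x):
    # Weighted vote of all stored items, weights given by the kernel
    w = rbf_kernel(X_stored, x)
    return int(np.sign(w @ y_stored))

print(classify(np.array([0.1, 0.0])))   # near the first cluster
print(classify(np.array([3.0, 2.9])))   # near the second cluster
```

Support vector machines refine this by keeping only the non-redundant stored items (the support vectors) that determine the decision boundary.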
“…Semantic matching extracts the corresponding semantic features and then performs semantic retrieval of user advertisements [18]. The semantic features of users and advertisements are extracted separately to calculate relevance.…”
Section: Related Work
confidence: 99%
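The statement above mentions extracting semantic features of users and advertisements separately and computing their relevance. One common way to score such relevance (an illustrative sketch; the embeddings and names here are hypothetical, and the cited work may use a different similarity) is cosine similarity between feature vectors:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

user_vec = [0.9, 0.1, 0.4]                                  # hypothetical user embedding
ad_vecs = {"ad_a": [0.8, 0.2, 0.5], "ad_b": [0.1, 0.9, 0.0]}  # hypothetical ad embeddings

# Rank advertisements by relevance to the user
ranked = sorted(ad_vecs, key=lambda k: cosine(user_vec, ad_vecs[k]), reverse=True)
print(ranked[0])
```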
“…5 and 6 show the scatter plot of iris measurements and the iris classification regions, respectively. Thus, it selects k nearest neighbors of x in the classification and places x into the class to which most of k neighbors belong [17], [18].…”
Section: Training of SVM Classifier
confidence: 99%
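The statement above describes k-nearest-neighbour classification: select the k nearest neighbours of x and assign x to the class held by most of them. A minimal sketch with made-up iris-like measurements (the data values are illustrative, not from the cited work):

```python
from collections import Counter
import math

def knn_classify(x, data, k=3):
    """Place x into the class to which most of its k nearest neighbours belong."""
    neighbours = sorted(data, key=lambda p: math.dist(x, p[0]))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy measurements: (petal length, petal width) -> species label
data = [((1.4, 0.2), "setosa"), ((1.3, 0.2), "setosa"),
        ((4.7, 1.4), "versicolor"), ((4.5, 1.5), "versicolor"),
        ((6.0, 2.5), "virginica"), ((5.9, 2.1), "virginica")]

print(knn_classify((1.5, 0.3), data))
```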
“…training pairs (x_i, y_i) with input x_i ∈ ℝ^d (d-dimensional input space) and class labels (target output) y_i ∈ {1, −1}. An SVM is employed to implement the classification [10]. This SVM constructs a hyperplane as a decision surface to maximize the margin of separation between positive and negative examples.…”
Section: EFD-based Algorithm and SVM
confidence: 99%
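The statement above says the SVM constructs a hyperplane w·x + b = 0 that maximizes the margin of separation, which for a canonical hyperplane (functional margin 1 on the closest points) equals 2/‖w‖. A short numerical check of that relation (the hyperplane and points here are chosen by hand for illustration):

```python
import numpy as np

# Hyperplane w.x + b = 0 separating two labelled points; margin = 2 / ||w||
w = np.array([1.0, 1.0])
b = -3.0

X = np.array([[1.0, 1.0], [2.0, 2.0]])   # one point per class
y = np.array([-1, 1])

# Functional margins y_i (w.x_i + b): both equal 1, so both points
# lie exactly on the margin boundaries of this canonical hyperplane.
f = y * (X @ w + b)
print(f)
print(2 / np.linalg.norm(w))   # geometric margin width
```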