2021
DOI: 10.1007/978-3-030-85672-4_20

Improved SAT Models for NFA Learning

Cited by 5 publications (3 citation statements)
References 12 publications
“…In [12], we proposed two strategies based on metaheuristics for optimizing models, and consequently for optimizing the split of each word of S. For both of them, the search space corresponds to all the hybrid models, i.e., all the possible splits for all words of S. The fitness we used is:…”
Section: Previous Models
Confidence: 99%
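The fitness itself is truncated in the quote above and is not reproduced here. As a rough illustration only, the sketch below shows what a search over word splits can look like: one split index per word of S (here read as "prefix part encoded with one model, suffix part with the other", which is our reading of "hybrid models"), explored by a simple hill climber. The objective model_size is a hypothetical placeholder standing in for the actual fitness of [12]; it assumes nothing beyond the quoted text.

import random

def model_size(splits, words):
    # Hypothetical stand-in for the real objective (e.g., the size of the
    # hybrid model induced by `splits`); here a dummy quadratic cost.
    return sum((s - len(w) // 2) ** 2 for s, w in zip(splits, words))

def hill_climb(words, steps=1000, seed=0):
    rng = random.Random(seed)
    # One split point per word: words[i][:splits[i]] goes to one sub-model,
    # words[i][splits[i]:] to the other (a "hybrid" model).
    splits = [rng.randint(0, len(w)) for w in words]
    best = model_size(splits, words)
    for _ in range(steps):
        i = rng.randrange(len(words))
        cand = splits[:]
        cand[i] = rng.randint(0, len(words[i]))   # move one split point
        score = model_size(cand, words)
        if score <= best:                         # accept non-worsening moves
            splits, best = cand, score
    return splits, best

print(hill_climb(["abba", "ba", "aab"]))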
“…This task is called modeling; the result (variables and constraints) is a model, and a model together with some data forms an instance. In our case, the model is an NFA inference model and the data are S and k. For example, an INLP (Integer Non-Linear Programming) model for inferring NFAs is given in [10], a SAT (the propositional satisfiability problem [11]) model is given in [2], and several improvements of the SAT model are given in [12] and [1].…”
Section: Introduction
Confidence: 99%
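The cited encodings of [2], [10], [12], and [1] are not reproduced here. As a rough illustration of how "a model together with the data S and k forms an instance", the sketch below builds one deliberately naive SAT encoding of k-state NFA inference from a sample S = (S+, S-): Boolean variables for transitions and final states, exact prefix-reachability variables per word, acceptance clauses for positive words, and rejection clauses for negative words. It assumes the python-sat package (pysat); all helper and variable names (infer_nfa, var, ...) are ours, not taken from the cited papers.

from itertools import count
from pysat.solvers import Glucose3

def infer_nfa(S_pos, S_neg, k, alphabet):
    """Search for a k-state NFA (initial state 0) consistent with S = (S_pos, S_neg)."""
    new_id = count(1)
    ids = {}
    def var(*key):                      # one Boolean variable per distinct key
        if key not in ids:
            ids[key] = next(new_id)
        return ids[key]

    d = lambda a, p, q: var('d', a, p, q)   # transition p --a--> q exists
    f = lambda q: var('f', q)               # state q is final

    solver = Glucose3()
    for w, positive in [(w, True) for w in S_pos] + [(w, False) for w in S_neg]:
        r = lambda i, q, w=w: var('r', w, i, q)   # q reachable after reading w[:i]
        solver.add_clause([r(0, 0)])              # start exactly in state 0
        for q in range(1, k):
            solver.add_clause([-r(0, q)])
        for i, a in enumerate(w):
            for q2 in range(k):
                step = []
                for q1 in range(k):
                    t = var('t', w, i, q1, q2)    # t <-> r(i,q1) AND d(a,q1,q2)
                    solver.add_clause([-t, r(i, q1)])
                    solver.add_clause([-t, d(a, q1, q2)])
                    solver.add_clause([t, -r(i, q1), -d(a, q1, q2)])
                    solver.add_clause([-t, r(i + 1, q2)])
                    step.append(t)
                solver.add_clause([-r(i + 1, q2)] + step)   # exact reachability
        n = len(w)
        if positive:                              # accept: some reachable final state
            acc = []
            for q in range(k):
                a_q = var('acc', w, q)
                solver.add_clause([-a_q, r(n, q)])
                solver.add_clause([-a_q, f(q)])
                acc.append(a_q)
            solver.add_clause(acc)
        else:                                     # reject: no reachable final state
            for q in range(k):
                solver.add_clause([-r(n, q), -f(q)])

    if not solver.solve():
        return None                               # no k-state NFA fits S
    model = set(solver.get_model())
    delta = [(p, a, q) for a in alphabet for p in range(k) for q in range(k)
             if ids.get(('d', a, p, q)) in model]
    finals = [q for q in range(k) if ids.get(('f', q)) in model]
    return delta, finals

# Toy instance: S+ = {"a", "aa"}, S- = {"", "b"}, k = 2, alphabet {a, b}.
print(infer_nfa(["a", "aa"], ["", "b"], 2, "ab"))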