2005
DOI: 10.1007/s00224-005-1233-3
Query Learning of Regular Tree Languages: How to Avoid Dead States

Cited by 21 publications (22 citation statements)
References 17 publications
“…This process terminates when the equivalence has converged to the Myhill-Nerode equivalence. The learners in [12,13,19] are all very similar. Following Angluin's original approach, they maintain an observation table, whose rows are indexed by trees representing the states and transitions of an automaton.…”
Section: Introduction
confidence: 89%
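
To make the observation-table idea mentioned in this excerpt concrete, here is a minimal Python sketch. It is an illustration only, assuming hypothetical names (`Tree`, `ObservationTable`, `membership`), not code from the cited learners [12,13,19]: rows are indexed by trees, columns by one-hole contexts, and cells are filled by membership queries.

```python
# Minimal sketch (hypothetical names, not code from the cited learners):
# an observation table for learning a regular tree language via membership
# queries, with rows indexed by trees and columns by one-hole contexts.

from itertools import product

class Tree:
    def __init__(self, label, children=()):
        self.label, self.children = label, tuple(children)
    def __repr__(self):
        if not self.children:
            return self.label
        return f"{self.label}({','.join(map(repr, self.children))})"

def plug(context, tree):
    """Substitute the tree for the unique hole '*' in the context."""
    if context.label == "*":
        return tree
    return Tree(context.label, [plug(c, tree) for c in context.children])

class ObservationTable:
    def __init__(self, membership):
        self.membership = membership      # oracle: Tree -> bool
        self.row_trees = []               # trees indexing the rows
        self.contexts = [Tree("*")]       # columns; start with the empty context
        self.entries = {}                 # (repr(tree), repr(context)) -> bool

    def fill(self):
        """Ask a membership query for every still-missing table cell."""
        for t, c in product(self.row_trees, self.contexts):
            key = (repr(t), repr(c))
            if key not in self.entries:
                self.entries[key] = self.membership(plug(c, t))

    def signature(self, t):
        """Row of the table; equal signatures suggest equivalent states."""
        return tuple(self.entries[(repr(t), repr(c))] for c in self.contexts)
```

Two row trees whose signatures agree on every context are tentatively identified as the same state; refining the set of contexts is what drives the convergence to the Myhill-Nerode equivalence mentioned in the excerpt.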
“…Angluin's original model was designed for learning ordinary deterministic finite automata (dfa). Its first generalization to bottom-up tree automata was proposed by Sakakibara in [21] and later improved in [12].…”
Section: Introduction
confidence: 99%
“…The point is that several decidability questions and related tasks can become significantly harder for trees as compared to strings. For instance, useless states pose no problem for string processing finite automata, but they do so for trees; see [67]. Also, there are certain intricacies that make a simple translation from the string to the tree case sometimes an error-prone task, as certified by the learner for representative samples and membership queries as discussed in [24,108].…”
Section: Learning Within or Beyond Regularity?
confidence: 98%
“…He considered skeletal languages, that is, the parse trees of context-free languages in which all internal nodes are unlabeled [131]. Drewes and Högberg considered regular tree languages in general [66], and proved that it is possible to avoid useless states in the inference process [67]. By doing so, the automaton A becomes a language-equivalent partial automaton with potentially exponentially fewer states.…”
Section: The Minimal Adequate Teacher Model (MAT)
confidence: 98%
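
To make the role of dead (useless) states concrete, here is a hedged Python sketch, assuming a dictionary-based encoding of a deterministic bottom-up tree automaton; the names `trim`, `transitions`, and `final_states` are illustrative and the code is not the construction of [67].

```python
# Hedged sketch (assumed encoding, not the construction of [67]): trim a
# deterministic bottom-up tree automaton so that only useful states remain,
# leaving a partial transition table with no dead states.

def trim(transitions, final_states):
    """transitions maps (symbol, tuple_of_child_states) -> result_state."""
    # 1. Accessible states: those produced by evaluating at least one tree.
    accessible = set()
    changed = True
    while changed:
        changed = False
        for (sym, children), target in transitions.items():
            if target not in accessible and all(c in accessible for c in children):
                accessible.add(target)
                changed = True
    # 2. Useful states: accessible states that some context (whose other
    #    positions can be filled by accessible trees) takes to a final state.
    useful = set(final_states) & accessible
    changed = True
    while changed:
        changed = False
        for (sym, children), target in transitions.items():
            if target in useful and all(c in accessible for c in children):
                for c in children:
                    if c not in useful:
                        useful.add(c)
                        changed = True
    # 3. Drop every transition that touches a useless (dead) state.
    return {(sym, children): target
            for (sym, children), target in transitions.items()
            if target in useful and all(c in useful for c in children)}
```

Because transitions into dead states are simply left undefined, the result is a partial automaton, which is what allows the state count to shrink, as the excerpt notes.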