1992
DOI: 10.1007/978-1-4613-9229-3_4
Complexity Issues in Robotic Machine Learning of Natural Language

Cited by 6 publications (7 citation statements)
References 3 publications
“…The process of modelling child language acquisition is very complex, as many of the first attempts confirmed (Feldman et al, 1990; Suppes, Liang & Böttner, 1991). Rather than modelling the process in entirety, an undoubtedly daunting task, modellers took the simplified approach of focusing upon individual linguistic behaviours, leading to much research into relatively constrained problems such as understanding over- and undergeneralisation errors (Plunkett, Sinha, Moller & Strandsby, 1992), single word learning (Regier, 2005), syntactic category acquisition (Redington, Chater & Finch, 1988) and past tense learning (Rumelhart & McClelland, 1986).…”
Section: Introduction
confidence: 99%
“…The details of the learning curves that are generated from our procedure of machine learning are as described in the axioms but will not be analyzed here, because we already have a very thorough discussion of learning curves in our earlier paper (Suppes, Liang, & Böttner, 1992). We also do not analyze here in detail the learning of denotational value as characterized in Axiom 2.3.…”
Section: Grammars Generated
confidence: 99%
“…*The results reported were originally presented at the Eighth Amsterdam Colloquium, December 17-20, 1991. Our axioms, given in Section 2, for machine learning of comprehension are long and complicated, even though we analyze here the learning of relatively small fragments of the natural languages considered, but they do faithfully reflect the computer implementation, as well as extend the axioms given in our earlier paper (Suppes, Liang, & Böttner, 1992). We do believe the learning principles expressed in the axioms can serve as the basis for learning much more extended fragments and represent concepts that must be present in any system that starts with no knowledge of the natural language to be learned.…”
Section: Introduction and Overview
confidence: 99%