The idea that subjects often use imagery to discriminate semantically similar sentences was tested in three experiments. In the first experiment, subjects heard subject-verb-object sentences in the context of either a comprehension task or an image-generation task. Their memory for the sentences was tested using a two-alternative forced-choice recognition test in which different types of distractor sentence were used. A sentence semantically similar to the target sentence was one type; a sentence with the same subject and object nouns as the target sentence, but dissimilar in meaning, was another type; and a sentence similar in meaning to one of the stimulus sentences, but not to the target sentence, was a third type. The results showed that the image-generation instructions enhanced later recognition performance, but only for semantically similar test items. A second experiment showed that this finding only holds for high-imagery sentences containing concrete noun concepts. A third experiment demonstrated that the enhanced recognition performance could not be accounted for in terms of a semantic model of test-item discrimination. Collectively, the results were interpreted as providing evidence for the notion that subjects discriminate the semantically similar test items by elaborating the sentence encodings through image processing.
The paper begins by defining a class of distributed memory machines which have useful properties as retrieval and filtering devices. These memory mechanisms store large numbers of associations on a single composite vector. They provide a natural format for encoding the syntactic and semantic constraints associated with linguistic elements. A computational architecture for parsing natural language is proposed which utilises the retrieval and associative features of these devices. The parsing mechanism is based on the principles of Lexical Functional Grammar, and the paper demonstrates how these principles can be derived from the properties of the memory mechanisms.