WordNet (Miller, 1995)
DOI: 10.1145/219717.219748

Abstract: Because meaningful sentences are composed of meaningful words, any system that hopes to process natural languages as people do must have information about words and their meanings. This information is traditionally provided through dictionaries, and machine-readable dictionaries are now widely available. But dictionary entries evolved for the convenience of human readers, not for machines. WordNet provides a more effective combination of traditional lexicographic information and modern…

Cited by 10,351 publications (3,812 citation statements). References 4 publications.

“…Linguists have tried to characterize the meaning of a word with feature-based approaches, such as semantic roles (Kipper et al, 2006), as well as word-relation approaches, such as WordNet (Miller, 1995). Computational linguists have demonstrated that a word's meaning is captured to some extent by the distribution of words and phrases with which it commonly co-occurs (Church and Hanks, 1990).…”
Section: Introduction (mentioning)
confidence: 99%
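
The co-occurrence claim cited here (Church and Hanks, 1990) is usually operationalized as a pointwise mutual information score over word pairs. A minimal Python sketch, assuming a toy whitespace-tokenized corpus and a small fixed co-occurrence window (both are illustrative choices, not details from the cited work):

```python
import math
from collections import Counter

def pmi_scores(sentences, window=2):
    """Pointwise mutual information for word pairs that co-occur within
    a small window, in the spirit of Church and Hanks (1990)."""
    word_counts = Counter()
    pair_counts = Counter()
    total_pairs = 0
    for sentence in sentences:
        tokens = sentence.lower().split()
        word_counts.update(tokens)
        for i, left in enumerate(tokens):
            for right in tokens[i + 1 : i + 1 + window]:
                pair_counts[tuple(sorted((left, right)))] += 1
                total_pairs += 1
    total_words = sum(word_counts.values())
    scores = {}
    for (w1, w2), n in pair_counts.items():
        p_xy = n / total_pairs                    # joint probability of the pair
        p_x = word_counts[w1] / total_words       # marginal probabilities
        p_y = word_counts[w2] / total_words
        scores[(w1, w2)] = math.log2(p_xy / (p_x * p_y))
    return scores

# Toy corpus, purely for illustration.
corpus = [
    "doctors treat patients in the hospital",
    "nurses and doctors work in the hospital",
    "patients visit the hospital for treatment",
]
for pair, score in sorted(pmi_scores(corpus).items(), key=lambda kv: -kv[1])[:5]:
    print(pair, round(score, 2))
```

Pairs that occur together more often than their individual frequencies would predict receive high scores, which is the sense in which co-occurrence distributions capture part of a word's meaning.
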
“…Figure 3 shows an example of such a word network, for example, "cash dividend", "dividend report", and "market influence" are examples of bi-gram word pairs from a financial news data sample. LLA is related to Latent Semantic Analysis (LSA, Dumais, Furnas, Landauer and Deerwester, 1988), Probabilistic Latent Semantic Analysis (PLSA, Hofmann, 1999), WordNet (Miller, 1995), Automap (CASOS, 2009), and LDA (Blei, Ng and Jordan, 2003). LDA uses a bag of single words (e.g., associations are computed at the word level) to extract concepts and topics.…”
Section: Machine Vision and LLA (mentioning)
confidence: 99%
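
The bi-gram word pairs mentioned above can be obtained by counting adjacent tokens; each frequent pair then becomes an edge such as "cash" -- "dividend" in the word network. A minimal sketch with made-up financial-news sentences (the text, tokenization, and threshold are illustrative and not taken from the cited LLA work):

```python
from collections import Counter

def bigram_edges(documents, min_count=1):
    """Count adjacent word pairs; each surviving pair becomes an edge
    in a word network of the kind the quoted statement describes."""
    edges = Counter()
    for doc in documents:
        tokens = doc.lower().split()
        for left, right in zip(tokens, tokens[1:]):
            edges[(left, right)] += 1
    return {pair: n for pair, n in edges.items() if n >= min_count}

# Illustrative snippets only.
news = [
    "cash dividend raised after strong quarter",
    "analysts expect the dividend report to show market influence",
    "cash dividend policy shapes market influence",
]
for (left, right), n in sorted(bigram_edges(news).items(), key=lambda kv: -kv[1]):
    print(f"{left} -- {right}: {n}")
```
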
“…For defining word associations and finding related expressions, we use the well-known WordNet [30] lexical database containing a hierarchical structure of syntactically related nouns, verbs, adjectives, and adverbs grouped into sets of cognitive synonyms (synsets), each expressing a distinct concept. We also use statistical NGram models trained with the Brown Corpus [16] to define word probabilities and co-occurrences.…”
Section: Lexical Resources (mentioning)
confidence: 99%
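
The synset structure described in this statement can be explored directly. A brief sketch using NLTK's WordNet interface (NLTK is an assumed tooling choice, not necessarily what the cited paper used, and it needs a one-time nltk.download('wordnet')):

```python
from nltk.corpus import wordnet as wn  # one-time setup: nltk.download('wordnet')

# Each synset groups cognitive synonyms for one concept; noun synsets also
# point upward to more general concepts (hypernyms) in the hierarchy.
for synset in wn.synsets("bank", pos=wn.NOUN)[:3]:
    print(synset.name(), "-", synset.definition())
    print("  lemmas:   ", [lemma.name() for lemma in synset.lemmas()])
    print("  hypernyms:", [h.name() for h in synset.hypernyms()])
```
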
“…• The WordNet Expert generates synonyms, hypernyms, and antonyms for nouns and adjectives based on the WordNet lexical resource [30].…”
Section: Experts (mentioning)
confidence: 99%
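
A hedged sketch of what such an expert might compute, again through NLTK's WordNet interface; the function name and return format here are hypothetical and not taken from the cited system:

```python
from nltk.corpus import wordnet as wn  # one-time setup: nltk.download('wordnet')

def wordnet_expansions(word, pos=wn.NOUN):
    """Collect synonyms, hypernyms, and antonyms for a noun or adjective,
    the kinds of expansions the quoted WordNet Expert is said to generate."""
    synonyms, hypernyms, antonyms = set(), set(), set()
    for synset in wn.synsets(word, pos=pos):
        for lemma in synset.lemmas():
            synonyms.add(lemma.name())
            antonyms.update(a.name() for a in lemma.antonyms())
        hypernyms.update(l.name() for h in synset.hypernyms() for l in h.lemmas())
    return {"synonyms": synonyms, "hypernyms": hypernyms, "antonyms": antonyms}

# Adjective synsets carry antonym links but no hypernyms, so the two calls
# below exercise different parts of the expansion.
print(wordnet_expansions("dividend"))          # noun: synonyms and hypernyms
print(wordnet_expansions("good", pos=wn.ADJ))  # adjective: synonyms and antonyms
```
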