2003
DOI: 10.1109/tkde.2003.1232265

A comparison of standard spell checking algorithms and a novel binary neural approach

Abstract: In this paper, we propose a simple, flexible, and efficient hybrid spell checking methodology based upon phonetic matching, supervised learning, and associative matching in the AURA neural system. We integrate Hamming Distance and n-gram algorithms that have high recall for typing errors and a phonetic spell-checking algorithm in a single novel architecture. Our approach is suitable for any spell checking application though aimed toward isolated word error correction, particularly spell checking user queries i…
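The abstract outlines a hybrid of Hamming-distance and character n-gram matching (implemented in the AURA associative memory) combined with phonetic matching. As a rough, hedged illustration of the n-gram candidate-matching idea only, and not of the AURA binary neural implementation, the sketch below scores dictionary words against a misspelled query by character-bigram overlap; the function names and toy lexicon are hypothetical.

```python
# Hypothetical sketch of character n-gram candidate matching for spell checking.
# It illustrates the general n-gram idea mentioned in the abstract; it is NOT
# the AURA associative-memory architecture described in the paper.

def char_ngrams(word, n=2):
    """Return the character n-grams of a word, padded at the edges."""
    padded = f"#{word}#"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

def ngram_overlap(query, candidate, n=2):
    """Dice-style overlap between the n-gram sets of two words (0..1)."""
    q, c = set(char_ngrams(query, n)), set(char_ngrams(candidate, n))
    return 2 * len(q & c) / (len(q) + len(c))

def best_candidates(query, dictionary, k=3):
    """Rank dictionary words by n-gram overlap with the (possibly misspelled) query."""
    return sorted(dictionary, key=lambda w: ngram_overlap(query, w), reverse=True)[:k]

if __name__ == "__main__":
    lexicon = ["separate", "desperate", "separately", "separator"]
    print(best_candidates("seperate", lexicon))  # 'separate' should rank first
```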


Cited by 50 publications (33 citation statements)
References 11 publications
“…The field of document retrieval-and information retrieval in general-can be easily seen to be an instance of max-kernel search: for some given similarity function, return the document that is most similar to the query. Spell checking systems are an interesting corollary of information retrieval and also an instance of max-kernel search [27].…”
Section: Max-kernel Search (mentioning)
confidence: 99%
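The excerpt above frames spell checking as max-kernel search: for a given similarity (kernel) function, return the reference item most similar to the query. A minimal brute-force sketch of that framing follows; the prefix-length kernel and word list are toy assumptions rather than anything from the cited works, and practical systems use indexing structures to avoid a full scan.

```python
# Minimal brute-force sketch of max-kernel search as framed in the excerpt:
# argmax over reference items of a similarity (kernel) function with the query.
# The kernel below is a toy, illustrative choice.

def max_kernel_search(query, references, kernel):
    """Return the reference item with the highest kernel value against the query."""
    return max(references, key=lambda r: kernel(query, r))

def common_prefix_kernel(a, b):
    """Toy similarity: length of the shared prefix of two strings."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

if __name__ == "__main__":
    words = ["checker", "checking", "cheddar", "chess"]
    # Prints 'checker' ('checking' ties; the first maximum wins).
    print(max_kernel_search("checkng", words, common_prefix_kernel))
```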
“…Church and Gale (1990) demonstrate the potential of word bigrams to improve the accuracy of isolated word correction. Mays et al. (1991) used trigram models and obtained 76% accuracy in detection and 73% accuracy in correction. Hodge and Austin (2003) integrate Hamming distance and n-gram algorithms that have high recall for typing errors and a phonetic spell-checking algorithm in a single architecture. Ahmed et al. (2010) propose a spell checker that works by selecting the most promising candidates from a ranked list that is derived from n-gram statistics and lexical resources.…”
Section: Correction Candidate Selection (mentioning)
confidence: 99%
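This excerpt cites word bigram and trigram models for ranking correction candidates in context. As a hedged sketch of the basic idea only, the snippet below scores candidates by add-one-smoothed bigram counts from a toy corpus; the corpus, smoothing choice, and function names are illustrative assumptions, not the cited methods.

```python
# Illustrative sketch of context-sensitive candidate ranking with word bigrams,
# in the spirit of the bigram/trigram approaches cited in the excerpt.
# The corpus, counts, and add-one smoothing are toy choices, not from the paper.

from collections import Counter

corpus = "the spell checker found the spelling error in the text".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def bigram_score(prev_word, candidate):
    """Add-one smoothed bigram probability P(candidate | prev_word)."""
    vocab = len(unigrams)
    return (bigrams[(prev_word, candidate)] + 1) / (unigrams[prev_word] + vocab)

def rank_candidates(prev_word, candidates):
    """Order correction candidates by how well they fit the preceding word."""
    return sorted(candidates, key=lambda c: bigram_score(prev_word, c), reverse=True)

if __name__ == "__main__":
    # Choosing between candidates for a misspelling that follows "spelling".
    print(rank_candidates("spelling", ["error", "spell", "text"]))  # 'error' ranks first
```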
“…Many approaches have been applied since people started to deal with this problem. Different techniques like edit distance [4], rule-based techniques [10], n-grams [20], probabilistic techniques [14], neural nets [15], similarity key techniques [16,17] and noisy channel model [18,19] have been proposed. All of these are based on the idea of calculating the similarity between the misspelled word and the words contained in a dictionary.…”
Section: Approaches of Some Spell Checkers (mentioning)
confidence: 99%
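Most of the techniques listed in this excerpt (edit distance most directly) reduce to computing a similarity between the misspelling and each dictionary word. The following is a standard dynamic-programming Levenshtein distance sketch with a toy dictionary, offered as a generic illustration rather than the implementation of any cited system.

```python
# Standard dynamic-programming Levenshtein (edit) distance, one of the
# similarity measures listed in the excerpt; the dictionary here is a toy example.

def levenshtein(a, b):
    """Minimum number of insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def closest_word(query, dictionary):
    """Dictionary word with the smallest edit distance to the query."""
    return min(dictionary, key=lambda w: levenshtein(query, w))

if __name__ == "__main__":
    print(closest_word("acheive", ["achieve", "archive", "active"]))  # 'achieve'
```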
“…Therefore, it seems reasonable to base correction algorithms on measures that consider these simple operations. However, approaches based on pure n-gram statistics (which account for these operations implicitly) have also proven to provide good performance [1,15].…”
Section: Introduction (mentioning)
confidence: 99%