2016 23rd International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2016.7900267
What does scene text tell us?

Cited by 2 publications (2 citation statements)
References 24 publications
“…Those methods will be useful for analyzing the semantics of scene texts. In [20], the words from scenery images were fed to word2vec, and their semantic vectors were then clustered by k-means in order to understand what kind of information is conveyed by scene texts. It is important to note that we can apply the word embedding methods not only to the words in scene texts but also to the object class names (as we see in Section IV-D).…”
Section: Techniques Required for Our Experimental Survey
Confidence: 99%
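The pipeline the citation describes — mapping words to semantic vectors and grouping them with k-means — can be sketched in a few lines. This is a toy illustration, not the method of [20]: the word list and 2-D "embeddings" below are made-up stand-ins for real word2vec output, and the k-means uses a simple deterministic farthest-point initialization.

```python
# Toy stand-in for word2vec output: each scene-text word -> a 2-D "semantic"
# vector. Words and vectors are illustrative assumptions, not data from [20].
embeddings = {
    "exit":  (0.90, 0.10), "stop": (0.80, 0.20), "open": (0.85, 0.15),
    "pizza": (0.10, 0.90), "cafe": (0.20, 0.80), "menu": (0.15, 0.85),
}

def dist2(p, q):
    """Squared Euclidean distance between two vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=10):
    """Minimal k-means with deterministic farthest-point initialization."""
    centers = [points[0]]
    while len(centers) < k:
        # Next center: the point farthest from all chosen centers.
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        # Recompute each center as its cluster mean (keep old center if empty).
        centers = [tuple(sum(d) / len(c) for d in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

words = list(embeddings)
centers = kmeans([embeddings[w] for w in words], k=2)
# Assign each word to its nearest cluster center.
label = {w: min(range(2), key=lambda i: dist2(embeddings[w], centers[i]))
         for w in words}
```

On this toy data the two semantic groups (signage words vs. food-related words) fall into separate clusters; with real word2vec vectors the same loop would reveal what kinds of information scene texts convey.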
“…Table I shows the 40 most frequent words, as one of the most primitive statistics of the words captured in the real images; specifically, this is the list of the words with the 40 largest Σ_i g_{i,k}. As suggested in [20], the words captured in scenery images will have a different trend from the words in text corpora such as Wikipedia and the British National Corpus (BNC) dataset. The parenthesized number in Table I is the frequency rank of the word in the BNC (after the same post-processing steps, such as stop-word removal).…”
Section: A. Frequent Words
Confidence: 99%
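The Table-I statistic the citation describes — ranking words by frequency after stop-word removal and annotating each with its rank in a reference corpus — can be sketched with `collections.Counter`. The token lists and stop-word set below are tiny made-up examples standing in for the scene-text words and the BNC.

```python
from collections import Counter

# Illustrative stop-word set; the paper's actual list is not specified here.
STOP_WORDS = {"the", "a", "of", "to", "and", "in"}

def freq_ranks(tokens):
    """Map each non-stop-word to its 1-based frequency rank (most frequent = 1)."""
    counts = Counter(t for t in tokens if t not in STOP_WORDS)
    return {w: r for r, (w, _) in enumerate(counts.most_common(), start=1)}

# Made-up stand-ins for the scene-text words and the reference (BNC-like) corpus.
scene_tokens = "exit exit exit stop stop sale open the of".split()
corpus_tokens = "the sale of the stop and the open exit sale sale".split()

scene_ranks = freq_ranks(scene_tokens)    # ranks among scene-text words
corpus_ranks = freq_ranks(corpus_tokens)  # reference-corpus ranks

# Table I's layout corresponds to taking the top-N scene-text words and showing
# each word's reference-corpus rank in parentheses.
top_scene = Counter(t for t in scene_tokens if t not in STOP_WORDS).most_common(3)
table = [(w, corpus_ranks.get(w)) for w, _ in top_scene]
```

A word that tops the scene-text list but carries a low (large-number) reference rank is exactly the kind of divergence between scene text and ordinary corpora that the paragraph above points to.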