2016 International Conference on Advanced Informatics: Concepts, Theory and Application (ICAICTA)
DOI: 10.1109/icaicta.2016.7803115
Word2vec semantic representation in multilabel classification for Indonesian news article

Cited by 23 publications (24 citation statements)
References 8 publications
“…The initial evaluations used only the baseline structure of GloVe embeddings with LSTM layers, which proved costly: the micro F1 score I obtained was comparatively low (about 0.57 for phase I and 0.61 for phase II), while the training time was significantly high. So the model's accuracy was improved by keeping the semantic and syntactic features of statements intact, using two novel lines of research in the Natural Language Processing field: Word2Vec (Word to Vector) (Mikolov et al., 2013; Rahmawati and Khodra, 2016) and GloVe (Global Vectors) (Baroni et al., 2014; Pennington et al., 2014) embedding layers. The Word2Vec embedding layers preserved the sentiment of the provided text, whereas the GloVe embeddings preserved its semantic features.…”
Section: Embedding Layers (mentioning)
confidence: 99%
“…To address this problem, a word representation feature was employed, which handles the semantics of words. Several works have shown that a semantic model can achieve higher performance than a lexicon [12,13] because it captures the weight of similarities between words in a document. Therefore, this research conducted rhetorical sentence categorization by employing word embeddings to detect the semantic meaning of words in scientific articles [7,14].…”
Section: Introduction (mentioning)
confidence: 99%
“…The emergence of many news sites in Indonesia has led to news growth that is rapid, voluminous, and ungrouped. News aggregator sites have since developed widely and now automatically group news by the appropriate topic [2]. This news categorization is performed by computers using document classification algorithms.…”
unclassified
“…Using Word2Vec as a feature in previous research yielded better performance than other representation techniques [20]. The use of Word2Vec for news document classification in Indonesian has also been proposed [2]. Classification experiments in Indonesian produced the best F1-score, 81.10%, using a Skip-Gram model with dimension 200.…”
unclassified
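The Skip-Gram model cited above (dimension 200, best F1-score of 81.10%) trains each word to predict its neighbors within a context window. A minimal sketch of the (center, context) pair-extraction step that underlies Skip-Gram training; the tokens below are illustrative only, not the Indonesian news dataset used in the study:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in Word2Vec Skip-Gram:
    each word is paired with every neighbor within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        # context indices: up to `window` words on either side of position i
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# Illustrative sentence ("news site Indonesia"); window of 1 for brevity.
print(skipgram_pairs(["situs", "berita", "indonesia"], window=1))
# → [('situs', 'berita'), ('berita', 'situs'),
#    ('berita', 'indonesia'), ('indonesia', 'berita')]
```

In a full implementation (e.g. gensim's `Word2Vec` with `sg=1`, `vector_size=200`), these pairs drive the training of the 200-dimensional embeddings the quoted experiment reports.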