2021
DOI: 10.17148/ijarcce.2021.1012
Untitled

Abstract: Recently, deep learning methods have greatly improved the state of the art in many natural language processing tasks. Previous work shows that the Transformer can capture long-distance relations between words in a sequence. In this paper, we propose a Transformer-based neural model for Chinese word segmentation and part-of-speech tagging. In the model, we present a word boundary-based character embedding method to overcome the character ambiguity problem. After the Transformer layer, a BiLSTM-CRF layer is used to…

Cited by 0 publications · References 18 publications (24 reference statements)