2022
DOI: 10.1080/0020174x.2022.2113429

The boundaries of meaning: a case study in neural machine translation

Abstract: The success of deep learning in natural language processing raises intriguing questions about the nature of linguistic meaning and ways in which it can be processed by natural and artificial systems. One such question has to do with subword segmentation algorithms widely employed in language modeling, machine translation, and other tasks since 2016. These algorithms often cut words into semantically opaque pieces, such as 'period', 'on', 't', and 'ist' in 'period|on|t|ist'. The system then represents the resul…
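
As a rough illustration of the subword segmentation described in the abstract, the sketch below implements a byte-pair-encoding (BPE) style learner in the spirit of Sennrich et al. (2016). The toy corpus, the merge budget, and the end-of-word marker are illustrative assumptions, not the configuration studied in the paper; the corpus is chosen so that 'periodontist' comes out as the semantically opaque pieces quoted above.

    # Minimal BPE-style subword segmentation sketch (Sennrich et al., 2016 flavor).
    # The toy corpus and merge budget are illustrative assumptions.
    from collections import Counter

    def get_pair_counts(vocab):
        """Count adjacent symbol pairs, weighted by word frequency."""
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        return pairs

    def merge_pair(pair, vocab):
        """Replace every occurrence of `pair` with the concatenated symbol."""
        merged = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged[tuple(out)] = freq
        return merged

    def learn_bpe(word_freqs, num_merges):
        """Learn an ordered list of merges from a word-frequency dictionary."""
        vocab = {tuple(w) + ("</w>",): f for w, f in word_freqs.items()}
        merges = []
        for _ in range(num_merges):
            pairs = get_pair_counts(vocab)
            if not pairs:
                break
            best = max(pairs, key=pairs.get)
            vocab = merge_pair(best, vocab)
            merges.append(best)
        return merges

    def segment(word, merges):
        """Apply the learned merges, in order, to an unseen word."""
        symbols = tuple(word) + ("</w>",)
        for pair in merges:
            symbols = next(iter(merge_pair(pair, {symbols: 1})))
        return [s.replace("</w>", "") for s in symbols if s != "</w>"]

    # Toy corpus: frequent words share fragments with the unseen target word.
    corpus = {"period": 10, "on": 20, "list": 8, "mist": 5, "periodic": 4}
    merges = learn_bpe(corpus, num_merges=30)
    print(segment("periodontist", merges))  # ['period', 'on', 't', 'ist']

The point of the sketch is that the learned pieces are driven purely by corpus statistics: 'on' and 'ist' are frequent character sequences, not morphemes, which is exactly the semantic opacity the paper takes as its starting point.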

Cited by 3 publications (1 citation statement)
References: 50 publications
“…Of these, the authors developed one phrase-based SMT system and one NMT system using byte-pair encoding for the HI↔MR pair. In [51], the authors used a Transformer-based NMT with SentencePiece for subword embedding on the HI↔MR language pair [61]. In [52], the authors used a Transformer NMT for multilingual model training and evaluated the results on the HI↔MR pair.…”
Section: Related Work (mentioning)
Confidence: 99%
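
For context on the tooling named in this citation statement, a typical SentencePiece subword workflow for an NMT pipeline might look like the sketch below. The corpus path, model prefix, vocabulary size, and example sentence are hypothetical placeholders, not the settings used in the cited HI↔MR systems.

    # Hedged sketch of a SentencePiece BPE workflow for subword segmentation
    # ahead of NMT training. All file names, the vocabulary size, and the
    # example sentence are illustrative assumptions.
    import sentencepiece as spm

    # Train a subword model on a hypothetical Hindi-Marathi training corpus.
    spm.SentencePieceTrainer.train(
        input="hi_mr_train.txt",    # hypothetical corpus, one sentence per line
        model_prefix="hi_mr_bpe",   # writes hi_mr_bpe.model and hi_mr_bpe.vocab
        vocab_size=8000,            # illustrative size
        model_type="bpe",
        character_coverage=1.0,     # keep the full Devanagari character set
    )

    # Load the model, segment text into subword pieces, and restore it.
    sp = spm.SentencePieceProcessor(model_file="hi_mr_bpe.model")
    pieces = sp.encode("यह एक उदाहरण वाक्य है।", out_type=str)  # list of subword pieces
    restored = sp.decode(pieces)                                # lossless round trip
    print(pieces)
    print(restored)

The decode step is what makes subword models practical for translation: the segmentation is reversible, so the NMT system can operate on opaque pieces internally and still emit ordinary words.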