2020
DOI: 10.1007/978-3-030-51310-8_8
Natural Language Generation Using Transformer Network in an Open-Domain Setting

Cited by 8 publications (4 citation statements) · References 23 publications
“…(Gillioz et al., 2020). Not long after its introduction, the research community in Natural Language Processing began to explore its application to sequential tasks such as text generation (Varshney et al., 2020; Mishra et al., 2021), machine translation, and language understanding (Banar et al., 2021).…”
Section: Recent Advancements and Future Directions
Confidence: 99%
“…Recently, the transformer architecture started to substitute RNNs for language model training [44][45][46][47]. One of the major advantages of transformers is that, based on the self-attention mechanism, they achieve independence while making predictions at different dialogue stages.…”
Section: Dialogue Management Module
Confidence: 99%
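The excerpt above attributes the transformers' independence from recurrence to the self-attention mechanism: every position attends to every other position in one step, rather than through a sequential RNN state. A minimal NumPy sketch of scaled dot-product self-attention (per Vaswani et al., 2017) illustrates this; the shapes and projection matrices here are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention.

    Each output row is a weighted mix over *all* input positions,
    computed in one matrix product rather than a sequential recurrence.
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise affinities, (seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over key positions
    return weights @ V                                # attended values, (seq, d_v)

# Toy example: 3 tokens, model dimension 4 (arbitrary sizes for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 4)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (3, 4): one attended vector per input token
```

Because the whole sequence is processed in a single batched matrix product, no step waits on the previous one — which is the independence advantage the excerpt contrasts with RNN-based language models.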
“…They were introduced in Vaswani et al. (2017) and were able to beat several state-of-the-art approaches on different NLP tasks. They have been employed in machine translation (Sefara et al., 2021), conversational agents (Golovanov et al., 2020), sentiment analysis (Pipalia, Bhadja & Shukla, 2020), language generation (Varshney et al., 2020), text summarization (Luo, Guo & Guo, 2019), and so on, bringing a huge advancement to the literature across different tasks and domains. Transformers have already been successfully applied to music genre classification and generation (Qiu, Li & Sung, 2021; Huang et al., 2018).…”
Section: Introduction
Confidence: 99%