Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume 2021
DOI: 10.18653/v1/2021.eacl-main.255

Modelling Context Emotions using Multi-task Learning for Emotion Controlled Dialog Generation

Abstract: A recent topic of research in natural language generation has been the development of automatic response generation modules that can automatically respond to a user's utterance in an empathetic manner. Previous research has tackled this task using neural generative methods by augmenting emotion classes with the input sequences. However, the outputs by these models may be inconsistent. We employ multitask learning to predict the emotion label and to generate a viable response for a given utterance using a commo…
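The multi-task idea sketched in the abstract, a shared encoder feeding both an emotion classifier and a response decoder trained jointly, can be pictured roughly as follows. This is a minimal PyTorch sketch with assumed module names, dimensions, and loss weighting (alpha); it is not the authors' implementation.

```python
# Illustrative sketch only: a shared encoder with two task heads, roughly the
# multi-task setup the abstract describes. All names, sizes, and the loss
# weighting `alpha` are assumptions, not the paper's code.
import torch
import torch.nn as nn

class MultiTaskDialogModel(nn.Module):
    def __init__(self, vocab_size, d_model=256, n_emotions=7, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Shared self-attention encoder over the input utterance.
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        # Task 1: emotion classification from the pooled encoder states.
        self.emotion_head = nn.Linear(d_model, n_emotions)
        # Task 2: response generation with a decoder attending to the encoder
        # (causal target mask omitted for brevity).
        dec_layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        memory = self.encoder(self.embed(src_ids))               # (B, S, d_model)
        emotion_logits = self.emotion_head(memory.mean(dim=1))   # (B, n_emotions)
        dec_out = self.decoder(self.embed(tgt_ids), memory)      # (B, T, d_model)
        token_logits = self.lm_head(dec_out)                     # (B, T, vocab)
        return emotion_logits, token_logits

def multitask_loss(emotion_logits, token_logits, emotion_labels, tgt_ids, alpha=0.5):
    # Joint objective: weighted sum of the classification and generation losses.
    clf = nn.functional.cross_entropy(emotion_logits, emotion_labels)
    gen = nn.functional.cross_entropy(
        token_logits.reshape(-1, token_logits.size(-1)), tgt_ids.reshape(-1))
    return alpha * clf + (1 - alpha) * gen
```

Because both losses backpropagate through the shared encoder, the emotion-prediction signal shapes the same representation used for generation, which is what ties the two tasks together in a multi-task setup.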

Cited by 7 publications (4 citation statements) | References 29 publications (30 reference statements)

“…The study provides a systematic review of approaches to building an emotionally-aware chatbot (EAC). In [55] the authors use multi-task learning to predict the emotion label and to generate a valid response for a given utterance. Their model consists of a self-attention-based encoder and a decoder with a dot-product attention mechanism, which generates a response with a specified emotion and produces more emotionally relevant responses.…”
Section: Related Work
confidence: 99%
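The quoted description mentions a decoder that attends over encoder states with dot-product attention. A textbook scaled dot-product attention step looks roughly like the sketch below; the function name and shapes are illustrative assumptions, not the cited paper's exact implementation.

```python
# Generic (scaled) dot-product attention of the kind attributed to the decoder.
import torch
import torch.nn.functional as F

def dot_product_attention(query, keys, values, mask=None):
    """query: (B, T_q, d); keys, values: (B, T_k, d). Returns (B, T_q, d)."""
    d = query.size(-1)
    scores = torch.matmul(query, keys.transpose(-2, -1)) / d ** 0.5  # (B, T_q, T_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)   # attention weights over encoder states
    return torch.matmul(weights, values)  # context vectors for the decoder
```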
“…It uses an attention weight to measure the importance of each word in the input text. Different attention mechanisms, such as self-attention [38], multi-head attention [39], word-level attention [40], and hierarchical attention [41], are used in various research papers on conversational agents. Deep learning methods such as RNNs and their variants have their own limits.…”
Section: Deep Learning Approaches Used in Conversational Agents
confidence: 99%
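The quoted passage notes that attention assigns each input word an importance weight. A minimal word-level attention pooling layer in that spirit might look like the following sketch; the class name, scoring layer, and dimensions are assumptions for illustration only.

```python
# Word-level attention pooling: each word gets a scalar importance weight,
# and the sentence vector is the weighted sum of the word states.
import torch
import torch.nn as nn

class WordAttentionPooling(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)  # one importance score per word

    def forward(self, word_states, mask=None):
        # word_states: (B, L, H) word representations from e.g. an RNN encoder.
        scores = self.score(word_states).squeeze(-1)      # (B, L)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)           # importance of each word
        return torch.bmm(weights.unsqueeze(1), word_states).squeeze(1)  # (B, H)
```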
“…So, ambiguity in word semantic interpretation is one of the important issues in CAs. [38] Datasets play an important role, as training data is required to understand intent and context and to respond naturally to the user. There are many challenges related to datasets, such as the small size of available datasets, scarcity of labeled data, unbalanced distribution of data, limited variety of labels, and a lack of representative publicly available datasets.…”
Section: Research Gaps
confidence: 99%