Proceedings of the 12th International Workshop on Semantic Evaluation 2018
DOI: 10.18653/v1/s18-1034
deepSA2018 at SemEval-2018 Task 1: Multi-task Learning of Different Label for Affect in Tweets

Abstract: This paper describes our system implementation for subtask V-oc of SemEval-2018 Task 1: Affect in Tweets. We use a multi-task learning method to learn a shared representation, and then learn the features for each task. There are five classification models in the proposed multi-task learning approach. These classification models are trained sequentially to learn different features for different classification tasks. In addition to the data released for SemEval-2018, we use datasets from previous SemEvals during system c…
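The abstract describes hard parameter sharing: one shared representation with a separate classification head per task, trained sequentially. The following is a minimal sketch of that setup, assuming a PyTorch implementation; the layer sizes, task names, label counts, and training loop are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Shared representation learned jointly across all tasks."""
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)       # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)       # hidden: (1, batch, hidden_dim)
        return hidden.squeeze(0)                   # (batch, hidden_dim)

class MultiTaskModel(nn.Module):
    """One shared encoder plus a separate classification head per task."""
    def __init__(self, task_num_classes, hidden_dim=128):
        super().__init__()
        self.encoder = SharedEncoder(hidden_dim=hidden_dim)
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden_dim, n_classes)
            for task, n_classes in task_num_classes.items()
        })

    def forward(self, token_ids, task):
        shared = self.encoder(token_ids)           # shared features
        return self.heads[task](shared)            # task-specific logits

# Five hypothetical affect-classification tasks; the label counts are placeholders.
tasks = {"valence_oc": 7, "anger_oc": 4, "fear_oc": 4, "joy_oc": 4, "sadness_oc": 4}
model = MultiTaskModel(tasks)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Tasks are visited one after another, mirroring the sequential training the
# abstract mentions; each batch updates the shared encoder and that task's head.
for task in tasks:
    token_ids = torch.randint(0, 10000, (32, 40))  # dummy batch of tweet token ids
    labels = torch.randint(0, tasks[task], (32,))  # dummy ordinal-class labels
    logits = model(token_ids, task)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```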

Cited by 4 publications (2 citation statements); References 14 publications.
“…The main idea of the attention mechanism is intuitively similar to human visual attention: "observing important parts". At present, attention mechanisms have been shown to be effective in various machine learning tasks such as image/video captioning [6], machine translation [2], and natural language processing [21]. Recently, neural network attention mechanisms have been used in recommendation systems [11].…”
Section: Attention Mechanism
confidence: 99%
“…In particular, it enables a neural network to select the most important parts of the target input, such as a specific word in a given review or a particular region in an image. This idea has been usefully applied to a number of applications such as computer vision [30], machine translation [31], and natural language processing [32]. More recently, neural attention has been exploited for building recommender systems [12], [21], [33].…”
Section: Attentive Recommender Systems
confidence: 99%
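The quoted statements describe attention as scoring each part of the input, normalising the scores, and summarising the input as a weighted sum so the model focuses on the "important parts". A minimal sketch of that generic mechanism is shown below, in the same assumed PyTorch setting as above; the shapes and the single learned scoring vector are illustrative assumptions, not the cited papers' exact formulations.

```python
import torch
import torch.nn as nn

class SimpleAttention(nn.Module):
    """Scores each token state, softmax-normalises, and returns a weighted sum."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)   # one scalar relevance score per token

    def forward(self, token_states):            # token_states: (batch, seq_len, hidden_dim)
        scores = self.score(token_states).squeeze(-1)        # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)               # attention distribution
        context = torch.bmm(weights.unsqueeze(1), token_states).squeeze(1)
        return context, weights                  # weighted summary and per-token weights

# Usage: attend over dummy encoder outputs for 2 sequences of 5 tokens.
attn = SimpleAttention(hidden_dim=8)
states = torch.randn(2, 5, 8)
context, weights = attn(states)
print(context.shape, weights.shape)              # torch.Size([2, 8]) torch.Size([2, 5])
```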