Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), 2021
DOI: 10.18653/v1/2021.repl4nlp-1.6

Temporal-aware Language Representation Learning From Crowdsourced Labels

Abstract: Learning effective language representations from crowdsourced labels is crucial for many real-world machine learning tasks. A challenging aspect of this problem is that the quality of crowdsourced labels suffers from high intra- and inter-observer variability. Since high-capacity deep neural networks can easily memorize all disagreements among crowdsourced labels, directly applying existing supervised language representation learning algorithms may yield suboptimal solutions. In this paper, we propose TACMA, a temp…
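To make the problem setting concrete, below is a minimal, illustrative sketch of one common way to train a classifier from crowdsourced labels: aggregating per-annotator votes into a soft label distribution so the model is not forced to memorize any single annotator's (possibly noisy) hard label. This is not the paper's TACMA method; the names (NUM_CLASSES, annotations, the toy features) are hypothetical placeholders.

```python
# Illustrative sketch (not TACMA): train against the empirical label distribution
# aggregated over annotators instead of one hard label per example.
import torch
import torch.nn.functional as F

NUM_CLASSES = 3

def aggregate_soft_label(annotator_votes):
    """Turn a list of per-annotator class ids into a normalized label distribution."""
    counts = torch.zeros(NUM_CLASSES)
    for c in annotator_votes:
        counts[c] += 1.0
    return counts / counts.sum()

# Toy batch: each example carries votes from several annotators.
annotations = [[0, 0, 1], [2, 2, 2], [1, 0, 1]]
soft_labels = torch.stack([aggregate_soft_label(v) for v in annotations])

# Tiny stand-in for a sentence encoder; in practice this would be a pretrained
# language model producing text representations.
features = torch.randn(len(annotations), 16)
classifier = torch.nn.Linear(16, NUM_CLASSES)
logits = classifier(features)

# KL divergence between predicted and aggregated label distributions, which
# discourages fitting any single annotator's disagreement exactly.
loss = F.kl_div(F.log_softmax(logits, dim=-1), soft_labels, reduction="batchmean")
loss.backward()
```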

Cited by 2 publications (1 citation statement). References 33 publications (36 reference statements).
“…Moreover, scNAME ignores the relationships between cells. With the rise of graph contrastive learning in the field of graph representation learning (Thakoor et al., 2021), using graph contrastive learning to capture the relationships between cells and recover missing gene expression values will become a new direction.…”
Section: Introduction
Mentioning confidence: 99%
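For readers unfamiliar with the technique named in the citing statement, the sketch below illustrates the basic graph contrastive learning recipe on assumed toy data: cells as nodes of a kNN graph over expression profiles, two edge-dropout views, and an InfoNCE-style loss that pulls each cell's two embeddings together. It is written in the spirit of Thakoor et al. (2021), not a faithful reimplementation of that work or of scNAME.

```python
# Minimal graph contrastive learning sketch on a toy cell-by-gene matrix.
import torch
import torch.nn.functional as F

def knn_adjacency(x, k=5):
    """Dense kNN adjacency (with self-loops) from a cell-by-gene matrix."""
    dist = torch.cdist(x, x)
    idx = dist.topk(k + 1, largest=False).indices  # nearest neighbours incl. self
    adj = torch.zeros(x.size(0), x.size(0))
    adj.scatter_(1, idx, 1.0)
    return adj

def encode(x, adj, weight):
    """One mean-aggregation graph convolution followed by a nonlinearity."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    return F.relu((adj @ x / deg) @ weight)

def augment(adj, drop_prob=0.2):
    """Randomly drop edges to create a perturbed view of the graph."""
    mask = (torch.rand_like(adj) > drop_prob).float()
    return adj * mask

x = torch.randn(50, 30)                       # 50 cells, 30 genes (toy data)
adj = knn_adjacency(x)
weight = torch.nn.Parameter(torch.randn(30, 16) * 0.1)

z1 = F.normalize(encode(x, augment(adj), weight), dim=-1)
z2 = F.normalize(encode(x, augment(adj), weight), dim=-1)

# InfoNCE-style loss: each cell in view 1 should match the same cell in view 2.
logits = z1 @ z2.t() / 0.1
loss = F.cross_entropy(logits, torch.arange(x.size(0)))
loss.backward()
```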