Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1171

Neural Duplicate Question Detection without Labeled Training Data

Abstract: Supervised training of neural models for duplicate question detection in community Question Answering (cQA) requires large amounts of labeled question pairs, which are costly to obtain. To minimize this cost, recent works often use alternative methods, e.g., adversarial domain adaptation. In this work, we propose two novel methods: (1) the automatic generation of duplicate questions, and (2) weak supervision using the title and body of a question. We show that both can achieve improved performances even though they do not require labeled data.
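As an illustrative aside, method (2) amounts to mining weak labels from unlabeled cQA posts: the title and body of the same question form a positive pair, and a title paired with another question's body forms a negative. A minimal Python sketch, with hypothetical names (Question, make_weak_pairs) that are not from the paper's code:

import random
from dataclasses import dataclass

@dataclass
class Question:
    title: str
    body: str

def make_weak_pairs(questions, seed=0):
    """Weak labels: (title, body) of the same question is a positive pair;
    a title paired with a different question's body is a negative pair."""
    rng = random.Random(seed)
    pairs = []
    for i, q in enumerate(questions):
        pairs.append((q.title, q.body, 1))                 # weak positive
        j = rng.randrange(len(questions) - 1)
        j = j + 1 if j >= i else j                         # any other question
        pairs.append((q.title, questions[j].body, 0))      # weak negative
    return pairs

if __name__ == "__main__":
    qs = [
        Question("How do I merge two dicts in Python?",
                 "I have dicts a and b and want one dict with the contents of both."),
        Question("Sorting a list of tuples by the second item",
                 "Given [(1, 'b'), (2, 'a')], how do I sort by the string element?"),
    ]
    for title, body, label in make_weak_pairs(qs):
        print(label, "|", title, "->", body[:40])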

Cited by 8 publications (13 citation statements). References 34 publications (58 reference statements).
“…We optimize the binary cross-entropy loss. Similar techniques achieve state-of-the-art results on many related datasets (Garg et al., 2020; Mass et al., 2019; Rücklé et al., 2019b).…”
Section: Models and Training (mentioning)
Confidence: 81%
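The training objective this statement refers to, binary cross-entropy over scored question pairs, can be sketched as follows. This toy PyTorch pair classifier is an assumption for illustration; the cited systems use stronger encoders:

import torch
import torch.nn as nn

class PairScorer(nn.Module):
    """Toy duplicate-question scorer: mean-pooled embeddings of both
    questions are concatenated and mapped to a single logit."""
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab_size, dim)   # mean pooling by default
        self.clf = nn.Linear(2 * dim, 1)

    def forward(self, q1_ids, q2_ids):
        v = torch.cat([self.emb(q1_ids), self.emb(q2_ids)], dim=-1)
        return self.clf(v).squeeze(-1)                # one logit per pair

model = PairScorer()
loss_fn = nn.BCEWithLogitsLoss()                      # binary cross-entropy
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy step on two pairs: label 1 = duplicate, 0 = not a duplicate.
q1 = torch.randint(0, 10000, (2, 12))
q2 = torch.randint(0, 10000, (2, 12))
labels = torch.tensor([1.0, 0.0])
loss = loss_fn(model(q1, q2), labels)
opt.zero_grad()
loss.backward()
opt.step()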
“…Poerner and Schütze (2019) adapt the combination of different sentence embeddings to individual target domains. Rücklé et al. (2019b) use weakly supervised training, self-supervised training methods, and question generation. Similar approaches were also explored in ad-hoc retrieval (Zhang et al., 2020; Ma et al., 2020; MacAvaney et al., 2019).…”
Section: Related Work (mentioning)
Confidence: 99%
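The question generation mentioned in this statement can be approximated with an off-the-shelf sequence-to-sequence model that maps a question body to a short question; in the cited setup such a model is first trained on body-to-title pairs. The checkpoint, prompt, and decoding settings below are illustrative assumptions, not the paper's configuration:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tok = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

body = ("I have two dicts, a and b, and I want a single dict "
        "containing the keys and values of both.")

# The untuned checkpoint only demonstrates the interface; fine-tuning on
# (body -> title) pairs is what would make the outputs usable duplicates.
inputs = tok("generate question: " + body, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=24, num_beams=4)
print(tok.decode(out[0], skip_special_tokens=True))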