Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
DOI: 10.18653/v1/2020.emnlp-demos.13
A Technical Question Answering System with Transfer Learning

Abstract: In recent years, the need for community technical question-answering sites has increased significantly. However, it is often expensive for human experts to provide timely and helpful responses on those forums. We develop TransTQA, a novel system that offers automatic responses by retrieving proper answers based on correctly answered similar questions in the past. TransTQA is built upon a siamese ALBERT network, which enables it to respond quickly and accurately. Furthermore, TransTQA adopts a standard…
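The retrieval setup described in the abstract — encoding the new question and previously answered questions with a shared (siamese) encoder, then returning the answer of the most similar past question — can be sketched as follows. This is a minimal illustration only: the toy hashed bag-of-words `embed` function stands in for the ALBERT encoder, and the corpus and scoring details are assumptions, not the authors' implementation.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a shared (siamese) encoder: maps text to a
    deterministic unit vector via a hashed bag-of-words."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query: str, qa_pairs: list) -> str:
    """Score the query against previously answered questions with
    cosine similarity and return the answer of the best match."""
    q_vec = embed(query)
    scores = [float(q_vec @ embed(q)) for q, _ in qa_pairs]
    return qa_pairs[int(np.argmax(scores))][1]

# Hypothetical mini-corpus of previously answered technical questions.
corpus = [
    ("how do I reset my ssh key", "Run ssh-keygen to generate a new key."),
    ("how to update kernel drivers", "Use your distro's package manager."),
]
print(retrieve("reset ssh key", corpus))
```

Because both branches share the same `embed` function, the sketch preserves the key siamese property: questions are compared in a single shared embedding space, so new candidate answers can be scored without retraining a pairwise classifier.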


Cited by 19 publications (13 citation statements) · References 27 publications
“…The pre-trained model captures rich semantic patterns by training on large-scale corpora. Recent works [25,36] have shown their effectiveness in various natural language processing areas. A notable example is BERT [9], which is built on the transformer architecture [30].…”
Section: Pre-training Techniques
confidence: 99%
“…Thus, they pave the way to helping specific NLP tasks even when only a small amount of task-specific labeled data is available. In fact, these PLMs have been proven to significantly improve numerous natural language processing tasks such as text classification [28], question answering [36], and single-sentence tagging [9]. It is evident from recent works that these PLMs can also help tackle the aforementioned two challenges in query understanding [6,21].…”
Section: Introduction
confidence: 99%
“…Yu Wenhao et al. [11] developed TransTQA, a novel system that gives automatic responses by retrieving proper answers based on correctly answered similar questions in the past. TransTQA is configured with the ALBERT model according to the required arguments that define the model architecture, producing transformer encoders.…”
Section: Literature Review
confidence: 99%
“…The COBERT system searches a collection of 59K coronavirus-related papers made accessible through the Coronavirus Open Research Dataset Challenge (CORD-19) [9]. Recently, QA models have made large advances in both performance and throughput [10][11][12]. Such improvements can be credited to the introduction of large-scale QA datasets and deep learning models, and the recent focus of research on the efficiency of such models.…”
Section: Introduction
confidence: 99%
“…Recent advances in deep learning technologies with transfer learning have achieved great success in a variety of NLP tasks (Ruder et al, 2019). Several research works in this domain have greatly enriched the application and technology of transfer learning on question answering from different perspectives (Min et al, 2017; Deng et al, 2018; Castelli et al, 2020; Yu et al, 2020a). Although transfer learning has been successfully applied to various QA applications, its applicability to technical QA has yet to be investigated.…”
Section: Related Work
confidence: 99%