2021
DOI: 10.1016/j.eswa.2021.114910

A non-factoid question answering system for prior art search

Cited by 9 publications (1 citation statement)
References 10 publications
“…BERT (Bidirectional Encoder Representations from Transformers) is a language representation model that has obtained state-of-the-art results on several natural language processing tasks (Devlin et al. 2019). BERT considers a text sequence in both directions (from left to right and from right to left) for a given task, which (in contrast to LDA) enables it to characterize the syntactic and semantic features of the context (Zihayat and Etwaroo 2021). When LDA and BERT are combined, in the form of LDA-BERT, topic modeling has been shown to be useful in analyzing literature on artificial intelligence in sustainable energy and has successfully extracted major research topics in that field (Saheb et al. 2022).…”

Section: Introduction
confidence: 99%