2021
DOI: 10.18280/ria.350404

Query-Based Retrieval Using Universal Sentence Encoder

Abstract: In natural language processing, many tasks can be built on the features provided by word embeddings. For larger chunks of text such as sentences, however, word embeddings alone are not sufficient. Sentence embeddings address this by representing complete sentences, together with their semantic information, as vectors, making it easier for a machine to capture context. In this paper, we propose a Question Answering …
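
As a rough illustration of the retrieval setup the abstract describes, the sketch below embeds a query and a few candidate sentences with the Universal Sentence Encoder and ranks the candidates by cosine similarity. The TensorFlow Hub module URL, the example sentences, and the ranking loop are illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal sketch: rank candidate sentences against a query using
# Universal Sentence Encoder embeddings and cosine similarity.
# Assumes the public TF Hub module; not the paper's exact pipeline.
import numpy as np
import tensorflow_hub as hub

# Load the Universal Sentence Encoder (produces 512-dim vectors).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

query = "How do sentence embeddings capture meaning?"
candidates = [
    "Sentence embeddings represent whole sentences as fixed-size vectors.",
    "Word embeddings assign a vector to each individual token.",
    "The weather today is sunny with a light breeze.",
]

# Embed the query and the candidate sentences in one call.
vectors = embed([query] + candidates).numpy()
query_vec, cand_vecs = vectors[0], vectors[1:]

# Cosine similarity between the query and each candidate.
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [cosine(query_vec, v) for v in cand_vecs]
for sentence, score in sorted(zip(candidates, scores), key=lambda p: -p[1]):
    print(f"{score:.3f}  {sentence}")
```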

Cited by 2 publications (1 citation statement)
References 12 publications (12 reference statements)
“…Even though the second one requires less processing, it is less accurate. A 512-dimensional vector is produced when different lengths of English text are provided [24]. Sentences were encoded into embedding vectors using two alternative methods.…”
Section: Sentence Embedding: Transformer-Based USE
Citation type: mentioning (confidence: 99%)
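
The cited statement refers to the two encoder architectures available for the Universal Sentence Encoder (a Transformer-based variant and a lighter DAN-based variant), both of which map English text of any length to a 512-dimensional vector. Below is a minimal check of that property, assuming the publicly released TensorFlow Hub modules; the URLs and example texts are illustrative.

```python
# Minimal check, assuming the public TF Hub modules: both USE variants
# (DAN-based standard and Transformer-based "large") map texts of any
# length to 512-dimensional vectors.
import tensorflow_hub as hub

texts = [
    "Short query.",
    "A much longer passage of English text still collapses to the same "
    "fixed-size representation regardless of its length.",
]

for url in (
    "https://tfhub.dev/google/universal-sentence-encoder/4",        # DAN-based
    "https://tfhub.dev/google/universal-sentence-encoder-large/5",  # Transformer-based
):
    embeddings = hub.load(url)(texts)
    print(url.rsplit("/", 2)[-2], embeddings.shape)  # (2, 512) in both cases
```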