2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489656

A Fully Attention-Based Information Retriever

Abstract: Recurrent neural networks are now the state-of-the-art in natural language processing because they can build rich contextual representations and process texts of arbitrary length. However, recent developments in attention mechanisms have equipped feedforward networks with similar capabilities, hence enabling faster computations due to the increase in the number of operations that can be parallelized. We explore this new type of architecture in the domain of question-answering and propose a novel approach that w…
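The abstract's central claim is that attention lets a feedforward network build contextual representations while processing all positions in parallel, whereas a recurrent network must step through tokens one at a time. A minimal NumPy sketch of scaled dot-product self-attention (Vaswani et al., 2017) illustrates that parallelism; it is a generic illustration, not the paper's specific retriever architecture (only partially visible in this abstract), and all names in it are ours.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d) query, key, and value matrices.
    # Every output row comes from the same two matrix products,
    # so all sequence positions are computed at once; an RNN
    # would have to iterate over the sequence token by token.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # one context vector per position

# Self-attention over a toy sequence of 5 tokens with 8-dim embeddings.
x = np.random.randn(5, 8)
print(scaled_dot_product_attention(x, x, x).shape)   # (5, 8)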


Cited by 2 publications (1 citation statement)
References 36 publications
“…Investigating data efficiency of similar solutions to tasks like QA (Question Answering, Correia et al., 2018) with standard datasets such as SQuAD (Rajpurkar et al., 2018) could also be valuable.…”
Section: Efficiency Results
Confidence: 99%