Proceedings of the Workshop on Machine Reading for Question Answering 2018
DOI: 10.18653/v1/w18-2606

Robust and Scalable Differentiable Neural Computer for Question Answering

Abstract: Deep learning models are often not easily adaptable to new tasks and require task-specific adjustments. The differentiable neural computer (DNC), a memory-augmented neural network, is designed as a general problem solver which can be used in a wide range of tasks. But in reality, it is hard to apply this model to new tasks. We analyze the DNC and identify possible improvements within the application of question answering. This motivates a more robust and scalable DNC (rsDNC). The objective precondition is to ke…

Citations: Cited by 14 publications (13 citation statements)
References: 7 publications (6 reference statements)
“…This suggests that the agent improves its performance when it learns to exploit the relevant step information stored in the memory. Note that the matching accuracy is influenced both by the high variability of the learning process in MANNs [Franke et al., 2018] and by the fact that the memory output can sometimes be ignored in favor of the LSTM output. This is highlighted by the variance, although it remains reliable (accuracy ≥ 0.6) also in the worst-case scenarios.…”
Section: Methods (mentioning; confidence: 99%)
“…Differentiable Neural Computers (DNCs) [Graves et al., 2016] have recently been introduced as a memory-augmented machine learning model based on the usage of a controller network. A controller is a neural network, typically recurrent with LSTM units, that (1) has read and write access to an external memory through multiple read and write heads, and (2) learns both how to perform these operations and on which data to run them.…”
Section: Simplified DNC (mentioning; confidence: 99%)
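The excerpt above summarizes the core DNC mechanism: a recurrent controller that reads from and writes to an external memory through addressable heads. Purely as an illustration (not the architecture of Graves et al. or the rsDNC of the indexed paper), a minimal PyTorch sketch of such a controller with a single content-addressed read head and write head could look as follows; every class, parameter, and dimension name here is an assumption made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMemoryController(nn.Module):
    """Illustrative DNC-style controller: LSTM + content-addressed memory (hypothetical sketch)."""

    def __init__(self, input_size, hidden_size, mem_slots=16, mem_width=8):
        super().__init__()
        self.mem_slots, self.mem_width = mem_slots, mem_width
        # The controller sees the current input plus the previous read vector.
        self.lstm = nn.LSTMCell(input_size + mem_width, hidden_size)
        # Interface layer emits a read key, a write key, and the write content.
        self.interface = nn.Linear(hidden_size, 3 * mem_width)
        self.output = nn.Linear(hidden_size + mem_width, hidden_size)

    def address(self, memory, key):
        # Content-based addressing: cosine similarity -> softmax over memory slots.
        sim = F.cosine_similarity(memory, key.unsqueeze(1), dim=-1)
        return F.softmax(sim, dim=-1)

    def forward(self, inputs):
        # inputs: (batch, time, input_size)
        batch, time, _ = inputs.shape
        memory = torch.zeros(batch, self.mem_slots, self.mem_width)
        read_vec = torch.zeros(batch, self.mem_width)
        h = torch.zeros(batch, self.lstm.hidden_size)
        c = torch.zeros(batch, self.lstm.hidden_size)
        outputs = []
        for t in range(time):
            h, c = self.lstm(torch.cat([inputs[:, t], read_vec], dim=-1), (h, c))
            read_key, write_key, write_vec = self.interface(h).chunk(3, dim=-1)
            # Write head: add key-weighted content (erase gate omitted for brevity).
            w_write = self.address(memory, write_key)              # (batch, slots)
            memory = memory + w_write.unsqueeze(-1) * write_vec.unsqueeze(1)
            # Read head: retrieve a weighted mix of memory rows for the next step.
            w_read = self.address(memory, read_key)
            read_vec = (w_read.unsqueeze(-1) * memory).sum(dim=1)
            outputs.append(self.output(torch.cat([h, read_vec], dim=-1)))
        return torch.stack(outputs, dim=1)

This sketch deliberately omits the components that make the full DNC work in practice (erase vectors, allocation weighting, the temporal link matrix, usage-based freeing, multiple heads); it only shows how the controller state and the memory read interact at each time step.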
“…Various types of learning were considered: supervised, unsupervised, semi-supervised, weakly-supervised, transfer learning, reinforcement learning, active learning, one-shot learning, etc. The Differentiable Neural Computer model (which augments a neural network with external memory) [26], structural equation models [27], correlations with p-value testing, Granger causality tests, and an abductive reasoning approach (logical inference that starts with one or more observations and searches for the simplest explanation for them) were also considered.…”
Section: Related Work (mentioning; confidence: 99%)
“…These works keep using memory for storing data rather than the weights of the network and are thus parallel to our approach. Other DNC modifications [8,9] are also orthogonal to our work.…”
Section: Related Work (mentioning; confidence: 99%)