2023
DOI: 10.1016/j.neucom.2023.126518
Neural module networks: A review

Cited by 2 publications (2 citation statements). References 28 publications.
“…In the field of neural networks, some examples are: 1) "Old-school" attention (Bahdanau et al, 2015; Jain & Wallace, 2019), where attention points to which input tokens are important. 2) Neural Modular Networks (Andreas et al, 2016; Gupta et al, 2020; Fashandi, 2023), which produce a prediction via a sequence of sub-models, each with known behavior. 3) Prototypical Networks (Bien & Tibshirani, 2009; Kim et al, 2014; Chen et al, 2019), which predict by finding similar training observations.…”
Section: The Intrinsic Paradigm (mentioning confidence: 99%)
“…Likewise, Neural Modular Networks produce an executable program composed of sub-networks, such as find-max-num(filter(find())), which is interpretable (Fashandi, 2023; Andreas et al, 2016; Gupta et al, 2020). However, each sub-network (find-max-num, filter, find) is itself a black-box model with little guarantee that it operates as intended (Amer & Maul, 2019; Subramanian et al, 2020; Lyu et al, 2022).…”
Section: The Case Against the Intrinsic Paradigm (mentioning confidence: 99%)
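The program structure described above, find-max-num(filter(find())), can be sketched in plain Python. This is a hedged toy illustration, not the actual neural architecture: each "module" here is an ordinary function over a hypothetical scene representation, whereas in a real neural module network each would be a learned sub-network. All names (`scene`, `filter_color`, `find_max_num`) are illustrative assumptions.

```python
# Toy sketch of the neural-module-network idea: a question is mapped to an
# executable program assembled from named sub-modules. In a real NMN each
# module is a small neural sub-network; here they are plain functions.

# Hypothetical scene: objects with a color and a numeric attribute.
scene = [
    {"color": "red", "num": 3},
    {"color": "blue", "num": 7},
    {"color": "red", "num": 5},
]

def find(objects):
    """A 'find' module: return all candidate objects (a neural version
    would instead produce an attention map over the scene)."""
    return list(objects)

def filter_color(objects, color):
    """A 'filter' module: keep only objects matching the given color."""
    return [o for o in objects if o["color"] == color]

def find_max_num(objects):
    """A 'find-max-num' module: return the largest 'num' attribute
    among the selected objects."""
    return max(o["num"] for o in objects)

# The assembled program find_max_num(filter(find())) answers, e.g.,
# "what is the largest number on a red object?"
answer = find_max_num(filter_color(find(scene), "red"))
print(answer)  # 5
```

The composition is transparent at the program level (one can read off which modules ran and in what order), which is exactly the interpretability claim; the critique quoted above is that each individual module, once neural, remains a black box.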