2022
DOI: 10.1016/j.patcog.2021.108260
Bayesian compression for dynamically expandable networks

Cited by 11 publications (3 citation statements)
References 9 publications
“…Evaluating and comparing neurosymbolic generative models to regular generative models is crucial for understanding their strengths and weaknesses. This can be achieved through various benchmarks and tasks, as well as by analyzing their performance in different domains and applications [24]. Metrics such as accuracy, interpretability, and efficiency can be used to compare the performance of neurosymbolic generative models to purely symbolic or neural approaches, providing insights into their effectiveness and potential for future research and applications.…”
Section: Evaluation and Comparison of Neurosymbolic Generative Models (mentioning)
confidence: 99%
“…In online learning, when a DNN model is used for continuous learning, it follows the lifelong learning concept 11. Researchers have developed various methods to implement lifelong learning 12.…”
Section: Related Work (mentioning)
confidence: 99%
“…In this approach, the model can only add neurons in the topmost layer. Furthermore, Yoon et al. 11 have proposed dynamically expandable networks, in which neurons can be added at any desired layer.…”
Section: Related Work (mentioning)
confidence: 99%
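The last statement contrasts topmost-layer growth with dynamically expandable networks, where capacity is added at an arbitrary layer. As a minimal sketch of that idea (not the authors' method: the function name, initialization scale, and MLP representation here are illustrative assumptions), widening hidden layer *i* of an MLP means appending columns to the weight matrix that produces its activations and matching rows to the matrix that consumes them; zero-initializing the outgoing rows preserves the network's current function at expansion time:

```python
import numpy as np

def expand_layer(weights, layer_idx, k):
    """Widen hidden layer `layer_idx` by k neurons.

    `weights` is a list of matrices for an MLP, weights[i] of shape
    (fan_in, fan_out). New incoming weights are small random values;
    new outgoing weights are zero, so outputs are unchanged until the
    new neurons are trained.
    """
    W_in = weights[layer_idx]       # produces this layer's activations
    W_out = weights[layer_idx + 1]  # consumes this layer's activations
    fan_in, _ = W_in.shape
    # k new columns: inputs to the k new neurons
    new_in = np.random.randn(fan_in, k) * 0.01
    weights[layer_idx] = np.concatenate([W_in, new_in], axis=1)
    # k new rows of zeros: the new neurons contribute nothing yet
    new_out = np.zeros((k, W_out.shape[1]))
    weights[layer_idx + 1] = np.concatenate([W_out, new_out], axis=0)
    return weights

# Toy 4 -> 8 -> 8 -> 2 MLP; expand the second hidden layer by 3 neurons.
ws = [np.random.randn(4, 8), np.random.randn(8, 8), np.random.randn(8, 2)]
ws = expand_layer(ws, 1, 3)
print(ws[1].shape, ws[2].shape)  # (8, 11) (11, 2)
```

A selective-expansion method would additionally decide *which* layer to widen and by how many units, e.g. from a loss threshold, which this sketch leaves out.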