2022
DOI: 10.1016/j.neucom.2022.05.114
Functional gradient descent for n-tuple regression

Cited by 2 publications (2 citation statements)
References 19 publications
“…N-tuple regressors have the advantage of combining low usage of computational resources in training and prediction with a high capability of generalization and a low number of hyperparameters. In this context, it is important to include a forgetting mechanism in order to deal with changes in the Q distribution [12]. Policy: The policy uses the Q values predicted for each action to choose how to act.…”
Section: State and Encoder
confidence: 99%
“…The prediction process is the average of the outputs of repeating patterns in the input. Katopodis et al [12] proposed the inclusion of a forgetting mechanism using a discounting rate, which improves the performance of the model in sequential decision contexts, such as reinforcement learning.…”
Section: Introduction
confidence: 99%