2017
DOI: 10.1111/rssa.12300

Mixed Generalized Akaike Information Criterion for Small Area Models

Abstract: A mixed generalized Akaike information criterion, xGAIC, is introduced and validated. It is derived from a quasi-log-likelihood that focuses on the random effect and the variability between the areas, and from a generalized degrees-of-freedom measure, used as a model complexity penalty, which is calculated by the bootstrap. To study the performance of xGAIC, we consider three popular mixed models in small area inference: a Fay–Herriot model, a monotone model and a penalized spline model. A simulation study sho…


Cited by 17 publications (28 citation statements)
References 35 publications (84 reference statements)
“…found that both the McAIC and a model-averaging procedure based on McAIC (which has more appropriate weights) work better than cAIC in terms of prediction errors. They prove empirically the same results in the case of small area prediction, the topic on which Lombardía et al. (2017) focus. They therefore show a prediction-error improvement of CScAIC with respect to cAIC.…”
Section: Review of Simulations (supporting)
confidence: 58%
“…when σ² is unknown and estimated by its ML estimator, and B*_c is the bias correction. Lombardía et al. (2017) introduce a mixed generalized Akaike information criterion, xGAIC, for SAE models. One typical model used in the field of SAE is the Fay–Herriot model, a particular type of LMM containing only one random effect, the intercept.…”
Section: Fixed Effects Selection (mentioning)
confidence: 99%
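The Fay–Herriot model mentioned in the statement above can be sketched numerically. The following is a minimal illustration only, not the authors' implementation: the simulated data, the grid-search profile-likelihood fit, and the helper name `fh_fit` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy area-level Fay-Herriot model:
#   y_i = x_i' beta + u_i + e_i,  u_i ~ N(0, s2u),  e_i ~ N(0, D_i), D_i known.
m = 50
X = np.column_stack([np.ones(m), rng.normal(size=m)])
beta = np.array([1.0, 2.0])
s2u_true = 0.5
D = rng.uniform(0.3, 0.7, size=m)              # known sampling variances
y = (X @ beta
     + rng.normal(scale=np.sqrt(s2u_true), size=m)
     + rng.normal(scale=np.sqrt(D), size=m))

def fh_fit(y, X, D, s2u_grid=np.linspace(1e-4, 5.0, 2000)):
    """Profile-likelihood grid search for s2u, with GLS for beta at each value."""
    best = None
    for s2u in s2u_grid:
        V = s2u + D                             # marginal variances (diagonal)
        W = 1.0 / V
        b = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
        r = y - X @ b
        ll = -0.5 * (np.log(V).sum() + (W * r**2).sum())
        if best is None or ll > best[0]:
            best = (ll, s2u, b)
    return best[1], best[2]

s2u_hat, beta_hat = fh_fit(y, X, D)
# EBLUP of the area means: shrink direct estimates toward the synthetic part.
gamma = s2u_hat / (s2u_hat + D)
eblup = gamma * y + (1 - gamma) * (X @ beta_hat)
```

The shrinkage weight `gamma` shows why the model suits small areas: areas with large sampling variance `D_i` borrow more strength from the regression part.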
“…The prediction quality could be measured using various criteria, for instance, Akaike's Information Criterion (AIC): "in general terms, the value of AIC for a model M is defined as AIC(M) = −2 log{l(M)} + 2D, where l(M) is the model likelihood and D is a penalty term, which was originally equal to the number of parameters in the model, p" [75]. Another alternative is the Corrected Akaike's Information Criterion (CAIC), a bias-corrected version of AIC for a small number of observations used for estimation [74].…”
Section: Risk Assessment Workflow Using Machine Learning (mentioning)
confidence: 99%
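The AIC definition quoted above can be made concrete with a short sketch. The Gaussian linear models, the simulated data, and the helper name `aic_ols` below are illustrative assumptions, not part of the cited work.

```python
import numpy as np

def aic_ols(y, X):
    """AIC(M) = -2 log l(M) + 2D for a Gaussian linear model fitted by ML.

    D counts the regression coefficients plus the error variance.
    """
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = (resid ** 2).mean()                    # ML variance estimate
    loglik = -0.5 * n * (np.log(2 * np.pi * s2) + 1.0)
    return -2.0 * loglik + 2 * (p + 1)

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X1 = np.ones((n, 1))                            # intercept-only model
X2 = np.column_stack([np.ones(n), x])           # model with the covariate
# The covariate model fits far better, so its AIC is lower despite the
# larger penalty term D.
print(aic_ols(y, X2) < aic_ols(y, X1))          # → True
```

The penalty `2D` is what distinguishes AIC from a pure goodness-of-fit measure: adding a parameter must improve the log-likelihood by more than one unit to lower the criterion.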