1996
DOI: 10.1007/bf00140869

A general maximum likelihood analysis of overdispersion in generalized linear models

Abstract: This paper presents an EM algorithm for maximum likelihood estimation in generalized linear models with overdispersion. The algorithm is initially derived as a form of Gaussian quadrature assuming a normal mixing distribution, but with only slight variation it can be used for a completely unknown mixing distribution, giving a straightforward method for the fully nonparametric ML estimation of this distribution. This is of value because the ML estimates of the GLM parameters may be sensitive to the specification…
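The abstract describes EM estimation of a completely unknown mixing distribution as a finite set of mass points. A minimal sketch of that idea, for the simplest case of a Poisson model with no covariates, is below; the function name, the quantile-based initialisation, and the fixed iteration count are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np
from scipy.stats import poisson

def npml_poisson(y, K=3, iters=200):
    """NPML estimate of a Poisson mixing distribution as K mass points, via EM."""
    y = np.asarray(y)
    # initialise mass points at spread-out quantiles of the data, equal masses
    theta = np.quantile(y, np.linspace(0.1, 0.9, K)) + 0.5
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: posterior probability that observation i came from mass point k
        like = poisson.pmf(y[:, None], theta[None, :]) * pi   # shape (n, K)
        w = like / like.sum(axis=1, keepdims=True)
        # M-step: closed-form updates for the masses and mass-point locations
        pi = w.mean(axis=0)
        theta = (w * y[:, None]).sum(axis=0) / w.sum(axis=0)
    return theta, pi
```

In the full GLM setting the M-step would instead refit a weighted regression for the fixed effects alongside the mass points, but the E-step weighting is the same.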

Cited by 186 publications (172 citation statements)
References 34 publications (43 reference statements)
“…Less dramatic evidence occurs for other types of models (Davies, 1987; Neuhaus et al., 1992; Butler and Louis, 1992). In light of this evidence, some recent work has focused on non-parametric approaches (Heckman and Singer, 1984; Davies, 1987; Wood and Hinde, 1987; Follmann and Lambert, 1989; Butler and Louis, 1992; Wedel and DeSarbo, 1995; Aitkin, 1996; Aitkin, 1999). A referee has pointed out to us that this work has connections with a semi-parametric approach to estimation in the psychometric literature on latent trait and latent class models.…”
Section: Model Fitting: A Semi-parametric Approach
confidence: 99%
“…Despite this, recommendations have relied on standard ML theory. For instance, the likelihood-ratio test has been used for testing fixed parameters and making model comparisons (Davies, 1987; Wood and Hinde, 1987; Aitkin, 1996). Hartzel (1999) examined the Wald and likelihood-ratio tests for multinomial logit random effects models and concluded that they provided reasonably appropriate inference for the NPML approach.…”
Section: Inference and Prediction
confidence: 99%
“…The "Fabric faults" data set consists of 32 observations of number of faults in rolls of fabric of different length (Aitkin, 1996). The dependent variable is the number of faults (n.fault) and the covariate is the length of role in meters (length).…”
Section: Fabric Faultsmentioning
confidence: 99%
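A baseline fit for data of this kind is a log-link Poisson regression of the fault count on roll length. The actual fabric-faults values are not reproduced in this report, so the sketch below simulates 32 stand-in observations; `poisson_irls`, the seed, and the coefficients are illustrative assumptions, and overdispersion relative to this baseline is what motivates the paper's mixture approach.

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Fit a log-link Poisson GLM by iteratively reweighted least squares."""
    # start from a least-squares fit on the log scale to keep IRLS stable
    beta, *_ = np.linalg.lstsq(X, np.log(y + 0.5), rcond=None)
    for _ in range(iters):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu            # working response for the log link
        W = mu                             # IRLS weights: Var(y) = mu for Poisson
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# synthetic stand-in for the 32 rolls: lengths in metres, simulated fault counts
rng = np.random.default_rng(1)
length = rng.uniform(100.0, 1000.0, 32)
X = np.column_stack([np.ones(32), np.log(length)])
y = rng.poisson(np.exp(-3.0 + 0.9 * np.log(length)))
beta_hat = poisson_irls(X, y)
```

Using log(length) as the covariate makes the slope an elasticity: a coefficient near 1 would mean faults grow roughly proportionally with roll length.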
“…A common strategy for guarding against such misspecification is to build more flexible distributional assumptions for the random effects into the model. For instance, Aitkin (1996, 1999) proposed estimating this distribution nonparametrically as a finite number of mass points and corresponding probabilities. Magder and Zeger (1996), Verbeke & Lesaffre (1996), and Chen, Zhang & Davidian (2002) constructed alternative nonparametric estimates based on mixtures of Gaussian distributions.…”
Section: Introduction
confidence: 99%
“…That is, with the exception of the nonparametric approach of Aitkin (1996, 1999), for which there exist GLIM macros, we are not aware of any commercial software or macros that easily implement these more general models. Before investing time programming one of these more robust approaches, one would like to be able to diagnose whether such methods are required.…”
Section: Introduction
confidence: 99%