2018
DOI: 10.3389/fpsyg.2017.02302
Expectation-Maximization-Maximization: A Feasible MLE Algorithm for the Three-Parameter Logistic Model Based on a Mixture Modeling Reformulation

Abstract: Stable maximum likelihood estimation (MLE) of item parameters in the 3PLM with a modest sample size remains a challenge. The current study presents a mixture-modeling approach to the 3PLM, based on which a feasible Expectation-Maximization-Maximization (EMM) MLE algorithm is proposed. The simulation study indicates that EMM is comparable to the Bayesian EM in terms of bias and RMSE. EMM also produces smaller standard errors (SEs) than MMLE/EM. In order to further demonstrate the feasibility, the method has also been ap…

Cited by 9 publications (13 citation statements)
References 46 publications
“…A reformulation of the ability-based 3PLM, similar to Zheng et al. (2017), can be derived readily. Following Culpepper (2015), we may introduce a latent variable v_ij ∈ V, with v_ij ~ Bernoulli(P_i*(θ_j)):…”
Section: Mixture-Modeling Approach to the 3PLM
confidence: 96%
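The latent-variable augmentation quoted above can be checked numerically. The sketch below (item and person values are illustrative, not taken from the paper) draws a guessing indicator and the Bernoulli latent variable v_ij ~ Bernoulli(P*(θ)), and confirms that the resulting response frequency matches the closed-form 3PLM probability g + (1 − g)·P*(θ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 3PLM item parameters (assumed for demonstration)
a, b, g = 1.2, 0.3, 0.2   # discrimination, difficulty, guessing
theta = 0.5               # a single examinee's ability

def p_star(theta, a, b):
    """2PL component probability P*(theta) used inside the mixture."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Closed-form 3PLM probability of a correct response
p_3pl = g + (1.0 - g) * p_star(theta, a, b)

# Mixture simulation: guessing process first, then the latent
# 2PL process v ~ Bernoulli(P*(theta)) from the reformulation
n = 200_000
w = rng.random(n) < g                      # guessing succeeds
v = rng.random(n) < p_star(theta, a, b)    # v_ij ~ Bernoulli(P*)
y = w | v                                  # correct if either process succeeds

print(round(p_3pl, 3), round(y.mean(), 3))  # the two should agree closely
```

The empirical frequency converges to the 3PLM curve because P(y = 1) = g + (1 − g)·P*, which is exactly the marginal implied by the augmentation.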
“…Two arrangements of these processes can be identified: the g-process comes first, or the other way around; thus two different versions of the reformulation of the 3PLM can be developed. Interestingly, the one proposed by Zheng et al. (2017) corresponds to the version with the g-process coming first, while Béguin and Glas (2001) proposed an ability-based reformulation for the three-parameter normal ogive model (3PNO) that coincides with the version with the p-process coming first. The two reformulations are briefly reviewed here as the starting point of the two BEMM algorithms…”
Section: Mixture-Modeling Approach to the 3PLM
confidence: 99%
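The two arrangements described above yield the same marginal correct-response probability, which is why both are valid reformulations of the 3PLM. A minimal check with assumed parameter values (not from the paper):

```python
# Illustrative values: guessing parameter g and 2PL component P*(theta)
g = 0.2
p_star = 0.7

# Arrangement 1: g-process first -- guess with probability g,
# otherwise answer via the 2PL process with probability P*
p_g_first = g + (1 - g) * p_star

# Arrangement 2: p-process first -- "know" with probability P*,
# otherwise fall back to guessing with probability g
p_p_first = p_star + (1 - p_star) * g

# Both expand to g + P* - g*P*, i.e. the 3PLM probability
print(p_g_first, p_p_first)
```

Algebraically, g + (1 − g)P* = P* + (1 − P*)g = g + P* − gP*, so the ordering of the two processes changes the latent structure (and hence the E-step) but not the observed-data likelihood.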
“…The R package IRTBEMM (Guo et al., 2020) is designed to implement the family of Bayesian Expectation-Maximization-Maximization (BEMM) algorithms for estimating unidimensional dichotomous item response models with guessing or slipping parameters. The BEMM family includes (a) the BEMM algorithm (Guo & Zheng, 2019) for the three-parameter logistic (3PL) model and the one-parameter logistic guessing (1PL-G) model; (b) the Bayesian Expectation-Maximization-Maximization-Maximization (BE3M) algorithm (Zhang et al., 2018) for the four-parameter logistic (4PL) model and the one-parameter logistic ability-based guessing (1PL-AG) model; and (c) their maximum likelihood estimation versions developed by Zheng et al. (2018). Both Bayesian modal estimates and maximum likelihood estimates are available.…”
confidence: 99%