2022
DOI: 10.3102/10769986221116905
A Collection of Numerical Recipes Useful for Building Scalable Psychometric Applications

Abstract: This article is concerned with a subset of numerically stable and scalable algorithms useful to support computationally complex psychometric models in the era of machine learning and massive data. The subset selected here is a core set of numerical methods that should be familiar to computational psychometricians and considers whitening transforms for dealing with correlated data, computational concepts for linear models, multivariable integration, and optimization techniques.

Cited by 2 publications (2 citation statements)
References 62 publications
“…Several methods have been proposed in the literature for high-dimensional integral approximation, including stochastic expectation maximization (EM), Metropolis-Hastings Robbins-Monro (MH-RM), and both fixed and adaptive quadrature [19]. Stochastic methods are considered more computationally efficient than numeric quadrature when the number of latent dimensions (i.e., number of constructs) exceeds three, as the requisite number of evaluations only increases linearly [20].…”
Section: Integral Approximation
confidence: 99%
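The cost comparison in the statement above can be made concrete with a minimal sketch (an illustration assumed here, not code from the cited article): a full tensor-product quadrature grid requires q^d function evaluations for q nodes per dimension and d latent dimensions, while a stochastic method's number of draws need not grow with d.

```python
# Contrast fixed tensor-product quadrature cost with a stochastic approach
# as the number of latent dimensions d grows. The values q and n_draws are
# illustrative assumptions, not figures from the article.

q = 15          # quadrature nodes per dimension
n_draws = 2000  # Monte Carlo draws per iteration (held fixed)

for d in range(1, 7):
    grid_evals = q ** d  # full grid: exponential in d
    print(f"d={d}: quadrature evaluations = {grid_evals:>12,}, "
          f"stochastic draws = {n_draws}")
```

At d = 6 the grid already needs over 11 million evaluations, which is why the quoted statement recommends stochastic methods once the number of constructs exceeds three.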
“…Adaptive quadrature tries to overcome this limitation by decreasing the number of points needed for accurate approximation, but is still subject to computational complexity that grows exponentially with the number of dimensions [21,22]. For a recent review of related methods, see [19].…”
Section: Integral Approximation
confidence: 99%
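For readers unfamiliar with the fixed-quadrature baseline the statements discuss, here is a minimal one-dimensional Gauss-Hermite sketch (an assumed illustration, not code from the cited article): approximating an expectation under a standard normal latent trait, the setting where these rules are exact for low-degree polynomial integrands.

```python
import numpy as np

# Gauss-Hermite nodes/weights approximate integrals of the form
# ∫ f(x) exp(-x^2) dx. A change of variables x = z / sqrt(2) turns this
# into an expectation under the standard normal density.
nodes, weights = np.polynomial.hermite.hermgauss(21)

z = np.sqrt(2.0) * nodes       # rescaled nodes for N(0, 1)
w = weights / np.sqrt(np.pi)   # rescaled weights summing to 1

# E[theta^2] under N(0, 1) is exactly 1; the 21-node rule recovers it.
second_moment = np.sum(w * z ** 2)
print(round(second_moment, 6))  # → 1.0
```

Extending this to d latent dimensions means taking the tensor product of such rules, which is exactly the exponential growth in evaluations that adaptive quadrature mitigates but, as the statement notes, does not eliminate.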