2014
DOI: 10.20982/tqmp.10.2.p143
Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

Abstract: Missing data are a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion, or mean substitution as methods for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation-maximization (EM) covariances (or correlations) as input…
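The approach the abstract describes — estimate an EM covariance matrix from the incomplete data, then feed that matrix (rather than a deletion- or mean-substitution-based one) to the factor-analysis routine — can be illustrated outside SPSS. The following is a minimal NumPy sketch of EM estimation of a multivariate-normal mean and covariance with values missing at random; the function name `em_cov` and all implementation details are illustrative, not the authors' SPSS syntax.

```python
import numpy as np

def em_cov(X, n_iter=100, tol=1e-6):
    """Estimate the mean and covariance of multivariate-normal data
    containing NaNs via expectation maximization (missing at random)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    miss = np.isnan(X)
    # Initialize from column means and a mean-imputed covariance.
    mu = np.nanmean(X, axis=0)
    Xf = np.where(miss, mu, X)
    sigma = np.cov(Xf, rowvar=False, bias=True)
    for _ in range(n_iter):
        Xf = np.where(miss, 0.0, X)
        cov_correction = np.zeros((p, p))
        for i in range(n):
            m = miss[i]
            if not m.any():
                continue
            o = ~m
            soo_inv = np.linalg.inv(sigma[np.ix_(o, o)])
            smo = sigma[np.ix_(m, o)]
            # E-step: conditional mean of the missing block given the observed block.
            Xf[i, m] = mu[m] + smo @ soo_inv @ (X[i, o] - mu[o])
            # The conditional covariance contributes to E[x x^T].
            cov_correction[np.ix_(m, m)] += (
                sigma[np.ix_(m, m)] - smo @ soo_inv @ smo.T
            )
        # M-step: update mean and covariance from the completed data.
        mu_new = Xf.mean(axis=0)
        diff = Xf - mu_new
        sigma_new = (diff.T @ diff + cov_correction) / n
        if np.max(np.abs(sigma_new - sigma)) < tol:
            mu, sigma = mu_new, sigma_new
            break
        mu, sigma = mu_new, sigma_new
    return mu, sigma
```

The resulting covariance matrix (or the correlation matrix derived from it) can then serve as the input matrix to an EFA routine, which is the substitution Graham (2009) recommends.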


Cited by 71 publications (47 citation statements); references 5 publications.
“…It is apparent from the table that the EFA revealed a four-factor solution with eigenvalues greater than 1.00. All the communalities were greater than the threshold of 0.60 (Weaver & Maxwell, 2014) and each item loaded onto its intended factor. However, one item of brand trust (BT2) was deleted from the analysis because of significant cross‐loading (Siddiqui, 2015).…”
Section: Results
confidence: 99%
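The eigenvalues-greater-than-1.00 rule quoted above (the Kaiser criterion) amounts to counting how many eigenvalues of the item correlation matrix exceed 1. A minimal sketch, assuming a correlation matrix R is already available (the helper name `kaiser_count` is illustrative):

```python
import numpy as np

def kaiser_count(R):
    """Number of factors retained under the Kaiser criterion:
    eigenvalues of the correlation matrix greater than 1."""
    eigvals = np.linalg.eigvalsh(np.asarray(R, dtype=float))
    return int(np.sum(eigvals > 1.0))
```

For example, a 3 x 3 correlation matrix with all off-diagonal correlations equal to 0.8 has eigenvalues 2.6, 0.2, 0.2, so the criterion retains a single factor.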
“…However, one item of brand trust (BT2) was deleted from the analysis because of significant cross‐loading (Siddiqui, 2015). Moreover, the Kaiser–Meyer–Olkin measure was greater than the threshold of 0.70 and Bartlett's test was significant at p < .001, suggesting a good fit (Siddiqui, 2015; Weaver & Maxwell, 2014).…”
Section: Results
confidence: 99%
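The two diagnostics in the excerpt above — Bartlett's test of sphericity and the Kaiser–Meyer–Olkin (KMO) measure — both follow from the correlation matrix alone. A hedged sketch using their standard formulas (the function names are illustrative; `n` is the sample size):

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, n):
    """Bartlett's test that the correlation matrix is an identity.
    Returns the chi-square statistic and its p-value."""
    R = np.asarray(R, dtype=float)
    p = R.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2.0
    return stat, chi2.sf(stat, df)

def kmo(R):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy,
    comparing zero-order correlations with partial correlations."""
    R = np.asarray(R, dtype=float)
    inv = np.linalg.inv(R)
    d = np.sqrt(np.diag(inv))
    partial = -inv / np.outer(d, d)  # anti-image partial correlations
    np.fill_diagonal(partial, 0.0)
    r = R.copy()
    np.fill_diagonal(r, 0.0)
    ss_r = (r ** 2).sum()
    return ss_r / (ss_r + (partial ** 2).sum())
```

A significant Bartlett statistic and a KMO above roughly 0.70, as in the quoted study, are conventionally read as evidence that the correlation matrix is factorable.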
“…Unidimensionality of the model was achieved by deleting all items with factor loadings < 0.6, and convergent validity was achieved with an average variance extracted (AVE) ≥ 0.5 for every construct, as recommended by different researchers (Teo, 2010; Weaver & Maxwell, 2014), shown in Table 1 below. Composite reliability (CR) and internal consistency of the model were assessed and achieved with CR ≥ 0.6, as suggested by Zainudin (2014).…”
Section: Results
confidence: 99%
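The AVE and CR thresholds in the excerpt above are computed from a construct's standardized loadings with the usual formulas: AVE is the mean squared loading, and CR is (Σλ)² / ((Σλ)² + Σ(1 − λ²)), assuming uncorrelated error terms. A minimal sketch (function names illustrative):

```python
import numpy as np

def ave(loadings):
    """Average variance extracted: mean of squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def composite_reliability(loadings):
    """Composite reliability from standardized loadings, assuming
    uncorrelated errors with variance 1 - lambda^2 per item."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return float(num / (num + np.sum(1.0 - lam ** 2)))
```

For instance, four items all loading at 0.7 give AVE = 0.49 (just below the 0.5 cutoff) but CR ≈ 0.79 (above the 0.6 cutoff), which is why studies often report both.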
“…Akbaş and Tavşancıl (2015) examined the effect of five imputation methods on principal components under different missing-data rates and sample sizes, and reported that multiple imputation and expectation maximization mostly gave good results. For continuous data sets, Weaver and Maxwell (2014) presented SPSS worked examples of deletion and mean-substitution methods for missing data in exploratory factor analyses based on principal-axis factoring, while Josse and Husson (2012) examined the effect of multiple imputation and expectation maximization on principal component analysis. Lorenzo-Seva and van Ginkel (2016) investigated the effect of multiple-imputation methods using different algorithms on exploratory factor analysis based on unweighted least squares in multidimensional structures.…”
Section: Introduction