1998
DOI: 10.1080/07474939808800410

Estimating mixtures of normal distributions via empirical characteristic function

Abstract: This paper uses the empirical characteristic function (ECF) procedure to estimate the parameters of mixtures of normal distributions. Since the characteristic function is uniformly bounded, the procedure gives estimates that are numerically stable. Using Monte Carlo simulation, it is shown that the finite sample properties of the ECF estimator are very good, even in the case where the popular maximum likelihood estimator fails to exist. An empirical application is illustrated using the monthly excess return of …
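
As a concrete illustration of the procedure described in the abstract, the sketch below fits a two-component normal mixture by minimizing the squared distance between the empirical characteristic function and the mixture's characteristic function over a fixed grid of transform points. This is a minimal, hypothetical reconstruction rather than the paper's own estimator: the grid, the unweighted objective, and all function names are assumptions.

```python
# Minimal sketch of a discrete-ECF estimator for a two-component normal mixture.
# Assumptions (not from the paper): an evenly spaced grid on (0, 2], an
# unweighted least-squares objective, and L-BFGS-B for the minimization.
import numpy as np
from scipy.optimize import minimize

def mixture_cf(t, p, mu1, s1, mu2, s2):
    """Characteristic function of p*N(mu1, s1^2) + (1-p)*N(mu2, s2^2)."""
    c1 = np.exp(1j * t * mu1 - 0.5 * (s1 * t) ** 2)
    c2 = np.exp(1j * t * mu2 - 0.5 * (s2 * t) ** 2)
    return p * c1 + (1 - p) * c2

def empirical_cf(t, x):
    """ECF: (1/n) * sum_j exp(i * t * x_j), evaluated at each grid point in t."""
    return np.exp(1j * np.outer(t, x)).mean(axis=1)

def ecf_objective(theta, t, x):
    """Sum of squared moduli of (ECF - model CF) over the grid."""
    p, mu1, s1, mu2, s2 = theta
    diff = empirical_cf(t, x) - mixture_cf(t, p, mu1, s1, mu2, s2)
    return np.sum(np.abs(diff) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    comp1 = rng.random(n) < 0.3                      # latent component labels
    x = np.where(comp1, rng.normal(-1.0, 0.5, n), rng.normal(2.0, 1.0, n))
    t_grid = np.linspace(0.2, 2.0, 10)               # fixed transform points
    theta0 = np.array([0.5, -0.5, 1.0, 1.5, 1.5])    # (p, mu1, s1, mu2, s2)
    bounds = [(0.01, 0.99), (None, None), (1e-3, None),
              (None, None), (1e-3, None)]
    fit = minimize(ecf_objective, theta0, args=(t_grid, x), bounds=bounds)
    print(fit.x)
```

Because |e^{itx}| = 1, the objective stays bounded even when a component variance approaches zero, which is the numerical-stability point made in the abstract.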

Cited by 28 publications (22 citation statements)
References 23 publications
“…The ECF procedure in the iid case has been previously investigated by Paulson et al (1975), Heathcote (1977), Feuerverger and Mureika (1977), Bryant and Paulson (1983), Feuerverger and McDunnough (1981b,c), Koutrouvelis (1980), and more recently by Tran (1998) and . The justification for the ECF method is that the CF is the Fourier-Stieltjes transform of the cumulative distribution function (CDF) and hence there is a one-to-one correspondence between the CF and the CDF.…”
Section: IID Case (mentioning)
confidence: 99%
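
For reference, the two objects mentioned in this excerpt are the characteristic function (the Fourier-Stieltjes transform of the CDF F) and its empirical counterpart built from a sample X_1, …, X_n; these are the standard definitions, not text quoted from the paper:

```latex
\varphi(t) = \int_{-\infty}^{\infty} e^{itx}\, dF(x),
\qquad
\hat{\varphi}_n(t) = \frac{1}{n}\sum_{j=1}^{n} e^{itX_j}.
```

The uniqueness theorem for Fourier-Stieltjes transforms gives the one-to-one correspondence between φ and F, so matching the CF is, in principle, equivalent to matching the full distribution.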
“…Consequently, a finite set of moment conditions or a continuum of moment conditions can be constructed, depending on how the transform variable r is chosen. If r is chosen to be a set of discrete points, the procedure is called the discrete ECF method and is used by Tran (1998) to estimate mixtures of normal distributions, following the suggestion made by Quandt and Ramsey (1978) and Schmidt (1982).…”
Section: Order Reprints (mentioning)
confidence: 99%
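
One common way to write the resulting moment conditions for the discrete ECF method, given fixed transform points r_1, …, r_q and a parametric CF φ(·; θ), is sketched below; the exact formulation in Tran (1998) may differ in weighting and in how real and imaginary parts are stacked:

```latex
E\!\left[e^{i r_k X}\right] - \varphi(r_k;\theta_0) = 0, \qquad k = 1, \dots, q.
```

The sample analogues φ̂_n(r_k) − φ(r_k; θ) can then be minimized in a quadratic form over θ, in the spirit of GMM; letting r vary continuously instead yields the continuum of moment conditions mentioned above.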
“…There are methods of estimating α and the rest of the parameters by using the empirical characteristic function. The main idea of this methodology is to minimize the distance between the characteristic function (CF) and the empirical characteristic function (ECF) in an appropriate norm [1,10,28]. Let us denote by ψ the characteristic function of a stable distribution and by ψ̂ …
Section: Parameters of Univariate Stable Distributions (mentioning)
confidence: 99%
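
A generic way to write the minimum-distance criterion referred to in this excerpt is a weighted L² distance between the ECF and the stable CF; the weight w and the choice of norm are illustrative here, since the cited works [1,10,28] make their own choices:

```latex
\hat{\theta} = \arg\min_{\theta}\; \int \left|\hat{\psi}_n(t) - \psi(t;\theta)\right|^{2} w(t)\, dt,
\qquad \theta = (\alpha, \beta, \sigma, \mu).
```

This route is attractive for stable laws because ψ has a closed form while the density generally does not.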
“…However, two major problems arise: one is the choice of the size of the grid points; the other is the "optimal" (if it exists) distance among those grid points. As Tran (1998) and Knight and Yu (2002) documented, these two problems are difficult to handle in practice.…”
Section: Introduction (mentioning)
confidence: 99%
“…It is well-known that this condition is not always satisfied in the MN case, and thus ML estimation may break down in practice. The difficulties in the ML approach have sparked considerable interest in searching for alternative estimation methods such as the Method of Moments (MOM) in Cohen (1967) and Day (1969), the method of Moment Generating Function (MGF) in Quandt and Ramsey (1978) and Schmidt (1982), and the method of Discrete Empirical Characteristic Function (DECF) in Tran (1998). This class of estimation methods, in essence, minimizes the distance between the theoretical components (moments, MGF, or CF) from the model and their empirical counterparts from the data over a set of fixed grid points.…”
Section: Introduction (mentioning)
confidence: 99%
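
The failure of the ML regularity condition can be made concrete: in a two-component normal-mixture likelihood, centering one component at any observation and letting its standard deviation shrink drives the likelihood to infinity. This is the standard unboundedness argument, sketched here rather than quoted from the paper:

```latex
L_n(\theta) = \prod_{i=1}^{n}\left[\frac{p}{\sigma_1}\,\phi\!\left(\frac{x_i-\mu_1}{\sigma_1}\right)
+ \frac{1-p}{\sigma_2}\,\phi\!\left(\frac{x_i-\mu_2}{\sigma_2}\right)\right],
\qquad
L_n(\theta)\Big|_{\mu_1 = x_1} \longrightarrow \infty \ \text{as } \sigma_1 \downarrow 0,
```

where φ denotes the standard normal density. A global maximizer therefore need not exist, which is what motivates the bounded minimum-distance alternatives (MOM, MGF, DECF) listed in the excerpt.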