1973
DOI: 10.2307/2334923

The Bhattacharyya Matrix for the Mixture of Two Distributions

Abstract: JSTOR is a not-for-profit service that helps scholars, researchers, and students discover, use, and build upon a wide range of content in a trusted digital archive. We use information technology and tools to increase productivity and facilitate new forms of scholarship. For more information about JSTOR, please contact support@jstor.org. Biometrika Trust is collaborating with JSTOR to digitize, preserve, and extend access to Biometrika.

Year Published: 1990

Cited by 1 publication (2 citation statements)
References 2 publications
“…However, if the Bhattacharyya matrix is diagonal, or if its non-diagonal elements are equal, then one can invert it to obtain the corresponding bounds. In discussing the Bhattacharyya matrix for the mixture of two distributions, Whittaker (1973) considered the estimation of the parameter θ in the mixture distribution θf₁(x) + (1 − θ)f₂(x), θ ∈ (0, 1), where f₁ and f₂ are such that they differ at each point of some set of positive Lebesgue measure (or of positive counting measure in the discrete case). Hill (1963) showed that the Cramér-Rao minimum variance bound for an unbiased estimate of θ, for a sample of observations from the above mixture distribution, equals 1/I(θ), where I(θ) = ∫ [f₁(x) − f₂(x)]² / [θf₁(x) + (1 − θ)f₂(x)] dx…”
mentioning
confidence: 99%
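The quoted bound can be checked numerically. Below is a minimal sketch (not from either cited paper) that approximates I(θ) = ∫ (f₁ − f₂)² / [θf₁ + (1 − θ)f₂] dx by quadrature and forms the Cramér-Rao bound 1/I(θ); the choice f₁ = N(0, 1), f₂ = N(2, 1), θ = 0.3 is an illustrative assumption of mine.

```python
import math

# Hypothetical illustration: mixture density f(x; theta) = theta*f1 + (1-theta)*f2,
# with f1 = N(0, 1) and f2 = N(2, 1) (component choices are assumptions, not
# taken from Whittaker (1973) or Hill (1963)).

def norm_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def fisher_info(theta, lo=-10.0, hi=12.0, n=20001):
    # I(theta) = integral of (f1 - f2)^2 / (theta*f1 + (1-theta)*f2) dx,
    # approximated with the trapezoidal rule on [lo, hi].
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        f1, f2 = norm_pdf(x, 0.0), norm_pdf(x, 2.0)
        g = (f1 - f2) ** 2 / (theta * f1 + (1.0 - theta) * f2)
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * g
    return total * h

theta = 0.3
I = fisher_info(theta)
crb = 1.0 / I  # Cramer-Rao lower bound for a single observation

# The bound cannot fall below theta*(1 - theta): equality would require
# f1 and f2 to have disjoint supports, which overlapping normals do not.
print(crb, theta * (1.0 - theta))
```

For components with disjoint supports, I(θ) = 1/[θ(1 − θ)] exactly, so the printed bound exceeding θ(1 − θ) reflects the overlap of the two normals.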
“…According to Whittaker (1973), the Bhattacharyya matrix for the mixture distribution is given by [matrix not reproduced in extraction] for the normal-normal model, since we are taking the mean and variance of f₁ and f₂ to be the same, so that the mixture of f₁ and f₂ will again be a normal distribution. Also from Kanji (1985), fig.…”
mentioning
confidence: 99%
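The first quoted statement notes that a diagonal Bhattacharyya matrix can be inverted directly to obtain the bounds. As a hedged numerical sketch of that structure, using the simple N(θ, 1) family as a stand-in (my choice; the quoted papers' mixture model is not fully reproduced above), the 2×2 matrix with entries J_rs = E[(∂ʳf/∂θʳ / f)(∂ˢf/∂θˢ / f)] comes out diagonal:

```python
import math

# Hypothetical illustration (family is mine, not Whittaker's mixture):
# f(x; theta) = N(theta, 1). Writing z = x - theta,
#   (df/dtheta)/f     = z,
#   (d^2f/dtheta^2)/f = z^2 - 1,
# so the Bhattacharyya matrix entries are moments of a standard normal.

def std_normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def expect(g, lo=-10.0, hi=10.0, n=20001):
    # Trapezoidal approximation of E[g(Z)] for Z ~ N(0, 1).
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        z = lo + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * g(z) * std_normal_pdf(z)
    return total * h

d1 = lambda z: z            # (df/dtheta)/f
d2 = lambda z: z * z - 1.0  # (d^2 f/dtheta^2)/f

J = [[expect(lambda z: d1(z) * d1(z)), expect(lambda z: d1(z) * d2(z))],
     [expect(lambda z: d2(z) * d1(z)), expect(lambda z: d2(z) * d2(z))]]

# J is diagonal here, J = diag(1, 2), so it inverts entry-by-entry and the
# second-order Bhattacharyya bound is easy to read off from J's inverse.
print(J)
```

Because the off-diagonal entry E[z(z² − 1)] vanishes (odd moments of a standard normal are zero), inverting J is trivial; this is exactly the situation the quoted remark describes.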