2005
DOI: 10.1081/sta-200031471

Predictive Distribution of Regression Vector and Residual Sum of Squares for Normal Multiple Regression Model

Abstract: This paper proposes predictive inference for the multiple regression model with independent normal errors. The distributions of the sample regression vector (SRV) and the residual sum of squares (RSS) for the model are derived by using invariant differentials. Also the predictive distributions of the future regression vector (FRV) and the future residual sum of squares (FRSS) for the future regression model are obtained. Conditional on the realized responses, the future regression vector is found to follow a m…
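
For orientation, the two sample quantities named in the abstract can be computed directly. The following sketch fits a normal multiple regression model by ordinary least squares and evaluates the sample regression vector (SRV) and the residual sum of squares (RSS). The data, dimensions, and coefficients are made-up illustrative values, and the sketch does not reproduce the paper's derivation of the distributions of these quantities via invariant differentials.

import numpy as np

# Illustration only: sample regression vector (SRV) and residual sum of
# squares (RSS) for the normal multiple regression model y = X*beta + error.
# The design matrix, coefficients, and noise scale are made-up example values.
rng = np.random.default_rng(0)
n, p = 30, 4                     # sample size, number of regressors
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 0.5, -0.3, 2.0])
y = X @ beta_true + rng.normal(scale=1.5, size=n)

# SRV: ordinary least-squares estimate b = (X'X)^{-1} X'y
srv, _, _, _ = np.linalg.lstsq(X, y, rcond=None)

# RSS: squared norm of the residual vector y - X*b
residuals = y - X @ srv
rss = float(residuals @ residuals)

print("sample regression vector (SRV):", srv)
print("residual sum of squares (RSS):", rss)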

Cited by 16 publications (11 citation statements) | References 13 publications
“…All the above studies deal with the prediction distribution of future responses. However, Khan (2004) proposed prediction distributions for the future regression vector (FRV) and future residual sum of squares (FRSS). Here we pursue the same approach to find the optimal β-expectation tolerance regions for the FRV and FRSS using the distribution of appropriate future statistics.…”
Section: Introduction (mentioning)
confidence: 99%
“…The two sets of responses are connected through the common shape, regression, and scale parameters. Following Khan (2004), we pursue the Bayesian approach to derive the distribution of the FRV and FRSS for the future responses, conditional on a set of realized responses. This is a new development that deals with the predictive inference for the future regression parameters, rather than that of the future responses.…”
Section: Introduction (mentioning)
confidence: 99%
“…MODEL OPTIMIZATION AND RESULTS: In this section, the main optimization goal is to minimize error. The residual sum of squares (RSS; also known as the sum of squared errors, SSE) [11] metric has been chosen to measure the error value. It is calculated as the sum of the squared error terms from each sample:…”
Section: Model Construction (mentioning)
confidence: 99%
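
The RSS (SSE) objective described in the excerpt above is the sum of squared differences between observed values and model predictions. A minimal, generic sketch of that metric, using hypothetical observed/predicted arrays, is:

import numpy as np

def rss(y_true, y_pred):
    # Residual sum of squares (sum of squared errors) between observed
    # values and a model's predictions.
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sum(err ** 2))

# Hypothetical values, purely for illustration.
print(rss([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))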
“…To test the null hypothesis on the intercept vector H_0: β_0 = β_00 (a given known vector) against H_a: β_0 > β_00 in the multivariate simple regression model, when there is non-sample prior information on the slope vector β_1, the test statistic follows a correlated bivariate F distribution. (Panel (ii): the cdf for m = 10, n = 20, ρ = 0.5, θ_j = 1.5, j, k = 1, 2, j ≠ k; for details, see Khan, 2006.) The ultimate test on H_0 is called the pre-test test (PTT) because it depends on the outcome of the pre-test on the suspected slope; that is, H*_0: β_1 = β_10 (cf.…”
Section: Application to the Power Function of the PTT (mentioning)
confidence: 99%
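
Written out, the testing setup quoted above is a one-sided main test on the intercept vector combined with a pre-test on the suspected slope vector, whose outcome determines the pre-test test (PTT). The notation below is taken directly from the excerpt and is only a restatement, not the citing paper's derivation.

% Main test on the intercept vector
H_0 : \beta_0 = \beta_{00} \quad \text{vs.} \quad H_a : \beta_0 > \beta_{00},
\qquad \beta_{00} \ \text{a given known vector};

% Pre-test on the suspected slope vector, whose outcome defines the PTT on H_0
H_0^{*} : \beta_1 = \beta_{10}.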