1997
DOI: 10.2307/1244291
Development of Statistical Discriminant Mathematical Programming Model Via Resampling Estimation Techniques

Abstract: This paper uses resampling estimation techniques to develop a statistical mathematical programming model for discriminant analysis problems. Deleted-d jackknife, deleted-d bootstrap, and bootstrap procedures are used to identify statistically significant parameter estimates for a discriminant mathematical programming (MP) model. The results of this paper indicate that the resampling approach is a viable model selection technique. Furthermore, estimating the MP models via resampling techniques can also improve th…
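The abstract's core idea — resample, re-estimate a parameter, and keep it only if it is statistically significant — can be sketched with a percentile bootstrap. This is a minimal illustration of the general bootstrap significance check, not the paper's discriminant MP estimator; the data and estimator here are hypothetical.

```python
# Minimal sketch of a percentile-bootstrap significance check (illustrative;
# not the paper's MP model). A parameter estimate is kept as "significant"
# when its bootstrap percentile interval excludes zero.
import random

def bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05, seed=0):
    """Percentile confidence interval for estimator(data) via the bootstrap."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        # resample with replacement, same size as the original sample
        sample = [rng.choice(data) for _ in data]
        reps.append(estimator(sample))
    reps.sort()
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4]  # hypothetical estimates
lo, hi = bootstrap_ci(data, lambda s: sum(s) / len(s))
significant = not (lo <= 0.0 <= hi)  # interval excludes zero -> retain parameter
```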

Cited by 13 publications
(9 citation statements)
References 13 publications
“…The sub-sample jackknife technique was initially proposed by Wu (1990) and extended by Politis and Romano (1994). In the 1990s, researchers including Politis et al (1997) and Ziari et al (1997) used these re-sampling techniques for their empirical work. Compared to the standard jackknife, in the subsample jackknife more than one observation is dropped to estimate out-of-sample forecasts of the remaining m = n − d observations, where n is the total number of observations and d = 2, 3, …”
Section: Jackknife Out-of-sample Forecasts
confidence: 99%
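The delete-d (sub-sample) jackknife described above can be sketched in a few lines: each replicate drops d of the n observations and re-estimates on the remaining m = n − d. The data and estimator below are hypothetical, for illustration only.

```python
# Sketch of delete-d jackknife resampling (illustrative example, not the
# cited authors' exact procedure). Each replicate drops d observations
# and re-estimates on the remaining m = n - d.
from itertools import combinations
import statistics

def delete_d_jackknife(data, d, estimator):
    """Evaluate the estimator on every delete-d subsample of the data."""
    n = len(data)
    estimates = []
    for dropped in combinations(range(n), d):
        # keep the m = n - d observations not in the dropped index set
        subsample = [x for i, x in enumerate(data) if i not in dropped]
        estimates.append(estimator(subsample))
    return estimates

data = [2.0, 4.0, 6.0, 8.0, 10.0]
reps = delete_d_jackknife(data, d=2, estimator=statistics.mean)
# C(5, 2) = 10 replicates, each the mean of m = 3 observations
```

The standard jackknife is the special case d = 1; for d > 1 the number of replicates grows as C(n, d), which is why sub-sampling schemes are often used in practice.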
“…Future research is needed to compare trained models on completely separate data sets. Although some theories have been presented on sampling methods, these are still in their infancy (14).…”
Section: Discussion
confidence: 99%
“…Leave-one-out gave variance similar to the other resampling methods except cross-validation. The leave-one-out procedure assures that, regardless of the sample size, the relevant observation would not be in the pseudo-optimal solutions more than once (14). This resampling technique can easily produce significantly different results in NN settings, depending on the training-stopping criterion (15).…”
Section: Discussion
confidence: 99%
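The leave-one-out property quoted above — each observation is held out exactly once, so no point appears in more than one held-out position — can be sketched as a simple split generator. Names and data here are hypothetical.

```python
# Minimal leave-one-out split sketch (illustrative). Each observation is
# held out exactly once; the training set always has n - 1 observations.
def leave_one_out(data):
    """Yield (held_out, training) pairs, one per observation."""
    for i in range(len(data)):
        held_out = data[i]
        training = data[:i] + data[i + 1:]
        yield held_out, training

data = [1, 2, 3, 4]
splits = list(leave_one_out(data))
# 4 splits; each training set has n - 1 = 3 observations
```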
“…• The results obtained by mathematical programming models are easily interpreted [Ziari et al (1997)]. As Baesens (2003) comments, it is very important to have not just robust and powerful scorecards but also scorecards whose results can be interpreted easily by their users.…”
Section: Introduction
confidence: 99%