2017
DOI: 10.1080/00949655.2017.1308510
Interval-censored unimodal kernel density estimation via data sharpening

Cited by 4 publications (3 citation statements) | References 11 publications
“…Another method which applies a "data sharpening"-based approach to interval-censored data is also highly dependent on bandwidth selection, with suboptimal default performance in a nontrivial number of scenarios. 40 Overall, our proposed procedure contributes to the field of nonparametric survival analysis by achieving improved accuracy and predictive ability, especially when available survival data is limited by censoring, small sample size, low prevalence, or few repeated measurements per subject. We have demonstrated that smoothing NPMLE estimates by complexity-based penalized likelihood is a simple but powerful approach for improving survival & density estimation as well as exact time-to-event prediction (imputation).…”
Section: Discussion (mentioning)
confidence: 99%
“…One drawback is their proposed method for choosing a smoothing bandwidth, which uses the computationally expensive process of cross-validation and can result in a poor choice of bandwidth. Another method which applies a “data sharpening”‐based approach to interval‐censored data is also highly dependent on bandwidth selection, with suboptimal default performance in a nontrivial number of scenarios. 40 …”
Section: Discussion (mentioning)
confidence: 99%
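The computational cost of cross-validated bandwidth selection noted in the excerpt above can be made concrete with a minimal sketch: leave-one-out cross-validation for a Gaussian kernel density estimator evaluates an O(n²) kernel matrix for every candidate bandwidth. The function names and candidate grid below are illustrative, not taken from the cited papers.

```python
import numpy as np

def gauss_kde(x, data, h):
    """Gaussian kernel density estimate at points x, bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def loo_cv_bandwidth(data, candidates):
    """Pick the bandwidth maximizing leave-one-out log-likelihood.

    Each candidate requires an n x n kernel matrix, which is what
    makes cross-validated selection expensive for large samples.
    """
    best_h, best_ll = None, -np.inf
    n = len(data)
    for h in candidates:
        u = (data[:, None] - data[None, :]) / h
        K = np.exp(-0.5 * u**2) / (h * np.sqrt(2 * np.pi))
        np.fill_diagonal(K, 0.0)           # leave each point out of its own estimate
        f_loo = K.sum(axis=1) / (n - 1)
        ll = np.log(f_loo + 1e-300).sum()  # guard against log(0) in sparse tails
        if ll > best_ll:
            best_h, best_ll = h, ll
    return best_h

rng = np.random.default_rng(0)
sample = rng.normal(size=200)
h_cv = loo_cv_bandwidth(sample, np.linspace(0.05, 1.0, 20))
```

Because the selected bandwidth is the argmax over a finite grid, a grid that is too coarse or badly placed is one way the "poor choice of bandwidth" failure mode can arise.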
“…Convergence of proposed algorithms is studied. In [4] data sharpening is proposed to increase robustness of a kernel density estimator to bandwidth misspecification and measurement errors. [19] used a maximum smoothed likelihood approach and a smoothing the (discrete) MLE of the distribution function approach.…”
Section: Introduction (mentioning)
confidence: 99%