The problem of quickest detection of a change in the distribution of an n × p random matrix, based on a sequence of observations having a single unknown change point, is considered. The forms of the pre- and post-change distributions of the rows of the matrices are assumed to belong to the family of elliptically contoured densities with sparse dispersion matrices but are otherwise unknown. We propose a non-parametric stopping rule based on a novel summary statistic related to the k-nearest-neighbor correlation between columns of each observed random matrix. In the large-scale regime where p → ∞ and n is fixed, we show that, among all functions of the proposed summary statistic, the proposed stopping rule is asymptotically optimal under a minimax quickest change detection (QCD) model.
This paper addresses the problem of quickest detection of a change in the maximal coherence between columns of an n × p random matrix, based on a sequence of matrix observations having a single unknown change point. The random matrix is assumed to have identically distributed rows, and the maximal coherence is defined as the largest of the p(p − 1)/2 correlation coefficients between distinct pairs of columns. Likewise, the k-nearest-neighbor (kNN) coherence is defined as the k-th largest of these correlation coefficients. The forms of the pre- and post-change distributions of the observed matrices are assumed to belong to the family of elliptically contoured densities with sparse dispersion matrices but are otherwise unknown. A non-parametric stopping rule is proposed that is based on the maximal k-nearest-neighbor sample coherence between columns of each observed random matrix. This summary statistic is related to a test for the existence of a hub vertex of degree at least k in a sample correlation graph. Bounds on the detection delay and false alarm performance of the proposed stopping rule are obtained in the purely high-dimensional regime where p → ∞ and n is fixed. When the pre-change dispersion matrix is diagonal, it is shown that, among all functions of the proposed summary statistic, the proposed stopping rule is asymptotically optimal under a minimax quickest change detection (QCD) model as the stopping threshold approaches infinity. The theory developed also applies to sequential hypothesis testing and fixed-sample-size tests.
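The maximal kNN sample coherence statistic described above can be sketched as follows. This is an illustrative implementation, not the authors' code; the function names `max_knn_coherence` and `stopping_time` and the threshold value are assumptions for the example.

```python
import numpy as np

def max_knn_coherence(X, k=1):
    """Maximal k-nearest-neighbor sample coherence of an n x p matrix X.

    For each column, take the k-th largest absolute sample correlation
    with the other columns; return the maximum over columns.  With k = 1
    this is the maximal coherence (largest off-diagonal |correlation|).
    """
    R = np.corrcoef(X, rowvar=False)   # p x p sample correlation matrix
    np.fill_diagonal(R, 0.0)           # exclude self-correlations
    A = np.abs(R)
    kth = np.sort(A, axis=1)[:, -k]    # k-th largest entry in each row
    return kth.max()

def stopping_time(stream, k=1, threshold=0.9):
    """Declare a change at the first matrix whose statistic crosses the
    threshold (a simple sketch of the non-parametric stopping rule)."""
    for t, X in enumerate(stream, start=1):
        if max_knn_coherence(X, k) > threshold:
            return t
    return None
```

Under the pre-change (sparse, near-diagonal dispersion) model the statistic concentrates near the typical maximal null coherence, so a change that creates a hub of at least k correlated columns pushes the statistic above the threshold.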
Abstract. The problem of removing white zero-mean Gaussian noise from an image is an interesting inverse problem, investigated in this paper through sparse and redundant representations. However, finding the sparsest possible solution in the noisy setting has been a subject of considerable debate among researchers. In this paper we make use of a new approach to solve this problem and show that it is comparable with state-of-the-art denoising approaches.
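The core idea of denoising via sparse representations can be illustrated with a minimal sketch: transform the noisy signal into a basis where the clean signal is sparse, shrink small coefficients toward zero, and transform back. The orthonormal DCT basis and soft thresholding used here are generic stand-ins for the paper's (unspecified) redundant dictionary and sparse solver; `lam` is an assumed threshold parameter.

```python
import numpy as np

def soft_threshold(x, lam):
    """Shrink coefficients toward zero; small (mostly-noise) ones vanish."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def dct_matrix(n):
    """Orthonormal DCT-II basis as an n x n matrix (rows = frequencies)."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= np.sqrt(1.0 / n)
    C[1:] *= np.sqrt(2.0 / n)
    return C

def denoise(y, lam):
    """Transform-threshold-invert denoising of a 1-D signal y."""
    C = dct_matrix(len(y))
    coeffs = C @ y                        # analysis: move to sparse domain
    return C.T @ soft_threshold(coeffs, lam)  # synthesis: back to signal
```

Because the transform is orthonormal, white Gaussian noise stays white in the coefficient domain, so a threshold a few noise standard deviations wide removes most noise while retaining the few large coefficients carrying the signal.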
This paper proposes a general adaptive procedure for budget-limited predictor design in high dimensions called two-stage Sampling, Prediction and Adaptive Regression via Correlation Screening (SPARCS). SPARCS can be applied to high dimensional prediction problems in experimental science, medicine, finance, and engineering, as illustrated by the following. Suppose one wishes to run a sequence of experiments to learn a sparse multivariate predictor of a dependent variable Y (disease prognosis for instance) based on a p dimensional set of independent variables X = [X 1 , . . . , X p ] T (assayed biomarkers). Assume that the cost of acquiring the full set of variables X increases linearly in its dimension. SPARCS breaks the data collection into two stages in order to achieve an optimal tradeoff between sampling cost and predictor performance. In the first stage we collect a small number (n) of expensive samples at the full dimension p ≫ n of X, winnowing the number of variables down to a smaller dimension l < p using a type of cross-correlation or regression coefficient screening. In the second stage we collect a larger number (t − n) of cheaper samples of the l variables that passed the screening of the first stage. At the second stage, a low dimensional predictor is constructed by solving the standard regression problem using all t samples of the selected variables. SPARCS is an adaptive online algorithm that implements false positive control on the selected variables, is well suited to small sample sizes, and is scalable to high dimensions. We establish asymptotic bounds for the Familywise Error Rate (FWER), specify high dimensional convergence rates for support recovery, and establish optimal sample allocation rules to the first and second stages.
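The two-stage procedure can be sketched in a few lines: stage 1 ranks variables by absolute sample cross-correlation with Y on the n expensive full-dimension samples and keeps the top l; stage 2 fits ordinary least squares on the t − n cheaper low-dimensional samples. This is a simplified illustration under those assumptions, not the authors' implementation; the function name `sparcs_sketch` and the fixed `num_keep` selection (in place of the paper's FWER-controlled screening) are assumptions.

```python
import numpy as np

def sparcs_sketch(X1, y1, X2, y2, num_keep):
    """Two-stage SPARCS-style predictor design (illustrative sketch).

    Stage 1: on a few full-dimension samples (X1, y1), rank the p
    variables by absolute sample cross-correlation with y and keep
    `num_keep` of them.
    Stage 2: fit ordinary least squares on the larger, cheaper sample
    (X2, y2) restricted to the selected variables.
    """
    # Stage 1: correlation screening on the n expensive samples
    Xc = X1 - X1.mean(axis=0)
    yc = y1 - y1.mean()
    corr = np.abs(Xc.T @ yc) / (
        np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12
    )
    keep = np.argsort(corr)[-num_keep:]        # indices of top variables

    # Stage 2: low-dimensional least squares (with intercept) on t - n samples
    A = np.column_stack([X2[:, keep], np.ones(len(y2))])
    beta, *_ = np.linalg.lstsq(A, y2, rcond=None)
    return keep, beta
```

The screening stage costs only O(np) and needs no matrix inversion at the full dimension, which is what makes the expensive first stage feasible for very large p with small n.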