2019
DOI: 10.1016/j.neucom.2018.11.087

An application of hierarchical Gaussian processes to the detection of anomalies in star light curves

Abstract: This study is concerned with astronomical time-series called light-curves, that represent the brightness of celestial objects over a period of time. We consider the task of finding anomalous light-curves of periodic variable stars. We employ a Hierarchical Gaussian Process to create a general and stable model of time-series for anomaly detection, and apply this approach to the light-curve problem. Hierarchical Gaussian Processes require only a few additional parameters compared to conventional Gaussian Process…
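The hierarchical construction sketched in the abstract can be illustrated with a toy example. The following is a minimal sketch, not the authors' implementation: it assumes phase-folded light curves sampled on a shared grid, uses the empirical mean curve as a stand-in for a fitted group-level GP mean, and scores each curve by its negative log marginal likelihood under a shared RBF kernel (the kernel settings and data are made up for illustration).

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.1, variance=1.0):
    """Squared-exponential kernel over phase values in [0, 1]."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_score(x, y, mean, lengthscale=0.1, variance=1.0, noise=0.01):
    """Negative log marginal likelihood of y under a GP with a fixed mean."""
    K = rbf_kernel(x, x, lengthscale, variance) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    r = y - mean
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, r))
    return 0.5 * r @ alpha + np.sum(np.log(np.diag(L))) \
        + 0.5 * len(x) * np.log(2 * np.pi)

# Toy data: phase-folded light curves of 20 "normal" sinusoidal variables
# plus one square-wave outlier, all sampled on a shared phase grid.
phases = np.linspace(0.0, 1.0, 50)
rng = np.random.default_rng(0)
normal = np.sin(2 * np.pi * phases) + 0.05 * rng.standard_normal((20, 50))
outlier = np.sign(np.sin(2 * np.pi * phases)) + 0.05 * rng.standard_normal(50)
curves = np.vstack([normal, outlier])

# The empirical mean curve stands in for the group-level GP posterior mean;
# each star's curve is scored against it under the shared kernel.
group_mean = curves.mean(axis=0)
scores = np.array([gp_score(phases, y, group_mean) for y in curves])
print("most anomalous curve:", scores.argmax())  # expect index 20
```

Curves that the shared model explains poorly receive high scores and are flagged as anomalous; here the square-wave outlier should rank first.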

Cited by 9 publications (13 citation statements)
References 27 publications (44 reference statements)
“…To achieve this, we used HLGPR. This builds upon prior work using hierarchical GPs to directly model observed time series, both in MEG work (13) and other fields (58, 59), and provides an intuitive approach to modeling time series data while accounting for its covariance structure. This is particularly appealing for the analysis performed here, as directly accounting for covariance allows us to circumvent multiple comparison corrections across time points, which presents a problem when identifying significant effects if we expect these effects to be brief in their duration.…”
Section: Methods (mentioning)
confidence: 99%
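The statement above turns on modeling the whole time series with one GP, so that significance is judged under the joint covariance rather than by a separate test at each time point. The sketch below illustrates that generic idea with scikit-learn and synthetic data; it is not the HLGPR implementation the citing paper describes, and the kernel choices are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic "time-varying effect": a brief bump around t = 0.3 plus noise,
# mimicking an effect that is short-lived relative to the whole series.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)[:, None]
effect = 0.8 * np.exp(-((t.ravel() - 0.3) / 0.05) ** 2)
y = effect + 0.1 * rng.standard_normal(100)

# A single GP spans the whole series: the fitted kernel encodes the temporal
# covariance, so inference is joint rather than one test per time point.
gp = GaussianProcessRegressor(RBF(0.1) + WhiteKernel(0.01)).fit(t, y)
mu, sd = gp.predict(t, return_std=True)

# Time points whose 95% posterior band excludes zero under the joint model.
print("effect detected at t =", t.ravel()[mu - 1.96 * sd > 0].round(2))
```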
“…This approach allows us to take advantage of the benefits of hierarchical models in terms of aiding parameter estimation when using data that are limited in quantity and quality, as is typical of neuroimaging data. To achieve this, we turn to prior work on hierarchical GPs (58, 59) and use a hierarchical mean and covariance structure to define the GPs used to represent regression coefficients. For each regressor, we first define a constant mean group-level GP representing the group mean time-varying regression weights.…”
Section: Methods (mentioning)
confidence: 99%
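The hierarchical mean structure described here can be sketched in two passes: fit a group-level GP to the across-subject average, then use its posterior mean as the prior mean of each subject-level GP. The code below is a simplified illustration with fabricated data, not the citing paper's model; a full hierarchical treatment would infer both levels jointly (e.g., by MCMC) rather than sequentially.

```python
import numpy as np

def rbf(x1, x2, ls=0.15):
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior_mean(x, y, prior_mean, noise=0.05):
    """GP posterior mean at the training inputs, given a fixed prior mean."""
    K = rbf(x, x) + noise * np.eye(len(x))
    return prior_mean + rbf(x, x) @ np.linalg.solve(K, y - prior_mean)

t = np.linspace(0.0, 1.0, 60)
rng = np.random.default_rng(2)
# Fabricated per-subject weight series: a shared bump plus subject noise.
shared = np.exp(-((t - 0.5) / 0.1) ** 2)
subjects = shared + 0.2 * rng.standard_normal((8, 60))

# Level 1: a constant-mean (here zero-mean) group-level GP fit to the
# across-subject average gives the group time-varying regression weights.
group_fit = gp_posterior_mean(t, subjects.mean(axis=0), prior_mean=0.0)

# Level 2: each subject-level GP takes the group fit as its prior mean,
# shrinking noisy individual estimates toward the group curve.
subject_fits = np.array([gp_posterior_mean(t, y, prior_mean=group_fit)
                         for y in subjects])
print(subject_fits.shape)  # (8, 60)
```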
“…To achieve this, we used hierarchical Bayesian latent Gaussian process regression (HLGPR). This builds upon prior work using hierarchical Gaussian processes (GPs) to directly model observed timeseries, both in MEG work [13] and other fields [60, 61], and provides an intuitive approach to modelling timeseries data while accounting for its covariance structure. This is particularly appealing for the analysis performed here, as directly accounting for covariance allows us to circumvent multiple comparison corrections across time points, which presents a problem when identifying significant effects if we expect these effects to be brief in their duration.…”
Section: Statistical Analysis Of Sequenceness Data (mentioning)
confidence: 99%
“…This approach allows us to take advantage of the benefits of hierarchical models in terms of aiding parameter estimation when using data that is limited in quantity and quality, as is typical of neuroimaging data. To achieve this, we turn to prior work on hierarchical GPs [60, 61], and use a hierarchical mean and covariance structure to define the GPs used to represent regression coefficients. For each regressor, we first define a constant-mean group-level GP representing the group mean time-varying regression weights.…”
(mentioning)
confidence: 99%
“…It has strong reasoning ability and good interpretability, and it adapts well to problems involving high dimensionality, small samples, and nonlinearity. Furthermore, it imposes no functional-form constraint, acquires its model parameters adaptively, and produces probabilistic output, which has made it a research hotspot in the field of machine learning [18]. The conjugate gradient method is generally used to determine the optimal hyper-parameters of the Gaussian process [19].…”
Section: Introduction (mentioning)
confidence: 99%