1999
DOI: 10.1051/aas:1999254

Spectral analysis of stellar light curves by means of neural networks

Abstract: Periodicity analysis of unevenly collected data is a relevant issue in several scientific fields. In astrophysics, for example, we have to find the fundamental period of light or radial velocity curves which are unevenly sampled observations of stars. Classical spectral analysis methods are unsatisfactory for solving the problem. In this paper we present a neural network based estimator system which performs frequency extraction well in unevenly sampled signals. It uses an unsupervised Hebbian nonlinear…


Cited by 30 publications (28 citation statements)
References 20 publications
“…Also, tests carried out by the Authors of this method (Tagliaferri et al 1999) have shown that low-frequency drifts of the baseline flux of sources do not affect the derived periodicities.…”
Section: Autocorrelation Matrix Based Analysis
confidence: 95%
“…In a previous paper (Tagliaferri et al 1999, hereafter T99) some of us introduced the so-called STIMA algorithm, based on a particular type of the MUltiple SIgnal Classification (MUSIC) (Oppenheim & Schafer 1965) estimator specifically tailored to work with unevenly sampled data and on a robust nonlinear PCA Neural Network used to extract the principal components of the autocorrelation matrix of the input sources. Without entering into details (which may be found in T99), we now briefly summarize its main features.…”
Section: Autocorrelation Matrix Based Analysis
confidence: 99%
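The MUSIC idea referenced in this statement can be illustrated with a minimal sketch. It assumes an evenly sampled autocorrelation matrix and the standard noise-subspace formulation; the adaptation to unevenly sampled data that T99/STIMA introduces is not reproduced here, and the function name, `dt` lag spacing, and numerical guard are illustrative choices.

```python
import numpy as np

def music_pseudospectrum(R, p, freqs, dt=1.0):
    """Generic MUSIC pseudospectrum from an M x M autocorrelation matrix R.

    p     : assumed number of sinusoidal components (signal-subspace dimension)
    freqs : array of trial frequencies (cycles per unit time)
    dt    : lag spacing used for the steering vectors (even sampling assumed)
    """
    M = R.shape[0]
    # np.linalg.eigh returns eigenvalues in ascending order, so the first
    # M - p eigenvectors span the (estimated) noise subspace.
    _, eigvecs = np.linalg.eigh(R)
    noise = eigvecs[:, : M - p]
    lags = np.arange(M) * dt
    spectrum = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        e = np.exp(-2j * np.pi * f * lags)             # steering vector at trial frequency f
        denom = np.linalg.norm(noise.conj().T @ e) ** 2
        spectrum[k] = 1.0 / max(denom, 1e-12)          # peaks where e is orthogonal to the noise subspace
    return spectrum
```

The frequency at which the pseudospectrum peaks is taken as the fundamental-frequency estimate; in the STIMA setting the eigenvectors come from the neural network rather than from an explicit eigendecomposition.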
“…[7,10,23], examine the light curves of variable stars (see e.g. [3,6,22]) and make morphological classification of galaxies (see e.g. [2,11,21]).…”
Section: Introduction
confidence: 99%
“…In our approach we extract the features using a non-linear PCA method that permits extracting the eigenvectors directly from the unevenly sampled data. The approach is based on a novel periodicity estimator, the STIMA algorithm, described in [5], [17], [18].…”
Section: Parameter Incidence Of Latent Variable Computation
confidence: 99%
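As a hedged illustration of how eigenvectors extracted by such a network can drive the periodicity estimate, the sketch below evaluates a signal-subspace pseudospectrum directly from a learned weight matrix `W` whose columns approximate the leading eigenvectors. It assumes evenly spaced lags and roughly orthonormal columns; the actual construction for unevenly sampled data in [5], [17], [18] is more involved and is not reproduced here.

```python
import numpy as np

def pseudospectrum_from_weights(W, freqs, dt=1.0):
    """Frequency scan using a learned weight matrix as a signal-subspace estimate.

    W     : (M, p) matrix whose columns approximate the p leading eigenvectors
            of the autocorrelation matrix (assumed roughly orthonormal)
    freqs : trial frequencies (cycles per unit time)
    dt    : lag spacing of the steering vector (even spacing assumed here)
    """
    M = W.shape[0]
    lags = np.arange(M) * dt
    spec = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        e = np.exp(-2j * np.pi * f * lags)              # trial steering vector, |e|^2 = M
        # distance of e from the signal subspace spanned by the columns of W
        resid = M - np.linalg.norm(W.conj().T @ e) ** 2
        spec[k] = 1.0 / max(resid, 1e-12)
    return spec
```

The same `W` (or the projections `W.T @ x`) can also serve as the low-dimensional features fed to a downstream classifier, which is the use described in this citing paper.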
“…The feature extraction process can be divided into the following two steps: * Preprocessing: we first calculate and subtract the average pattern to obtain a zero-mean process; * Neural computing: the fundamental learning parameters are: i) the initial weight matrix W; ii) the number of neurons, which is the number p of principal eigenvectors that we need; iii) γ, the nonlinear learning function parameter; iv) the learning rate μ; v) the early-stopping parameter ε (see [5], [17], [18] for a different early stopping). We then initialize the weight matrix W by assigning the classical small random values. Otherwise we can use the first patterns of the signal as the columns of the matrix.…”
Section: B. Non-linear PCA Based Feature Extractor
confidence: 99%
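The two steps quoted above can be sketched as follows, assuming a hierarchical nonlinear Hebbian update with g(y) = tanh(γy) in the spirit of the robust nonlinear PCA networks cited in [5], [17], [18]. The exact update rule, nonlinearity, and early-stopping criterion used in those papers may differ, and all names and default values below are illustrative.

```python
import numpy as np

def train_nonlinear_pca(patterns, p, gamma=1.0, mu=1e-3, eps=1e-6, max_epochs=500, seed=0):
    """Two-step sketch: preprocessing + nonlinear (Hebbian) PCA learning.

    patterns : (n, M) array of input patterns, one per row
    p        : number of neurons = number of leading eigenvectors to extract
    gamma    : parameter of the nonlinear learning function g(y) = tanh(gamma * y)
    mu       : learning rate
    eps      : early-stopping threshold on the epoch-to-epoch change of W
    """
    # Preprocessing: subtract the average pattern so the process has zero mean.
    X = patterns - patterns.mean(axis=0)
    M = X.shape[1]
    rng = np.random.default_rng(seed)
    # Initialization: classical small random values
    # (alternatively W = X[:p].T, i.e. the first p patterns as columns of W).
    W = 0.01 * rng.standard_normal((M, p))
    for _ in range(max_epochs):
        W_prev = W.copy()
        for x in X:
            y = W.T @ x                    # neuron outputs
            g = np.tanh(gamma * y)         # nonlinear learning function
            for i in range(p):
                # hierarchical (GHA-style) residual for neuron i
                e = x - W[:, : i + 1] @ y[: i + 1]
                W[:, i] += mu * g[i] * e
        if np.linalg.norm(W - W_prev) < eps:   # simple early-stopping criterion
            break
    return W
```

The returned columns of W approximate the leading eigenvectors of the patterns' autocorrelation matrix and can be passed to a pseudospectrum scan such as the one sketched earlier.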