2021
DOI: 10.48550/arxiv.2105.11169
Preprint

Light curve classification with recurrent neural networks for GOTO: dealing with imbalanced data

U. F. Burhanudin,
J. R. Maund,
T. Killestein
et al.

Abstract: The advent of wide-field sky surveys has led to the growth of transient and variable source discoveries. The data deluge produced by these surveys has necessitated the use of machine learning (ML) and deep learning (DL) algorithms to sift through the vast incoming data stream. A problem that arises in real-world applications of learning algorithms for classification is imbalanced data, where a class of objects within the data is underrepresented, leading to a bias for over-represented classes in the ML and DL …

Cited by 1 publication (2 citation statements)
References 53 publications
“…Connections between nodes form a directed graph along a temporal sequence. Dékány & Grebel (2020), Fremling et al. (2021), Burhanudin et al. (2021). Recurrent AutoEncoder (RAE): its encoder learns a representation (encoding) for a set of data, typically for dimensionality reduction (2D time sequences to 1D latent variables), by training the network to extract the inherent features from input sequences and ignore insignificant or noisy data. The representation, the so-called 1D latent variables, is then fed into a decoder, which decodes the features and generates output sequences.…”
Section: Model Selection and Caveats
confidence: 99%
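The encoder-to-decoder shape flow described above can be sketched as follows. This is a minimal illustrative toy, not the cited authors' architecture: all weight matrices, dimensions, and the plain tanh recurrences are assumptions chosen only to show how a 2D time sequence is compressed to a 1D latent vector and then unrolled back into a sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_encode(seq, Wx, Wh):
    """Run a simple tanh RNN over the sequence; the final hidden
    state serves as the 1D latent representation of the 2D input."""
    h = np.zeros(Wh.shape[0])
    for x_t in seq:
        h = np.tanh(Wx @ x_t + Wh @ h)
    return h

def rnn_decode(z, Wz, Wh, Wo, n_steps):
    """Unroll a decoder RNN from the latent vector, emitting one
    output vector per time step to reconstruct the sequence."""
    h = np.tanh(Wz @ z)
    out = []
    for _ in range(n_steps):
        h = np.tanh(Wh @ h)
        out.append(Wo @ h)
    return np.stack(out)

T, n_features, latent_dim, hidden = 50, 2, 8, 16
seq = rng.normal(size=(T, n_features))        # e.g. (time steps, [magnitude, error])

# Encoder: 2D time sequence -> 1D latent vector
Wx = rng.normal(scale=0.1, size=(latent_dim, n_features))
Wh_enc = rng.normal(scale=0.1, size=(latent_dim, latent_dim))
z = rnn_encode(seq, Wx, Wh_enc)

# Decoder: 1D latent vector -> reconstructed 2D sequence
Wz = rng.normal(scale=0.1, size=(hidden, latent_dim))
Wh_dec = rng.normal(scale=0.1, size=(hidden, hidden))
Wo = rng.normal(scale=0.1, size=(n_features, hidden))
recon = rnn_decode(z, Wz, Wh_dec, Wo, n_steps=T)

print(z.shape, recon.shape)   # (8,) (50, 2)
```

In a real RAE the weights would be trained to minimize reconstruction error, which is what forces the latent variables to retain the inherent features of the input and discard noise.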
“…We implement a stochastic recurrent neural network (SRNN) to model quasar light curves and recover the DRW and DHO model parameters by Gaussian Process Regression (GPR). Recurrent neural networks (RNNs) are a popular class of ML connectionist models for sequential modelling and have been used previously in astrophysics applications (e.g., Charnock & Moss 2017; Naul et al. 2018; Hinners et al. 2018; Muthukrishna et al. 2019; Möller & de Boissière 2020; Escamilla-Rivera et al. 2020; Burhanudin et al. 2021; Lin & Wu 2021). However, as noted in Yin & Barucca (2021), one limitation of RNNs is that the hidden state transition function is entirely deterministic, which can limit the RNN's ability to model processes with high variability.…”
confidence: 99%
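The deterministic-transition limitation mentioned in the statement above can be made concrete with a toy comparison. This is an illustrative sketch only, not the SRNN of Yin & Barucca (2021): the plain tanh update and the Gaussian noise injection are assumptions used to contrast a fixed transition function with a stochastic one.

```python
import numpy as np

def rnn_step(h, x, Wx, Wh):
    """Standard RNN transition: h_t is a fixed (deterministic)
    function of (h_{t-1}, x_t), so repeated calls always agree."""
    return np.tanh(Wx @ x + Wh @ h)

def stochastic_step(h, x, Wx, Wh, rng, scale=0.5):
    """SRNN-style transition (illustrative): a random latent variable
    perturbs the update, so repeated runs can differ, allowing the
    model to express high variability in the process it learns."""
    z = rng.normal(scale=scale, size=h.shape)
    return np.tanh(Wx @ x + Wh @ h + z)

rng = np.random.default_rng(42)
Wx = rng.normal(size=(4, 3))
Wh = rng.normal(size=(4, 4))
h0 = np.zeros(4)
x = rng.normal(size=3)

# Same input and same previous state -> identical next state, always.
det_a = rnn_step(h0, x, Wx, Wh)
det_b = rnn_step(h0, x, Wx, Wh)
assert np.allclose(det_a, det_b)

# The stochastic transition samples fresh noise on each run.
s1 = stochastic_step(h0, x, Wx, Wh, np.random.default_rng(1))
s2 = stochastic_step(h0, x, Wx, Wh, np.random.default_rng(2))
print(np.allclose(s1, s2))
```

The deterministic step maps each (state, input) pair to exactly one successor state, while the stochastic step defines a distribution over successors, which is the property the SRNN exploits when modelling stochastic light curves.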