2016
DOI: 10.3847/0067-0049/225/2/31

Photometric Supernova Classification With Machine Learning

Abstract: Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and…

Cited by 207 publications (260 citation statements)
References 68 publications (75 reference statements)
“…For a smaller sample training set of 5.2% of all the data we again perform similarly to Karpenka et al (2013), but underperform compared to Newling et al (2011), taking into account the slightly larger sample size in the latter case. In Lochner et al (2016), using the SALT2 fits provided the best average AUC over a range of machine-learning techniques. By imposing a purity of 90%, a completeness of 85% was achieved, while requiring a completeness of 90% reveals a corresponding purity of 85%.…”
Section: Results
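The purity/completeness tradeoff quoted above can be made concrete with a small sketch. The confusion counts below are hypothetical numbers chosen to reproduce the quoted operating points, not data from any of the cited papers; only the definitions (purity = precision, completeness = recall/efficiency) are standard.

```python
# Illustrative purity/completeness tradeoff for SN Ia classification.
# All counts are made up to match the ~90%/85% operating points quoted above.

def purity(tp, fp):
    """Fraction of objects classified as SN Ia that truly are SNe Ia (precision)."""
    return tp / (tp + fp)

def completeness(tp, fn):
    """Fraction of true SNe Ia recovered by the classifier (recall/efficiency)."""
    return tp / (tp + fn)

# A strict score cut trades completeness for purity; a loose cut does the opposite.
strict = {"tp": 850, "fp": 94, "fn": 150}   # purity ~0.90, completeness ~0.85
loose  = {"tp": 900, "fp": 159, "fn": 100}  # purity ~0.85, completeness ~0.90

print(round(purity(strict["tp"], strict["fp"]), 2),
      round(completeness(strict["tp"], strict["fn"]), 2))
```

Sliding the classifier's score threshold traces out exactly this curve; the AUC mentioned in the quote summarizes the analogous tradeoff (true-positive vs false-positive rate) over all thresholds.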
“…The analyses by Karpenka et al (2013) and Newling et al (2011) are easier to compare. Along with Lochner et al (2016), these employ a two-step process, where features are first extracted by various methods before machine-learning classification. The results obtained for similar sized training sets are comparable, as can be seen in the top section of Table 2.…”
Section: Results
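The two-step process described in this statement can be sketched in miniature. The hand-rolled features and the toy nearest-centroid classifier below are illustrative assumptions, not the SALT2 fits or the specific machine-learning methods used in the cited papers; the point is only the structure: stage one turns each light curve into a feature vector, stage two classifies those vectors.

```python
import numpy as np

# Stage 1: extract descriptive features from each light curve.
def extract_features(t, flux):
    """Summarize a light curve with a few descriptive numbers (illustrative)."""
    peak = flux.argmax()
    return np.array([
        flux.max(),              # peak brightness
        t[peak],                 # time of maximum
        flux[-1] / flux.max(),   # late-time decline ratio
    ])

# Stage 2: any classifier fits here; a minimal nearest-centroid stand-in.
class NearestCentroid:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Toy light curves: class 0 declines fast, class 1 slowly.
t = np.linspace(0.0, 50.0, 20)
curves = [np.exp(-t / tau) for tau in (5.0, 6.0, 25.0, 30.0)]
X = np.array([extract_features(t, f) for f in curves])
y = np.array([0, 0, 1, 1])

clf = NearestCentroid().fit(X, y)
print(clf.predict(X))  # recovers the labels on this separable toy data
```

Decoupling the stages this way is what makes the cited results comparable: different feature extractors (parametric fits, template fits, wavelets) can be swapped in front of the same classifier, and vice versa.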
“…Machine learning techniques can often outperform these classifiers, yielding higher SN Ia classification efficiency (the fraction of SNe Ia classified correctly) and lower CC SN contamination (Lochner et al 2016; Möller et al 2016). On SDSS-SN data, the Sako et al (2014) kd-tree nearest neighbor (NN) method has a purity comparable to that of Campbell et al (2013) but accurately classifies ∼1.4 times as many real SNe Ia in a given sample.…”
Section: Introduction
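A kd-tree nearest-neighbor classifier of the general kind referenced above can be sketched as follows. The 2-D feature space, the class centers, and the data are all made-up assumptions for illustration (the actual Sako et al. method operates on photometric fit parameters); only the mechanism is shown: build a kd-tree over labeled training points, then classify new points by majority vote among their k nearest neighbors.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Two hypothetical, well-separated classes in a 2-D feature space.
ia = rng.normal(loc=[0.0, 1.0], scale=0.2, size=(50, 2))   # stand-in SNe Ia
cc = rng.normal(loc=[1.0, 0.0], scale=0.2, size=(50, 2))   # stand-in CC SNe
X_train = np.vstack([ia, cc])
y_train = np.array([1] * 50 + [0] * 50)  # 1 = SN Ia, 0 = core-collapse

tree = cKDTree(X_train)  # build the kd-tree once over the training set

def classify(points, k=5):
    """Majority vote over the k nearest training neighbors."""
    _, idx = tree.query(points, k=k)
    votes = y_train[idx]
    return (votes.mean(axis=1) > 0.5).astype(int)

print(classify(np.array([[0.05, 0.95], [0.9, 0.1]])))  # → [1 0]
```

The kd-tree makes each neighbor lookup fast even for large training sets, which is what makes this approach practical on survey-scale photometric samples.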