2022
DOI: 10.48550/arxiv.2206.08555
Preprint
SOS: Score-based Oversampling for Tabular Data

Abstract: Score-based generative models (SGMs) are a recent breakthrough in generating fake images. SGMs are known to surpass other generative models, e.g., generative adversarial networks (GANs) and variational autoencoders (VAEs). Being inspired by their big success, in this work, we fully customize them for generating fake tabular data. In particular, we are interested in oversampling minor classes since imbalanced classes frequently lead to sub-optimal training outcomes. To our knowledge, we are the first presenting…
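The major-to-minor conversion strategy the abstract alludes to can be illustrated with a toy sketch. This is an assumption-laden illustration, not the paper's implementation: the minor class is taken to be a 2-D Gaussian so its score function is known in closed form, whereas a real SGM would learn the score with a neural network and use a full reverse-diffusion SDE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setting (not from the paper): the minor class is
# N(mu, sigma^2 I), so its score grad_x log p(x) has a closed form.
mu = np.array([3.0, -2.0])
sigma = 0.5

def score(x):
    # grad_x log N(x; mu, sigma^2 I) = -(x - mu) / sigma^2
    return -(x - mu) / sigma**2

# Start from "major-class" samples (standard normal around the origin)
# and run unadjusted Langevin dynamics toward the minor-class
# distribution, mimicking the major-to-minor conversion idea.
x = rng.standard_normal((256, 2))
step = 1e-3
for _ in range(5000):
    noise = rng.standard_normal(x.shape)
    x = x + step * score(x) + np.sqrt(2 * step) * noise

# The converted samples now follow the minor-class distribution:
# their sample mean lands near mu and their spread near sigma.
print(x.mean(axis=0))
```

The design point the sketch captures is that the starting points are real majority samples rather than pure noise, so the dynamics "transfers the style" of existing data into the minority region instead of generating from scratch.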

Cited by 1 publication (3 citation statements)
References 16 publications
“…OCT-GAN (Kim et al, 2021) is a generative model design based on neural ODEs. SOS (Kim et al, 2022) proposed a style-transfer-based oversampling method for imbalanced tabular data using SGMs, whose main strategy is converting a major sample to a minor sample. Since its task is not compatible with ours of generating from scratch, direct comparisons are not possible.…”
Section: Tabular Data Synthesis
confidence: 99%
“…Since STaSy can be converted to an oversampling method following the design guidance of SOS, we conduct oversampling experiments with STaSy to compare with SOS (Kim et al, 2022). For a fair comparison, we train STaSy without fine-tuning for each minor class and generate minority samples to match the size of the majority class.…”
Section: B Comparison Between SOS and STaSy for the Oversampling Task
confidence: 99%